Lab Rats

Forbes has reported that Facebook “conducted secret tests to determine the magnitude of its Android users’ Facebook addiction.”  In the tests, which apparently occurred several years ago, users of the Facebook app for Android were subjected to intentional crashes of the app without being informed of the tests.

Why would Facebook want to provoke crashes that would frustrate users who were trying to wish a Facebook friend happy birthday or post their latest selfie?  Purportedly, to test the “resilience” of Facebook users.  If your app suddenly crashed, would you just say the hell with Facebook, or would you try to access Facebook through an internet browser instead, or through a different app?

When you think about it, intentional crashes aren’t really testing “resilience” — they’re testing obsession and addiction.  After a crash, a rational person would avoid Facebook, for a while at least, reasoning that time was needed for anonymous techno-geeks at some far-off location to address the cause of the crash and fix it.  Only somebody desperate for an immediate Facebook fix would spend time hunting for an alternative way to get to Facebook, because nothing time-sensitive ever really happens on Facebook.  You can always send your friend an email expressing birthday wishes, or save that choice Throwback Thursday photo until next week.

But the point, of course, isn’t whether it’s resilience or obsession that is being tested — it’s the fact that Facebook is intentionally frustrating its users at all.  It sounds like the kind of experiment some evil scientist with a futuristic base on a remote island might run on hapless prisoners.  After all, why would you knowingly thwart the efforts of somebody who is trying to access your website?  Facebook no doubt would shrug and say the tests provided needed information — but really, it did the tests because it could . . . and it was confident that Facebook fans would keep coming back.

We shouldn’t be surprised by this:  Facebook has done similar tests before, and other companies do, too.  On the internet, we’re all lab rats.  Our movements are tracked constantly, but instead of scientists in white coats checking when we take a sip from the water dropper, stop running on the wheel, or respond to the electrodes placed on our hindquarters, data is compiled about which websites we visit, how long we stay there, what we click on, and whether we’re showing an interest in one product or another so that we can be bombarded with pop-up ads for that product forever.

Time for another spin on the wheel!

In The Cage With Facebook Lab Rats

Some people are very upset that Facebook has admitted conducting a psychological experiment on hundreds of thousands of randomly selected users.

In the study, conducted in 2012, Facebook data scientists tested the hypothesis that reading about the great things “Facebook friends” write about their lives depresses readers, who feel that their lives kind of suck by comparison.  So, for one week, the data scientists used an algorithm on the news feeds of almost 700,000 people to delete posts with words associated with positive or negative emotions, to see whether it affected the kinds of posts those readers made.  The study ultimately refuted that hypothesis.

A number of people feel that the experiment treated Facebook users as guinea pigs, improperly tried to manipulate their emotions, and was unethical.  I can understand the sentiment, but I think we all need to accept that we are lab rats in a vast media stew in which the overriding goal is to manipulate our emotions and perceptions — whether the active agent is a Facebook post, an email designed to provoke us to make a contribution to a political candidate, or a banner ad that touts the miracle weight-loss qualities of a previously unknown plant.  Face it, folks — it’s all just part of navigating through our media-saturated modern culture.

Knowing about Facebook’s willingness to conduct broad-scale psychological and social experiments has its positive aspects, too — it helps to explain certain otherwise inexplicable realities of Facebook.  From my occasional review of my “news feed,” I’m guessing that Facebook is currently conducting tests on these other hypotheses:

*  Which is more likely to cause “de-friending”:  incessant requests to play Facebook games, or memes that express rote sentiments and demand “click like if you agree!”?

*  Are conservatives or liberals more likely to post ludicrously overheated, end-of-the-world-as-we-know-it reactions to current events?

*  Is there any location on a Facebook page where ads can be placed that readers cannot successfully ignore?

*  Does the frequency of posts with pictures of food increase as Facebook users age?