The U.S.A. Out, But Not Down

I listened to the World Cup game between the U.S. and Belgium on my drive back from Cincinnati today and really found myself getting into it.  According to the radio announcers, at least, the U.S. got a stunning performance from goalie Tim Howard that kept them in the game, but the Belgian pressure finally yielded two goals in extra time and the United States was knocked out of the World Cup, 2-1.

I’m not going to pretend that I know all of the rules of soccer — I certainly don’t, and probably never will — and I’m not going to claim that I am as interested in soccer as I am in, say, college football.  I will say, however, that I enjoyed the U.S.A.’s run in the World Cup this year, and I’ll be hoping that the Americans make another, even deeper, run the next time they get to play on the world stage.  The U.S. may not be one of the elite teams yet, but it looks like the Americans are getting there.  Good try, U.S.A.!

Now that the Americans are out, I’m not sure I’ll watch another game in this tournament — but maybe I will.  I’ll miss the British accents and the references to “nil” rather than “zero” and the other quirky elements of this global sporting event.  It’s been a fun ride.

In The Cage With Facebook Lab Rats

Some people are very upset that Facebook has admitted conducting a psychological experiment on hundreds of thousands of randomly selected users.

In the study, conducted in 2012, Facebook data scientists set out to test the hypothesis that reading about the great things “Facebook friends” write about their lives depresses readers, who feel that their own lives kind of suck by comparison.  So, for one week, the data scientists ran an algorithm on the news feeds of almost 700,000 people that removed posts containing words associated with positive or negative emotions, to see whether the change affected the kinds of posts those readers made.  The study ultimately refuted that hypothesis.

A number of people feel that the experiment treated Facebook users as guinea pigs, improperly tried to manipulate their emotions, and was unethical.  I can understand the sentiment, but I think we all need to accept that we are lab rats in a vast media stew in which the overriding goal is to manipulate our emotions and perceptions — whether the active agent is a Facebook post, an email designed to provoke us to make a contribution to a political candidate, or a banner ad that touts the miracle weight-loss qualities of a previously unknown plant.  Face it, folks — it’s all just part of navigating through our media-saturated modern culture.

Knowing about Facebook’s willingness to conduct broad-scale psychological and social experiments has its positive aspects, too — it helps to explain certain otherwise inexplicable realities of Facebook.  From my occasional review of my “news feed,” I’m guessing that Facebook is currently conducting tests on these other hypotheses:

*  Which is more likely to cause “de-friending”:  incessant requests to play Facebook games, or memes that express rote sentiments and demand “click like if you agree!”?

*  Are conservatives or liberals more likely to post ludicrously overheated, end-of-the-world-as-we-know-it reactions to current events?

*  Is there any location on a Facebook page where ads can be placed that readers will not be able to ignore?

*  Does the frequency of posts with pictures of food increase as Facebook users age?