Living In The Matrix

I thought The Matrix was a terrific movie.  I liked the sequel, too.  (The last film in the trilogy, eh, not so much.)

But I had no idea that reputable scientists were seriously considering the central premise of The Matrix — that what we think of as the real world is in fact a huge computer simulation run by machines and designed and policed to enslave humanity.  In fact, a scientist named Rizwan Virk has written a book, entitled The Simulation Hypothesis, about that possibility.

The Matrix concept is gaining traction for several reasons.  One is that computer technology, and games-playing technology in particular, apparently has developed to the point where sophisticated multi-player, on-line games are routine and it’s becoming harder and harder to distinguish reality from simulation.  (I say “apparently” because I’m not a gamer — that is, unless I’m really trapped in a computer simulation and playing, unwittingly, just by living my life.)  If our technology is developing in that direction, the argument goes, isn’t it possible that we are living in a more advanced simulation created by a more advanced computer system developed by a more advanced civilization?

And there’s also a weird statistical argument for the simulation hypothesis that goes like this:  once a civilization builds computers powerful enough to run plausible simulations for millions or billions of players, it’s comparatively easy to create entirely new, realistic settings populated by entirely new simulated players who are all artificial intelligences.  Crossing that technological-capability threshold means that trillions of AI creations could be living in games — making it statistically more likely that you’re an AI creation than a flesh-and-blood being.
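
To see why the numbers get so lopsided, here is a minimal back-of-the-envelope sketch in Python (the figures are invented for illustration; the argument doesn’t depend on the exact values):

```python
# Back-of-the-envelope version of the simulation argument, with invented numbers.
real_minds = 10_000_000_000            # roughly one biological civilization's population
simulations_run = 1_000                # suppose that civilization runs a thousand full-scale simulations
minds_per_simulation = 10_000_000_000  # each populated as richly as the real world

simulated_minds = simulations_run * minds_per_simulation
chance_simulated = simulated_minds / (simulated_minds + real_minds)

print(f"Simulated minds: {simulated_minds:,}")
print(f"Chance that a randomly chosen mind is simulated: {chance_simulated:.3%}")
```

Plug in whatever numbers you like; as long as the simulated populations dwarf the single real one, the odds crowd toward certainty, which is the whole force of the argument.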

And here’s an even weirder concept:  if we’re all players in a video game, maybe our scores are being kept somewhere for some purpose that we don’t quite know yet, and won’t know until our own experience in the simulation ends.  It would help to know the rules of the game, wouldn’t it?

Are we living in a simulation?  I don’t see how you can prove or disprove that, from our perspective as potential players in an ultra-advanced game created by an ancient alien civilization.  But I do know this:  if that is our reality, I’m glad the programmers have finally allowed the weather to warm up a bit.

The Day The Dinosaurs Died

You’ve probably read about how a massive asteroid strike ended the era of the dinosaurs and caused their ultimate mass extinction.  The geological evidence indicates that, 66 million years ago, the asteroid hit on the Yucatan peninsula of modern Mexico and produced massive earthquakes, volcanic eruptions, tidal waves, and forest fires.  The strike threw up a dense plume of dust and debris that turned the world dark and wiped out 99 percent of life on Earth.  Thanks to that asteroid strike, the Cretaceous period ended with a bang and the way was clear for mammals — and human beings — to take the dinosaurs’ place at the top of the food chain.

What was it like on the day, 66 million years ago, when the asteroid struck the Earth with such terrible force?  Robert DePalma, a doctoral student at the University of Kansas, has found compelling evidence of what happened on that momentous day, and this week he published his findings in the journal Proceedings of the National Academy of Sciences.  In 2012, looking at a site called Tanis, in the Hell Creek geological formation in North Dakota, DePalma found layers of perfectly preserved animal and fish fossils at the precise boundary between the Cretaceous period and the Tertiary period that followed it — deposited on the very day when the asteroid struck the Yucatan.

The geological evidence shows that the asteroid strike created a magnitude 10 or 11 earthquake whose seismic waves reached out thousands of miles.  In prehistoric North Dakota, which like much of the North American continent was covered by an inland sea, the seismic waves produced a water surge that threw fish onto the shore to suffocate — producing the layers of fish and animals that DePalma found.  At the same time, molten material was hurled into the atmosphere.  In the geological formation, DePalma found bones, teeth, and hatchling remains of many dinosaur groups, including an intact dinosaur egg complete with embryo — indicating that the dinosaurs had survived right up to that fateful day, although their ultimate day of reckoning was at hand.

In an article in the New Yorker, DePalma describes his find as “like finding the Holy Grail clutched in the bony fingers of Jimmy Hoffa, sitting on top of the Lost Ark.”  Thanks to him, we now know a lot more about the day that the ground buckled and snapped, the waters surged, the skies were lit with fire, and the world changed forever.

Studying Stonehenge

When I took a trip to England right after I graduated from college, one of the coolest places I visited was Stonehenge.  There was a strong air of ancient mystery lurking among the massive stones arranged in a circle on Salisbury Plain.  You couldn’t help but walk among them and wonder where the enormous stones came from, who put them there, how in the world they got there — and what their mysterious purpose actually was.

Now scientists have answered the first question, at least in part:  many of the smaller stones at the Stonehenge site came from ancient quarries in the Preseli Hills of Wales, and they were consciously mined and taken to Stonehenge, not deposited on Salisbury Plain by glaciers.  Scientists used instruments that allowed them to test the chemical composition of rocks in the quarries and match it to the composition of the rocks at Stonehenge.  The tests are so precise that scientists were able to determine that the Stonehenge stones came from quarries in the northern part of the hills rather than the southern part — a finding that is significant, because it means that the stones were probably transported to Salisbury Plain over land, rather than floated there on rivers.  The scientists also found mining tools that date back to 3000 B.C., when the first stage of Stonehenge was built.
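
The matching logic itself is simple in principle, even if the lab work isn’t.  Here’s a purely illustrative sketch (the elements and concentrations below are made up, not the published measurements): treat each rock as a fingerprint of element concentrations and ask which quarry’s fingerprint sits closest to the Stonehenge sample’s.

```python
import math

# Hypothetical element-concentration "fingerprints" (parts per million); not real data.
quarries = {
    "northern Preseli quarry": {"zirconium": 120.0, "chromium": 310.0, "nickel": 95.0},
    "southern Preseli quarry": {"zirconium": 210.0, "chromium": 180.0, "nickel": 140.0},
}
stonehenge_bluestone = {"zirconium": 125.0, "chromium": 300.0, "nickel": 99.0}

def distance(sample_a, sample_b):
    """Euclidean distance between two composition fingerprints."""
    return math.sqrt(sum((sample_a[e] - sample_b[e]) ** 2 for e in sample_a))

best_match = min(quarries, key=lambda name: distance(stonehenge_bluestone, quarries[name]))
print(f"Closest compositional match: {best_match}")
```

A real analysis uses many more elements and proper statistics, but the provenance question boils down to exactly this kind of comparison.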

So now we know that, 5000 years ago, human beings mined large stones in Wales and then somehow dragged them 150 miles, where they were arranged in circles that seem to be related in some way to the summer solstice.  But we don’t know why ancient humans would undertake such an enormous task, or how they accomplished it.  Unless someone invents a time machine, the answers to those questions will probably remain a mystery forever — which is one reason why Stonehenge is so cool.

Debunking Drinking Wisdom

Shortly after I passed the legal drinking age and started drinking adult beverages, I first heard the aphorism “wine, then beer, and have no fear.”  Some years later, I heard the flip side:  “beer, then wine, and I feel fine.”  The idea behind each of the sayings — which are seemingly contradictory, in case you hadn’t noticed — was that if you sequenced what you drank, you could avoid a hangover.

Is either of the sayings true?

No, of course not . . . and now a study has confirmed it.  Researchers from the Witten/Herdecke University in Germany and the University of Cambridge in the United Kingdom — two countries, incidentally, that are very serious about their wine and beer — studied whether the sequence in which alcoholic beverages are consumed might affect how people who overindulge feel the next day.  One group drank beer, then wine, and another drank wine, then beer.  A third, control group drank only one or the other.

The study found that the drinking sequence made no difference in the hangover impact.  One of the researchers explained: “The truth is that drinking too much of any alcoholic drink is likely to result in a hangover. The only reliable way of predicting how miserable you’ll feel the next day is by how drunk you feel and whether you are sick. We should all pay attention to these red flags when drinking.”  (No kidding!)

And get this:  another of the researchers makes the dubious argument that hangovers actually can have positive effects.  He stated: “Unpleasant as hangovers are, we should remember that they do have one important benefit, at least: They are a protective warning sign that will certainly have aided humans over the ages to change their future behavior. In other words, they can help us learn from our mistakes.”  Boy, scientists are perverse, aren’t they?

I’d never argue that hangovers are a good thing, but I do know this — any piece of folk wisdom about drinking that rhymes, and that can still be remembered after a few drinks, probably isn’t that wise after all.

Let The Sun Shine In?

I recently returned from a beach vacation.  One of our daily rituals was slathering on SPF 50 sunscreen to try to protect ourselves against the blazing sunshine.  We wanted to be in the warm sun rather than the cold, gray Midwest, obviously, but we’d accepted the healthcare cautions about sunshine and skin cancer, and so the sunblock went on.

But what if the healthcare cautions that led to our lubing up are wrong — as in, wrong by 180 degrees?  What if exposure to sunshine is not only not bad for you, but in fact helps you to be healthier in countless ways, by effectively and efficiently producing vitamin D, lowering blood pressure, making you feel happier, and delivering other therapeutic benefits?

That’s the intriguing conclusion of recent research that started with a look at the value of vitamin D supplements — which many people who avoid the sun are taking to try to compensate for the lack of solar-produced vitamin D.  Low vitamin D levels are associated with lots of bad stuff — cancer, diabetes, obesity, osteoporosis, heart attack, stroke, depression, cognitive impairment, autoimmune conditions — and vitamin D is required for calcium absorption and good bone health.  So vitamin D supplements should help, right?  But the research showed that vitamin D supplements weren’t having any discernible impact on cancer, heart disease, or stroke.

Scientists scratched their heads, looked into the unexpected result, and started to find evidence that it wasn’t high vitamin D levels that prevented the bad conditions.  Instead, the presence of vitamin D was just a marker, and the real cause of the positive health effects was the sunlight that was producing the vitamin D.  The people who had high vitamin D and were avoiding the bad conditions were getting plenty of sunlight.  Exposure to sunshine also causes the skin to produce nitric oxide, which dilates blood vessels and reduces blood pressure — which, as the article linked above points out, helps to explain why “rates of high blood pressure, heart disease, stroke, and overall mortality all rise the farther you get from the sunny equator, and they all rise in the darker months.”
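
The “marker, not a cause” idea is really a story about confounding, and a toy simulation makes it easy to see.  In this minimal sketch (all numbers invented, not taken from the research), sunlight drives both vitamin D and health, while a supplement raises only vitamin D:

```python
import random

random.seed(1)

def simulate_person(sun_hours, takes_supplement):
    """Toy model: sunlight raises both vitamin D and health; a supplement raises only vitamin D."""
    vitamin_d = 10 + 5 * sun_hours + (20 if takes_supplement else 0) + random.gauss(0, 2)
    health = 50 + 4 * sun_hours + random.gauss(0, 2)  # health depends on sunlight, not on vitamin D itself
    return vitamin_d, health

def average_vitamin_d(group):
    return sum(d for d, _ in group) / len(group)

def average_health(group):
    return sum(h for _, h in group) / len(group)

sunny        = [simulate_person(sun_hours=6, takes_supplement=False) for _ in range(10_000)]
indoors      = [simulate_person(sun_hours=1, takes_supplement=False) for _ in range(10_000)]
supplemented = [simulate_person(sun_hours=1, takes_supplement=True)  for _ in range(10_000)]

for label, group in [("Plenty of sun", sunny), ("Little sun, no supplement", indoors),
                     ("Little sun, plus supplement", supplemented)]:
    print(f"{label:28s} vitamin D {average_vitamin_d(group):5.1f}   health {average_health(group):5.1f}")
```

In this toy world, high vitamin D lines up with good health across people, yet supplementing it changes nothing, because the sunlight was doing the work all along.  That is the same pattern the supplement studies reportedly found.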

And the vitamin D/blood pressure effects may just be the start.  The article continues:  “Sunlight triggers the release of a number of other important compounds in the body, not only nitric oxide but also serotonin and endorphins. It reduces the risk of prostate, breast, colorectal, and pancreatic cancers. It improves circadian rhythms. It reduces inflammation and dampens autoimmune responses. It improves virtually every mental condition you can think of. And it’s free.”

But wait — won’t getting more sunshine cause skin cancer?  Yes, there is that risk — but the article points out that skin cancer is not nearly as lethal as the other diseases and conditions that exposure to sunlight helps prevent.  And people who regularly get sunshine, avoid sunburns, and keep their tans going — like outdoor workers — are much less likely to develop melanoma, the less common but potentially fatal kind of skin cancer.  In fact, the evidence indicates that long-term sun exposure is associated with lower melanoma rates.

All of this will come as a surprise to people who are scared to death of skin cancer and buy sunblock by the carload, but it makes sense from an evolutionary standpoint.  Our half-naked distant ancestors didn’t have SPF 50 to apply, and they were exposed to the sun on a much more prolonged basis than modern, largely indoor humans.  It makes sense that humans would evolve in ways that favored those who were more efficient at using that abundant, constant sunshine in positive, healthy ways.

Think about that the next time you’re carefully applying that SPF 50 sunblock and popping vitamin D pills.

Messing Around With Genes

Since 2015, Congress has included language in its funding bills to prevent the Food and Drug Administration from approving any application to create in vitro fertilization children from embryos that have been genetically modified.  Because the prohibitory language has been included in funding bills that have expiration dates, it needs to be renewed every year.  The House of Representatives just passed legislation that includes the renewal language, as part of an effort to fund certain governmental activities like food stamps and drug approvals.

The issue of genetic modification of embryos has some special urgency these days, with the recent news that Chinese scientists have announced the birth of the first genetically modified children — twin girls whose genes allegedly were altered to make them resistant to HIV.  The Chinese scientists reportedly used the CRISPR gene-editing technique, in which a protein is guided to a targeted stretch of DNA and cuts it so the sequence can be changed.  Some people question the validity of the Chinese claim about these so-called “CRISPR babies,” but there is no doubt that genetic manipulation of human beings is moving from the realm of science fiction to the reality of science fact.

The bar that Congress has created ensures that efforts to genetically modify humans are not going to be happening in America — at least for now.  Is that a good thing?  The FDA Commissioner has said:  “Certain uses of science should be judged intolerable, and cause scientists to be cast out. The use of CRISPR to edit human embryos or germ line cells should fall into that bucket. Anything less puts the science and the entire scientific enterprise at risk.”  Others argue that Congress has taken a “meat axe” approach when it should be crafting a more nuanced policy that recognizes that some genetic manipulation could be beneficial.

It’s hard to know what’s right.  Scientists have been involved in the reproductive process for years, and their work, through processes like in vitro fertilization, has allowed people who are struggling to conceive to realize their dream of having children.  But I think the notion of scientists tinkering with genes to create “better” human beings crosses a line in several ways.  First, I’m not entirely confident that scientists know what they are doing and that there won’t be unintended, negative consequences from the genetic changes they make.  Anyone who has read about the history of science knows that scientists have been wrong before, and it’s reasonable to think they might be wrong again — only this time, their errors wouldn’t just be about the impact of certain foods or the properties of atoms, but would directly affect specific human beings.  Second, where do you draw the line in genetic manipulation?  Modifying DNA sequences to try to avoid diseases or debilitating health conditions is one thing, but what if scientists want to edit genes to create humans who are smarter, or more athletic, or taller?  Do we really want to permit the creation of “designer people” — like Khan Noonien Singh, that memorable Star Trek character who was genetically modified to be a kind of superhuman?  And finally, as this article points out, the whole issue brings up uncomfortable memories of the eugenics arguments of the early 20th century, when certain ethnic groups and traits were considered superior and others inferior.  If “improved” humans are created, where does that leave the rest of us?

In my view, this is an area where a sweeping rule makes sense — at least initially.  I think we need a lot more evidence, and a lot more thinking, before we should allow scientists to go messing around with human genetic material.

Brown-Eyed And SAD

In the Midwest, Seasonal Affective Disorder (aptly known as SAD) is a real issue.  During the months between November and March, when the days are short and the skies are almost unrelentingly gray and gloomy — like this picture I took on Saturday from our back steps — lots of otherwise sturdy and resilient Midwesterners find themselves down in the dumps and absolutely sick to death of overcast weather.

Scientists are taking SAD seriously and have conducted several studies of the condition.  The data indicates that about five percent of Americans experience SAD — I’d be willing to bet that the percentage is a lot higher in the Midwest during the winter months — and women are about four times as likely to have the condition as men.  And now a study has concluded that people with brown eyes may be more likely to experience SAD symptoms.  The study also indicated that blue-eyed people, in contrast, are less affected by the lack of sunlight.

Why would eye color matter?  Sunlight affects mood and vitality through the eyes.  The author of the paper about the study hypothesizes that “the blue eye mutation was selected as a protective factor from SAD as sub-populations of humans migrated to northern latitudes.” The mutation that led to blue eye color occurred about 10,000 years ago and was thought to simply be associated with “the general package of pale skin in northern latitudes.”  The scientist now thinks that “given that frequencies of blue eye coloration reach their highest proportions in the most northerly latitudes of Europe, and given SAD rates reach their highest figures at the most northerly latitudes, then another possibility is that the blue eye mutation is maintained in such areas in order to alleviate the effects of SAD.”  In short, in the northern climates natural selection may have advantaged people with the blue-eyed mutation because they were more capable of dealing with the gloom than their brown-eyed friends and therefore were more likely to survive and reproduce.

It’s now the SAD season in the Midwest.  Fortunately, I’m not brown-eyed.  My eyes are a bright burnt sienna, and I’m not prone to SAD.  But lots of people around here are, and I sympathize with their reaction to the grayness.  Many Midwest snowbirds head south not so much in search of warmth as in search of sunlight.