Identifying The First Writing

Historians generally accept that the first writing, using cuneiform script, was developed in ancient Sumer, in the region of modern-day Iraq, sometime around 3300 B.C., and that the first hieroglyphics were created in Egypt soon thereafter. In short, the prevailing view is that spoken language existed for thousands of years before written language was invented.

The consensus among historians and archaeologists is that the invention of writing began with pictures representing objects, and then the savvy Sumerians realized that they could use symbols to represent sounds in their spoken language–which is the basic concept underlying cuneiform script. The symbols in cuneiform and hieroglyphics became easily recognizable as a form of writing when the ancients began creating clay tablets and papyrus scrolls and covering them with the symbols.

But how do we know for sure that there weren’t even earlier forms of writing–forms that use symbols that are obscure to us in the modern day, and aren’t seen as obvious attempts at writing because they don’t, for example, appear to be used for record-keeping? That’s a question that scientists and historians are considering in connection with the beautiful cave paintings of Lascaux, which are believed to have been created about 20,000 years ago–long before the first cuneiform appeared in Sumer. The cave paintings include dots and dashes and geometric signs, along with the striking and colorful representations of ancient animals and hunting scenes. Could those apparently intentional, non-representational markings have been some accepted written form of communication, like a prehistoric Morse code? That question has generated a lively, ongoing scientific debate, with some researchers arguing yes while others remain skeptical.

Of course, absent a new discovery of a Stone Age Rosetta Stone, we’ll probably never know for sure if the cave wall symbols are writing, and, if so, what they are meant to represent. But I suspect that the concept of writing came to early humans long before the ancient Sumerians invented cuneiform. Humans are communicating creatures, and if the creators of the Lascaux cave art used painting to communicate, as they clearly did, is it really so surprising that they might take the logical next step and use symbols, too?

The Unknown Ancestor In Our Human Family

The Earth of about 80,000 years ago must have been a pretty interesting place. That’s the point in time when our direct human ancestors left the African continent and began to spread across the face of the globe. As they spread, they encountered hominid cousins–cousins so closely related that, from time to time, our direct ancestors were able to interbreed with them and produce live, fertile offspring who, in turn, produced other children who entered into the ancient human genetic mix.

We know all of this because of ongoing human genome research, which analyzes human DNA and traces its elements back to their sources. That research has shown that our DNA includes elements from Neanderthals, like the thoughtful fellow pictured above, and Denisovans. Now researchers have identified a third, previously unknown, and as-yet-unnamed ancestor species that left an imprint on our DNA. The unknown species might be a product of interbreeding between Neanderthals and Denisovans or might be an entirely separate species.

Neanderthals, Denisovans, and the third species are now extinct, but they live on in their fractional contributions to our DNA, with most modern Europeans and Asians carrying a tiny part of Neanderthal DNA and most Melanesians and Australian Aboriginals carrying slightly larger amounts of Denisovan DNA. Researchers are trying to figure out what meaningful impact–if any–this “archaic DNA” has on the appearance, immune systems, and other characteristics of humans. That’s a complicated process, and the fact that we’ve now identified and welcomed to the human family another, previously unknown ancestor just makes the puzzle more challenging.

Making Oxygen On Mars

Thanks to the renewed interest in space exploration and improvements in rocketry technology developed by companies like SpaceX, we’re inching closer to the point where we might actually land human beings on the surface of Mars. But if we’re going to stay there for any meaningful length of time, we’ve got another challenge that we’ll need to overcome: the human visitors will need to breathe, and that means coming up with a reliable way to create a lot of oxygen in the decidedly carbon dioxide-rich, oxygen-poor Martian atmosphere.

Fortunately for the future explorers of Mars, it looks like the big brains at MIT have come up with a solution. They created a lunchbox-sized device called the Mars Oxygen In-Situ Resource Utilization Experiment, or “MOXIE,” that was taken to Mars as part of NASA’s Perseverance rover mission and has been on the surface of Mars since the mission landed in February 2021. The underlying concept of MOXIE was to use the carbon dioxide on Mars to create oxygen–which is a lot cheaper than trying to cart oxygen all the way from Earth.

MOXIE sucks in the thin Martian air, filters and pressurizes it, then moves it through the Solid OXide Electrolyzer (SOXE), an instrument created by OxEon Energy. SOXE splits the carbon dioxide-rich air into oxygen ions and carbon monoxide, isolates the oxygen ions, and combines them into O2, the kind of oxygen humans breathe. Once MOXIE confirms that breathable oxygen has been created, it releases the oxygen and carbon monoxide into the Martian atmosphere.
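In simplified terms, MOXIE’s process is solid oxide electrolysis of carbon dioxide. Here is a sketch of the underlying chemistry, based on how solid oxide electrolyzers generally work rather than on any formula published by the MOXIE team:

\[
\mathrm{CO_2} + 2e^- \rightarrow \mathrm{CO} + \mathrm{O^{2-}} \;\;\text{(cathode)}, \qquad 2\,\mathrm{O^{2-}} \rightarrow \mathrm{O_2} + 4e^- \;\;\text{(anode)}, \qquad \text{net: } 2\,\mathrm{CO_2} \rightarrow 2\,\mathrm{CO} + \mathrm{O_2}
\]

Each carbon dioxide molecule gives up one of its oxygen atoms as an ion, and pairs of those ions are then combined into breathable O2, leaving carbon monoxide as the exhaust.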

MOXIE has been powered up on multiple occasions since its arrival, during different Martian seasons and at different times of the day and night, to see whether it works in different conditions and different temperatures. So far, MOXIE has worked under all conditions except dawn and dusk, when the temperature is changing rapidly, but the MIT team believes it has a solution to that problem. The little lunchbox-sized device creates six grams of oxygen per hour, which is about the same amount as a small tree on Earth.
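That output is modest when measured against human needs. Assuming a commonly cited requirement of roughly 840 grams of oxygen per day for a single astronaut (an assumption made here for the sake of comparison, not a figure from the MOXIE team), the arithmetic looks like this:

\[
6\ \mathrm{g/hr} \times 24\ \mathrm{hr} = 144\ \mathrm{g/day} \approx \tfrac{1}{6}\ \text{of one person's daily oxygen}
\]

That gap explains why the plan, described next, involves much bigger versions of the device.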

When we get to the point of sending a human mission to Mars, the plan would be to send much bigger versions of MOXIE to the Red Planet ahead of the human mission, power them up, and let them generate a repository of oxygen that would supply the needs of both the human visitors and the rocket that would take the humans back home to Earth. Pretty cool!

The Perils Of Picking

Kids learn that they aren’t supposed to pick their noses at an early age, when their horrified mothers tell them it’s disgusting, ill-mannered behavior and they should stop doing it–right now! A recent study suggests that there is another potential reason to heed your mother’s edict: there could be a connection between damage to the internal tissues of the nostrils and dementia.

The study looked at the transmission of bacteria in mice and focused on Chlamydia pneumoniae, a species of bacteria that infects both mice and humans. That bacterium not only can cause pneumonia, as its name suggests, but also is found in many human brains afflicted with late-onset dementia. The study determined that the bacteria can travel quickly up the olfactory nerve that connects the nasal cavity and the brain to reach and infect the central nervous system and brain cells. The study also found that when the bacteria reached the mouse brains, the mice responded by creating amyloid-beta proteins to combat the infection–and amyloid-beta protein clumps are typically found in humans suffering from Alzheimer’s disease.

Moreover, the study showed that when there is damage to the nasal epithelium, the delicate membrane at the roof of the nasal cavity that separates it from the brain, the nerve infections get worse. And that’s where nose-picking–which can damage that protective layer of tissue between the nasal cavity and the brain–enters the picture.

We have a lot to learn about the causes of dementia, and studies of mice obviously don’t necessarily translate to humans. But if it’s even remotely possible that you can reduce your chances of developing dementia by refraining from self-inflicted nostril probes, it’s yet another reason to heed your Mom’s advice and keep your fingers away from your nose.

Redefining “Personal Hygiene”

Basic acts of personal hygiene have been in the headlines lately. First, an Iranian hermit described as “the world’s dirtiest man,” who hadn’t bathed in 60 years because he believed soap and water would make him sick, died recently at age 94. Someone dying at age 94 wouldn’t be especially noteworthy–except that the media reports of his death emphasized that the man, pictured above, became sick only after nearby villagers persuaded him to finally go for a wash-up, and he unfortunately went downhill after that.

Now a supposed “hygiene expert” has endorsed the approach of the hermit, contending that you not only don’t need to take a shower every morning, you don’t need to shower, period. Professor Sally Bloomfield of the London School of Hygiene and Tropical Medicine says daily bathing is “really not important” to hygiene and only became de rigueur to avoid offensive personal aromas. She says too much effort in applying soap and water could strip the human body of microorganisms that perform important functions and leave your skin dried and cracked, besides. According to the Professor, the only time you should shower off is before going into a swimming pool, because immersion could cause the microbes on our bodies to be transferred to a fellow swimmer–a concept that doesn’t exactly make me eager to go for a dip.

It’s hard to imagine what it must have been like to be within nostril range of someone who hadn’t bathed in 60 years and looked as filthy as the Iranian hermit did in the above photo. It’s also hard to imagine what the working world would be like if people who work in physical labor jobs, or in close proximity to others, stopped performing their daily ablutions. And it’s harder still to imagine that anyone whose mother drilled in the notions that cleanliness is next to godliness and that daily washing, including behind the ears, is essential to a place in polite society could ever give up hopping into a morning shower for a good, hot scrub.

In short, Dial soap used the phrase “Aren’t you glad you use Dial? Don’t you wish everybody did?” for a reason. Our mothers were right: odor avoidance and regular dirt removal are important parts of real personal hygiene and the general social compact. Let’s all resist the temptation to go full hermit, shall we?

An End To Nightmares

Could technology bring about an end to recurrent nightmares? Scientists think they may have found a way to redirect the sleeping brain away from those disturbing bad dreams that cause the frightened sleeper to awake with a start, with their heart hammering.

The development came during a study of people, estimated to be as many as four percent of all adults, who experience nightmares at the “clinically significant” level. Nightmare issues are deemed “clinically significant” when they occur more than once per week and cause other symptoms, like general anxiety and daytime fatigue.

The study divided 36 participants into two groups. One group received imagery rehearsal therapy (“IRT”), an existing form of treatment where they were instructed to recount their bad dreams, develop alternative, happier endings to the dreams, and then rehearse those happy endings during the hours when they were awake.

The other participants received IRT treatment, with a twist: as they envisioned more positive scenarios for their nightmares, a major piano chord was played every ten seconds, in an attempt to have the happier endings associated with the sound. The participants then went to bed wearing headbands that were capable of detecting when the sleeper had entered the rapid eye movement (“REM”) phase, when dreaming occurs, and playing the major chord associated with positive outcomes. The sound evidently helped to generate the positive outcomes, because while both groups saw a decrease in nightmares, the results were significantly better for the headband-wearing group, both immediately during the treatment and for months thereafter.

My dreams are mostly a confused rehash of things that happened during the day, as if my unconscious brain is trying to sort diverse experiences and inputs into a narrative–and since the experiences and sensations aren’t logically connected, the dream ends up making no sense. Fortunately, I don’t have recurrent nightmares, other than the “I’ve got an exam and I didn’t prepare” dream that I still get occasionally, decades after my schooling ended. I can imagine, however, that people who do experience nightmares at the clinically significant level will welcome a therapy that works. Wearing a headband and listening to piano chords would be a small price to pay to avoid waking up in terror. And the results also provide interesting insight into the power of music and its impact on the unconscious brain.

Fake Smiles And True Feelings

People have thought about fake smiles for a long time–probably for about as long as human beings have walked upright and the act of smiling became associated with happiness. They are curious about how to distinguish a fake smile from a real one, and why people fake smiles in the first place. Researchers have even examined whether working in a job where you are supposed to give a cheery smile to even unpleasant customers for your entire shift is likely to make you drink more at the end of the work day. (Spoiler: it looks like it does.)

But what about fake smiles outside the workplace, where you don’t have to give that grimace grin for eight hours while interacting with jerky customers? Does forcing a smile make you feel happier? The question has been the subject of scientific debate for so long that even Charles Darwin weighed in on the topic. In The Expression of the Emotions in Man and Animals, Darwin argued that “even the simulation of an emotion tends to arouse it in our minds”–but different studies over the years have produced different results.

Recently, researchers decided to test the hypothesis again, with a study of 3,800 people from 19 countries who were asked to respond to different prompts with a smile or a neutral expression, and then rate their happiness. The prompts were disguised and mixed in with other facial-expression tasks and even math problems, so participants presumably didn’t know that they were involved in testing whether a fake smile actually produces a happier perspective. The results suggest that faking a smile does, in fact, tend to make the fake smiler feel incrementally happier, at least in the short term.

So old Chuck Darwin apparently was right again: forcing a grin will cause momentary changes in attitude–at least so long as keeping that fake smile on your face isn’t one of the requirements for your job at the neighborhood coffee shop.

The DART Hits The Bullseye (II)

When we last checked in on the NASA Double Asteroid Redirection Test (“DART”) probe, the golf cart-sized spacecraft had successfully smashed into Dimorphos, the asteroid circling its big brother Didymos. What wasn’t clear at that point was whether the successful navigation of the DART into Dimorphos had changed the asteroid’s trajectory.

Now we know: the DART not only hit the bullseye, it successfully changed the trajectory of the asteroid and exceeded expectations in doing so. Mission planners hoped that the DART would be able to shorten the length of time it takes Dimorphos to circle Didymos by 10 minutes, and measurements reveal that the collision with the DART shortened the orbit by 32 minutes.
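To put those numbers in perspective: before the impact, Dimorphos took roughly 11 hours and 55 minutes (about 715 minutes) to circle Didymos, according to NASA’s pre-impact measurements, so the change works out to:

\[
\frac{32\ \text{minutes}}{715\ \text{minutes}} \approx 4.5\%
\]

That is a small nudge in absolute terms, but it is more than three times the 10-minute goal, and plenty if you are trying to steer an asteroid off a collision course years in advance.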

The success of the DART is a big moment in developing a planetary defense to a potentially catastrophic asteroid strike. NASA Administrator Bill Nelson observed: “This mission shows that NASA is trying to be ready for whatever the universe throws at us. NASA has proven we are serious as a defender of the planet. This is a watershed moment for planetary defense and all of humanity, demonstrating commitment from NASA’s exceptional team and partners from around the world.”

Thanks to the DART, we are no longer entirely at the mercy of the asteroids and meteors hurtling around our solar system. It’s not only cool, it’s great news for the future of Homo sapiens and the other species that share planet Earth with us.

Analyzing Healthy Weight

What’s the “right” weight? It’s a question that doctors and their patients have wrestled with for years, and it’s clear that the standards are changing as human diet, nutrition, activity level, and general health are changing. Humans during the 1400s, being subject to periodic famines, plagues, and disease that stunted their growth, and engaging in day-long physical labor to put modest amounts of food on the table, probably looked a lot different from modern Americans. Even in the last century, the standards have changed. Consider, for example, that the average G.I. in World War II was about 5′ 8″ and weighed about 150 pounds. These days, you don’t see many 150-pound men in the average American city.

So what’s the “right” weight now, in an era of relative food abundance and modern medical treatments for human disease, where many people work at sedentary desk jobs?

For years, the accepted method for determining healthy weight has been the body mass index. The BMI is simple: your weight in kilograms divided by the square of your height in meters. The target zone for a healthy you is a BMI between 18.5 and 24.9. Now there is a debate about whether the BMI is really an effective tool, because it doesn’t consider where human fat cells have accumulated. That’s important, because the location of fat cells matters to human health and is related to conditions like diabetes, heart disease, and some forms of cancer. Abdominal fat–that “stubborn belly fat” that clickbait articles claim you can melt away with some “weird trick” or special drink–is more unhealthy than fat that accumulates around the hips, and “visceral fat,” the abdominal fat that builds up around the internal organs, is especially harmful.
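As a worked example, with hypothetical numbers of my own choosing: a person who weighs 70 kilograms (about 154 pounds) and stands 1.75 meters (about 5 feet 9 inches) would calculate:

\[
\mathrm{BMI} = \frac{\text{weight in kg}}{(\text{height in m})^2} = \frac{70}{(1.75)^2} = \frac{70}{3.0625} \approx 22.9
\]

comfortably inside the 18.5 to 24.9 target zone.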

As a result, some researchers are urging that use of the BMI be replaced by a focus on the waist-to-hip ratio. The waist-to-hip ratio is easy to use, too–you apply a tape measure to your waistline and your hips, and determine the ratio between them. Lower waist-to-hip ratios mean lower abdominal fat accumulation. And a recent study found that the waist-to-hip ratio was a better predictor of early mortality than the BMI.
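The arithmetic is as simple as it sounds. With hypothetical measurements of a 32-inch waist and 40-inch hips:

\[
\text{waist-to-hip ratio} = \frac{32}{40} = 0.80
\]

For reference, the World Health Organization treats ratios above roughly 0.90 for men and 0.85 for women as markers of abdominal obesity.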

There’s no doubt that losing excess weight is helpful to overall health; your hips, knees, and ankles will thank you. But the distribution of weight also matters. We’ll probably never avoid the scale at the doctor’s office, but the predictive value of the waist-to-hip ratio may mean your doctor will be taking out a tape measure, too, at your next exam.

The DART Hits The Bullseye

Our space neighborhood is filled with comets, meteors, asteroids, and other random bits of rocky flotsam and jetsam, any one of which could come plummeting through the Earth’s atmosphere and slam into our planet. Over Earth’s long history, many objects have done precisely that. That reality is of no small concern, because if the object is large enough, the impact could have catastrophic, climate-altering consequences. Some scientists theorize, for example, that the extinction of the dinosaurs occurred because of the after-effects of a gigantic and devastating meteor strike that occurred 65 million years ago.

The fact that humans haven’t had to deal with a similar random, collision-caused disaster has been the product of sheer dumb luck–until now. Thanks to the scientists and engineers at NASA, and the successful test on Monday of a suicidal spacecraft called the Double Asteroid Redirection Test (“DART”) probe, we’ve finally got a fighting chance.

The DART mission sought to show that the paths of killer asteroids could be deflected away from Earth by being rammed by a spacecraft. The target of the mission, at a distance of about 7 million miles from our planet, was an asteroid called Dimorphos, and the goal was to change its orbit around a larger asteroid called Didymos. The DART probe, which was about the size of a golf cart and weighed 1,320 pounds, slammed into Dimorphos at a brisk 14,000 miles per hour, with the goal of nudging the asteroid into a speedier orbit around Didymos. Happily, the DART probe hit the Dimorphos bullseye, and as it approached it provided a continuous stream of photos, like the one above, that made the asteroid target look like a rock-studded egg in space. The ultimate crash of the DART into the target also was captured by many Earth-based telescopes. You can see the video of the collision taken from one telescope here.
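A back-of-the-envelope calculation gives a sense of the energy involved. Converting the reported figures to metric units (1,320 pounds is about 600 kilograms, and 14,000 miles per hour is about 6,260 meters per second), the kinetic energy at impact was approximately:

\[
KE = \tfrac{1}{2}mv^2 \approx \tfrac{1}{2} \times 600\ \mathrm{kg} \times (6{,}260\ \mathrm{m/s})^2 \approx 1.2 \times 10^{10}\ \mathrm{J}
\]

which is roughly the energy released by three tons of TNT, all delivered to Dimorphos in an instant.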

So, did the ultimate sacrifice willingly undertaken by the DART probe successfully change the orbit of Dimorphos, as we hope? We don’t know for sure, yet, but we’ll find out as the asteroid is monitored, and its orbit path is measured, over the next few months. But just being able to navigate a golf cart-sized spacecraft moving at 14,000 miles an hour into a moving asteroid seven million miles away is a pretty good start toward developing a planetary defense system that will protect our species, and other inhabitants of planet Earth, from the ravages of killer asteroids.

Bang, Or No Bang?

Science can be great. The world of science, in most cases, allows for vigorous debate, even about the most fundamental, seemingly settled concepts–and as new data comes in, that process of debate happens over and over again. Sometimes the novel theory actually topples the old assumptions–as when Copernicus argued that the Earth revolves around the Sun, or when Einstein’s thought experiments and calculations dislodged Newtonian theories about gravity. At other times, the new theory is shown to be a bunch of hooey, the product of shoddy science and cherry-picked data.

There’s a vigorous argument along those lines going on now in the world of astronomy and cosmology. The issue is whether the incredible photographs being produced by the James Webb Space Telescope are inconsistent with the “Big Bang” theory–the widely accepted concept that the universe started billions of years ago with an enormous explosion that occurred everywhere at once and has been expanding in all directions ever since.

An article published in early August argued that the Webb telescope photos are inconsistent with the Big Bang theory because the distant galaxies shown in the photos look different than what the Big Bang theory predicts. Other scientists reject that argument as science denialism; they note that while the Webb telescope images of faraway galaxies show structures that are more evolved and coherent than was expected, that result does not undercut the Big Bang and in fact is consistent with the theory. As one article published earlier this month on space.com puts it: “The surprising finding that galaxies in the early universe are more plentiful, and a little more massive and structured than expected, doesn’t mean that the Big Bang is wrong. It just means that some of the cosmology that follows the Big Bang requires a little bit of tweaking.” 

The constant revisiting and revision of theories as new data comes in is what makes science so cool. The Webb telescope, the data it is gathering, and the discussion it is generating, are doing exactly what the process of science contemplates.

20 Quadrillion Ants

How many ants are there in the world? It’s the kind of dreamy question you might have briefly asked yourself as a kid on a lazy summer day as you were checking out an anthill that was teeming with the busy little creatures, just in one corner of your backyard. Sometimes, though, the subject of a child’s idle wonder becomes a scientist’s challenge–and Nature has published an article that tries to answer that question.

The first step in the challenge is trying to come up with a mechanism that would allow you to approximate the number of ants on Earth, because you obviously couldn’t count them, one by one, even if all of those notoriously active insects would oblige you by holding still. To give you a sense of scale, there are 15,700 named species and subspecies of ants. They are found in and on virtually every piece of dry land in the world and in the widest possible range of habitats, including cities, deserts, woodlands, grasslands, and especially rain forests. The National Wildlife Federation website states that the only land areas that don’t have ants are Antarctica, Greenland, Iceland, and a handful of islands.

So how can the Nature researchers hope to count them? By piecing together the findings of 489 independent studies that have attempted to count ant populations on every continent and in every habitat where they are found, using standard ant-counting methods. By extrapolating from this direct data, the researchers estimate that there are 20 quadrillion–that’s 20,000,000,000,000,000–ants in the world. That’s a lot of ants. But that finding admittedly doesn’t give a complete picture, because there are no studies of how many ants live underground or in trees. 20 quadrillion therefore could easily be an undercount.

But the Nature researchers didn’t stop there. They wondered how much all of those ants would weigh, did the math, and concluded that the 20 quadrillion ants would have a biomass of 12 million tons of carbon, which is more than all of the world’s wild birds and wild mammals combined. (Carbon accounts for about half the weight of ants.) And, as the researchers point out, we should be glad there are so many ants around, because they play a crucial role in the ecosystem in multiple ways, including serving as food for many species.
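Those figures also permit a fun bit of long division. Taking the study’s numbers at face value, 12 million tons of carbon is 1.2 × 10¹³ grams, spread across 2 × 10¹⁶ ants:

\[
\frac{1.2 \times 10^{13}\ \mathrm{g}}{2 \times 10^{16}\ \text{ants}} = 6 \times 10^{-4}\ \mathrm{g} = 0.6\ \mathrm{mg\ of\ carbon\ per\ ant}
\]

If carbon is about half of an ant’s weight, that works out to roughly 1.2 milligrams of dry weight for the average ant.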

Ants are kind of fascinating to watch on a lazy summer afternoon, too.

Skin Story

Many of us have spent significant chunks of time this summer dabbing and smearing lotion on ourselves and our family members. It used to be called suntan lotion; now it’s called sunscreen or even sunblock. Worried people search constantly for ever-higher SPF numbers, spurred by fear of sunburn and dermatologists’ cautions about sun-related skin cancers.

The sunscreen issue is interesting when you think about it. Our ancient ancestors obviously spent a lot of time outdoors, hunting and gathering, and they didn’t have ready access to drugstores that provided rows of 50 SPF lotions. So how did they deal with the sun?

I ran across an interesting article by an anthropologist that tries to answer that question. He notes that the early humans didn’t fear the sun, thanks to their skin–specifically, the crucial protection provided by the epidermis, the outer layer of skin that adds new cells and thickens with increasing exposure to sunshine in the spring and summer, and eumelanin, a molecule that absorbs visible light and ultraviolet light and causes skin to darken due to sunshine. Because early humans didn’t radically shift their sun exposure by, say, hopping on a jet to Costa Rica in the dead of winter, their skin could adjust to their local conditions and provide all the sun protection they needed. In effect, their skin became well adapted to providing the protection needed in their local area. (Of course, they may have looked a bit leathery by modern standards, but they weren’t worried about such things in their desperate bid for survival in an unpredictable and unforgiving world.)

The article posits that the change in the relationship between humans, skin, and sunshine occurred about 10,000 years ago, when Homo sapiens began to develop more of an indoor life and exposure to the sun began to distinguish the lower class from the upper class. People became more mobile, too. The disconnect was exacerbated when people started to take vacations to warmer climates that abruptly changed sun conditions without a ramp-up period allowing their skin to adapt. In short, the trappings of civilization and class removed the previous balance between skin and local conditions and deprived our skin of the time needed to adjust to gradually increasing sunshine.

Does that mean you should try to recreate the former balance by staying in the same place, spending as much time as possible outdoors, and accepting the wrinkles and leathery look that are the likely result? The article says no, because your skin probably isn’t matched to your current location, and your indoor time is going to interfere with the process. That means we all need to keep dabbing and smearing to prevent sunburns and skin damage.

Incidentally, the highest-level sunscreen that is available now is 100 SPF, which is supposed to block 99 percent of ultraviolet rays. The ancients would shake their heads in wonder.
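That 99 percent figure follows from the usual rule of thumb that a properly applied sunscreen transmits about 1/SPF of the burning ultraviolet rays:

\[
\text{fraction blocked} \approx 1 - \frac{1}{\mathrm{SPF}}, \qquad 1 - \frac{1}{100} = 99\%
\]

By the same math, SPF 50 blocks about 98 percent and SPF 30 about 96.7 percent, which is why the marginal benefit of ever-bigger numbers is smaller than the labels suggest.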

Ancient Surgery (And Post-Operative Care)

One of the most tantalizing aspects of human history is how little we know about our ancient forebears. Once you go back more than 5,000 or 10,000 years, to the period before written records and the age of surviving structures like the Sphinx and the pyramids of Egypt, there is little specific evidence of what humans did or how they lived. And the great length of that unknown period of human prehistory, stretching back tens of thousands of years, dwarfs the length of the historical record.

We tend to assume that our prehistoric ancestors were crude, ignorant people who hunted, ate, reproduced, and lived short, dangerous, violent lives. Every once in a while, however, scientists uncover something that challenges that assumption. The latest evidence that the ancients were more knowledgeable and more capable than we might have thought comes from a cave in Borneo, where scientists unearthed a 31,000-year-old skeleton of a young adult. The remarkable feature of the skeleton was that the bones revealed a successful amputation of the individual’s ankle–and that the patient then lived for years afterward.

Successful amputations require significant medical knowledge. Practitioners must know about the structure of bones, blood vessels, and muscle tissue; where and how to cut to remove the ruined bone and flesh; the need to leave flaps of skin to cover the remaining exposed bone; and how to close the wound, stop the bleeding, and avoid infection. Before this recent discovery, the oldest known evidence of an amputation dated to 7,000 years ago in France. The Borneo discovery pushes that medical knowledge back to a point more than 20,000 years earlier, and indicates that, in at least some areas, ancient humans were much more medically sophisticated than we believed. It makes you wonder: if Borneo communities had knowledgeable doctors 31,000 years ago, what other medical knowledge did they possess, and, for that matter, how sophisticated were their scientific, religious, philosophical, and political beliefs?

There is another, equally compelling conclusion to be drawn from the Borneo discovery. The wound healed, and the patient, who scientists believed was a child when the injury occurred, lived for years afterward. Given the rugged local terrain, like that shown in the photo above, surviving with only one working leg would have been impossible without the help of caregivers–and in all likelihood the entire tribe or local community. That necessary reality confirms that our ancestors weren’t thoughtless savages, but were decent, generous people who took care of each other. That conclusion also makes me feel better about our species.

Earliest Memories

The other day I was thinking about what I believe is my earliest memory. It’s a difficult thing to do, because typically human memories don’t quite work that way; it’s not as if they are kept in a chronological filing cabinet. Instead, memories seem to be stored in the brain in a way that causes them to be triggered by external phenomena: a song, perhaps, or a situation, or a physical setting might provoke an avalanche of recollection. It’s therefore possible that I have an earliest memory that just hasn’t been triggered yet.

That said, the earliest recollection I can muster involved sitting in a big leather swivel chair, next to my brother Jim, at our Dad’s office when he worked as a bookkeeper for a construction company. I remember sitting on the chair as we swiveled around, looking at a safe with a big combination lock and a handle that was kept in Dad’s office to store the cash receipts. We liked rotating the chair like a merry-go-round and messing with the big lock on the safe. I’m not quite sure why I have this memory–perhaps it was because we had never been to Dad’s office before, and it was interesting to see it–but it is definitely an old one. I’m not sure exactly when Dad worked at the construction company, but the time period would have been in the pre-kindergarten years, perhaps when I was three or four.

A recent study suggests that many people can identify memories dating back to the age of two-and-a-half, and that people also tend to misdate their earliest memories and assign them to later points in their lives. It isn’t clear why two-and-a-half seems to be the cutoff point–perhaps the brain just isn’t ready to begin significant storage before then, or perhaps the things that are happening before that age aren’t specifically memorable–but the authors of the study suggest that if you want to try to remember your earliest memories, you just need to work at it, because summoning up early memories often has a kind of cascading effect. But be careful: studies also suggest that what many people think is their earliest memory is fictional, particularly if it goes back beyond the age of two or so. Those “memories” often aren’t true memories, but instead are descriptions of family photographs or ingrained family stories that have been implanted in the brain over the years.

I’m pretty sure my swivel chair memory is a true memory, and not a later implant, but of course there is no way to know for sure. The “earliest memory” issue does make you realize that your brain is kind of like your grandmother’s attic, with all kinds of weird stuff stored up there, and you’re not quite sure why some memories got stashed and others didn’t.