The March Of Trivialization

Historians used to write and talk about the “march of civilization.” You can find books by that name on Amazon, and learned quotes that use that phrase on any search engine. The underlying notion was that the story of humankind was a continuous upward journey from barbarism to the glories of the modern world. And the implication was that the march would inevitably continue to ever greater heights of achievement and refinement.

I don’t think the “march of civilization” concept holds true anymore. The idea presupposes that human beings will continue to work hard at bringing about a better world of ever greater accomplishment and magnificence, and these days it seems like we are a lot more interested in being distracted than in knuckling down. You might say that we are in the midst of a “march of trivialization” instead.

Consider what happens when you go to the Google app on your phone to look for something. Before you can type in a word you’ll see snippets of a series of curious stories, like the ones shown on the screen shot at the top of this post, that are designed to pique your interest, get a click, and divert you from what you were going to do in the first place. And the stories that are featured are breathtakingly banal and ultimately pointless, like the story above about the “viral video” of a woman stepping out of an Amazon truck and how people you don’t know on social media have responded to it. Typically there are multiple stories about social media videos or feigned outrages, photos of celebrities and members of the British royal family, sports world “reactions” to a play or announcement, and speculation about when a fourth (or fifth, or sixth) “stimulus” payment might be made. If you went solely by the stories on the Google front page and tried to draw inferences from them, you probably would conclude that we live in a world where there are no real, significant problems that constitute news, which is why TikTok videos and celebrity fashion dominate.

Of course, that inference would not be correct. There are lots of actual problems out there that could be the subject of the Google front page stories–but they aren’t. Why do you suppose that is the case? Is it because Google once tried to feature actual news and saw that it garnered far fewer clicks than the junk stories, or that Google figured at the outset that people typically use Google for trivial purposes–like trying to find the actual name of the character called “the Professor” on Gilligan’s Island–and therefore would prefer the amazingly inconsequential fare that we see today?

Whatever the reason, the march of trivialization continues, distractions ever multiply, and the insignificant crowds out the significant. Social media has replaced religion as the so-called “opiate of the masses,” and is keeping people from paying attention to what actually counts. It’s a weird and troubling feature of modern life.

An App Too Far

Governments the world over have struggled to address the COVID-19 pandemic. In the United States, we’ve seen large-scale shutdowns of businesses, mask mandates on planes and in buildings, and social distancing and stay-at-home orders. But it is the Land Down Under — Australia — that has really pushed the envelope.

This week The Atlantic carried an eye-opening article about some of the governmental edicts that have been imposed in Australia–edicts so draconian that the article carries the provocative headline “Australia Traded Away Too Much Liberty.” Consider this partial list of emergency decrees and requirements:

  • Australia has dramatically curtailed its citizens’ ability to leave the country. The article quotes a government website (which you can see here) that states: “Australia’s borders are currently closed and international travel from Australia remains strictly controlled to help prevent the spread of COVID-19. International travel from Australia is only available if you are exempt or you have been granted an individual exemption.”
  • Travel between the six states that make up Australia also is restricted. You can access the governmental website that discloses the current restrictions, which include closing state borders, limiting ability to travel within a state, and mandatory quarantines, here.
  • States have imposed curfews, have banned anti-lockdown protests, and have used the military to disperse and arrest anti-lockdown protesters in Sydney and Melbourne. In Sydney, more than five million people have been in lockdown status for more than two months.

But the most draconian requirement of all is being tested and rolled out by the state of South Australia. It’s an app that the state would require its citizens to download, and the Atlantic article describes it as follows:

“People in South Australia will be forced to download an app that combines facial recognition and geolocation. The state will text them at random times, and thereafter they will have 15 minutes to take a picture of their face in the location where they are supposed to be. Should they fail, the local police department will be sent to follow up in person. ‘We don’t tell them how often or when, on a random basis they have to reply within 15 minutes,’ Premier Steven Marshall explained. ‘I think every South Australian should feel pretty proud that we are the national pilot for the home-based quarantine app.’”

It’s a pretty amazing development when a democratic government claims the ability to unilaterally require citizens to download an app, respond to random government texts within a specified time period with a personal photo showing they are in “the location where they are supposed to be,” and receive a visit from the local police if they fail to do so. It’s even more amazing that the head of that government actually thinks citizens should be proud that their state government is the leader in imposing that kind of extraordinary government intrusion. I’d like to think that no duly elected government in America would think that kind of action was anything other than an egregious overreach–but then, I would have thought the Aussies would never have done anything like that, either.

There’s obviously a delicate balance between preserving individual rights and liberties and dealing with public health issues. As The Atlantic article notes, Australia’s dramatic decrees can be cited as allowing it to achieve COVID-related death statistics that are far below those in the U.S. But Australia also shows how the balancing of health and rights can tip decidedly to one side, in a way that strikes at the core of freedoms that are a defining characteristic of democratic societies. Citizens of other countries should be looking at what has happened in Australia and asking themselves: “Was it worth it?” and “Could that happen here?”

The Scientific Scourge Of Fake Data

Almost 10 years ago, a significant study on personal honesty was published. It indicated that a simple method reduced lying by respondents filling out forms: if people signed an honesty declaration at the beginning of the form, rather than at the end, they were supposedly less likely to lie in their answers. The study was cited by other researchers and featured in a bestselling book written by one of its principal authors.

Now that study is being retracted. Over the years, efforts to replicate the results of the study have been unsuccessful, but now a more serious issue has been uncovered. Academics who took a close look at the underlying data cited in the study have determined that one of the main experiments cited in the study was faked, and that the data related to that experiment is fraudulent. The researchers who published the initial study agree and have asked the journal that published the initial study–the Proceedings of the National Academy of Sciences–to formally retract it.

It’s ironic that a study drawing conclusions about personal honesty would be based on fake data, but it’s the latest high-publicity example of a significant problem in the scientific community. Some have called it the “replication crisis.” We remember from our high school science classes that the scientific method involved developing a hypothesis, creating and conducting an experiment designed to test the hypothesis, describing the experiment and honestly publishing its results, and then letting the rest of the scientific community challenge the hypothesis, the experiment, and the data. The last step, in which other scientists played the role of skeptic and fact-checker and verifier by trying to replicate the experiment and test its results, was a key part of the whole process. And in the past, peer-reviewed journals played an important role in ensuring that the results of the experiments could, in fact, be faithfully replicated and the conclusions drawn were credible.

But something has obviously gone wrong, as a number of high-profile research findings can’t be replicated and there is increasing concern that data isn’t being collected or reported honestly or accurately. The “social sciences,” which encompass the honesty study noted above, have been especially affected by the replication problem. And in the case of the honesty study, no one seems to know how the faked data was created in the first place. Four of the five authors of the study say they weren’t involved with collecting the false data, and the other one denies that he had anything to do with it. So, how did it happen, and why didn’t the initial authors of the study carefully review the faked data and question its bona fides before publishing the results? Some observers wonder if the behavioral studies that are now a staple of news feeds aren’t being influenced by the desire to create headlines and achieve clicks, leading researchers to overlook questionable data or methodologies.

You see signs these days that say that “science is real.” That’s obviously true, but the replication crisis demonstrates that not all scientific results are real. There’s nothing wrong with having a healthy skepticism about groundbreaking studies or sweeping pronouncements until after the underlying data has been thoroughly vetted and other researchers have replicated the results. As our high school science teachers instructed us, that’s what should have been happening in the first place.

Lessons From Churchill

I’ve just finished Andrew Roberts’ titanic Churchill: Walking With Destiny, about one of the leading historical figures of the 20th century. The 1,000-page volume, published in 2018, draws upon recently released historical documents to trace Winston Churchill’s life in exacting detail, from his early childhood and painful desire to be loved and respected by his father–something that never happened, sadly–through years of turmoil, disaster, and triumph. It’s a fascinating tale of a colossal figure who first came to prominence in the high Victorian era, at the apex of the British Empire, saw Great Britain and its empire fight two world wars, witnessed the dissolution of that empire, lived into the era of the Beatles, and was celebrated with one of the largest state funerals ever given to a non-royal Brit.

Roberts’ book is a compelling read about a fascinating individual. Churchill was a well-rounded figure, with many virtues, and a lot of flaws, too. He was a glory hound in his early days, and his love for the British Empire brought with it a benighted attitude about race and people in the Empire, as well as a belief in the superiority of the British approach that caused him to accept risks that shouldn’t have been accepted. On the other hand, he was extraordinarily hard-working, brilliant, a gifted writer, a great wit, a compelling speaker who turned many a memorable phrase, and the unyielding leader whose fight and pluck and rhetoric stiffened Great Britain’s resolve and kept it in the war when it faced the German war machine, alone, during the dark days of World War II.

One of the book’s themes is that, for all of his brilliance and self-confidence, Churchill was someone who could learn from his many mistakes, rise above them, and–crucially–identify and assimilate changes to his world view that allowed him to avoid repeating them. Churchill’s advocacy of the bloody, ill-fated and ultimately disastrous Dardanelles expedition in World War I could have sent a lesser person slinking off to a life of obscurity, and it haunted Churchill, and was repeatedly mentioned by his adversaries, even when Churchill began serving as Prime Minister in 1940 after the fall of France. But Churchill didn’t let that colossal failure forever cripple his career; he learned from it and other errors and ultimately profited from the very hard lessons it taught. Churchill’s approach to his stout-hearted service during World War II was strongly informed by those lessons and his prior experiences–good and bad.

I’ve been reflecting on Churchill and that important element of his personality these days, when we have seen the United States take a huge black eye with its inept, disastrous, and humiliating failure in Afghanistan. Obviously, many mistakes were made, and there is plenty of blame to go around for all of the four Presidents, and their administrations, who contributed to the Afghan debacle. But the key point now is how to react to those obvious mistakes. Those of us who lived through Vietnam feel like we’ve seen this show before, and now wonder whether our country will ever learn. Will we finally focus our attention–and treasure, and finite resources–on the matters that are truly essential to our national security? Will we resist future temptations to try to build mini-Americas in faraway countries with radically different cultures and perspectives? Will we be able to recognize and avoid “mission creep,” identify the policies and institutional processes that produced the Afghan fiasco and change them, and actually hold accountable the incompetent people who failed to do their job and, in the process, put thousands of people at risk and cost us billions of dollars in equipment and money and a considerable part of our national reputation?

What has happened in Afghanistan is an embarrassment and an epic failure that featured countless mistakes and misjudgments. Having read Roberts’ biography, I’m convinced Churchill would have learned from those errors and recognized how to avoid them in the future. Can our country do the same?

Short People Got . . . .

Short people have had a tough time of it since basketball was invented and Randy Newman sang a mean-spirited song about them in 1977. And lest you think I’m being heightist in saying so, I should point out that, from a basketball perspective, anyone who is less than 6’1″ apparently is considered short–which puts me squarely into the “short people” category.

Now short people have something else to worry about: a Singapore study concludes that shorter people (in this case, people shorter than 5’5″) are at greater risk of contracting COVID-19. The study found that COVID-infected droplets that are expelled by a sneeze or cough tend to fall slowly to the ground, and the downward trajectory supposedly puts the height-challenged among us at greater peril of breathing in the droplets. The study recommends that short people maintain an even greater than normal social distance from taller people–two meters, which equates to a bit over 6.5 feet–to avoid being caught in the droplet fallout zone and wear masks, too. The study has been published in the journal Physics of Fluids.

Far be it from me to question a scientific study, but color me skeptical on this finding. I’m not sure that all sneezes and coughs propel downward, but in any case, isn’t there an easy way of testing this hypothesis? Has any seven-footer become infected by COVID? And are the heights of COVID hospital patients out of whack with the spread of heights in the population at large?

One of the problems with our current atmosphere is that alarming (and often dubious) information about COVID, and the delta variant, gets published every day. Before we start telling short people that they are at greater risk of contracting COVID, shouldn’t we do a bit more research to confirm that we’ve got it right, rather than scaring the dickens out of the portion of the population that tops out at below 5’5″?

Ocean No. 5

In case you’ve missed it, National Geographic has decided to officially recognize the ocean immediately around Antarctica as the Southern Ocean. It therefore becomes the fifth official “ocean”–as distinct from seas like the Mediterranean Sea, the Red Sea, the Caribbean Sea, and the South China Sea and countless bays, coves, and inlets. If you’ve forgotten this lesson from your geography class, the other official oceans are the Atlantic Ocean, the Pacific Ocean, the Indian Ocean, and the Arctic Ocean.

The National Geographic decision is a kind of belated codification of the status of the Southern Ocean, which many countries and geographers have recognized for a while. They point out that the Southern Ocean is simply different from other oceans in feel, composition, appearance, and danger. The Southern Ocean is defined by the Antarctic Circumpolar Current, which is a kind of moving water barrier that is colder, and less salty, than the surrounding water in other oceans.

One article describes the Southern Ocean in a way that makes it sound like an interesting place that would be well worth visiting:

“The Southern Ocean is unlike anywhere else on Earth. ‘Anyone who has been there will struggle to explain what’s so mesmerizing about it,’ says Seth Sykora-Bodie, a marine scientist at the National Oceanic and Atmospheric Administration (NOAA) and a National Geographic Explorer.  ‘But they’ll all agree that the glaciers are bluer, the air colder, the mountains more intimidating, and the landscapes more captivating than anywhere else you can go.’

“The Southern Ocean is a violent place. It’s where many of the massive swells that run into Teahupoo and Cloudbreak are born. In 2017, a wave of nearly unheard of proportions was measured there. Not only does it look different, the Antarctic Circumpolar Current is extraordinarily important to the Earth’s climate. It transports more water than any other current in any other ocean, sucking in water from the Atlantic, Pacific, and Indian Oceans. It’s a driving force behind the global circulation system called the conveyor belt, which moves warm waters all over the planet.”

(In case you’re interested and don’t want to click on the link above, the wave that is mentioned in the above snip was 64 feet tall–in the open ocean. 64 feet!)

It’s interesting to look at that map of Antarctica and the Southern Ocean at the top of this post. Most world maps don’t show Antarctica in its full glory, and show only a bit of it at the bottom of the map. Looking at it makes me interested in potentially seeing it one of these days–as long as I have assurance that we don’t encounter any 64-foot waves.

The Trouble With Harry

The other day I called up Google on my phone to do a quick search. As always happens, clickbait articles popped up, including this one on Yahoo about Harry Windsor sharing some new photos of his son and reporting on some of his child’s first words.

You remember Harry, I’m sure. He’s the guy who moved to the United States from the U.K. because he desperately wanted to get away from the suffocating attention paid to him and his extended family and go his own way with his wife and child. But poor Harry seems confused. He doesn’t seem to get the notion that if you want to live a private life and make it on your own, you need to actually live a private life. That means not giving interviews to famous celebrities, not participating in docuseries, and not sharing details about your life that are sure to attract more of the public attention that you claim to abhor.

Harry’s evident problem is that he seems to really like the attention, which he’s gotten his entire life. But it has to be the right kind of attention. Positive attention is just fine with Harry, but negative attention, or any criticism, makes him wonder why journalists and paparazzi and commentators can’t just leave him and his family alone.

Harry’s approach reminds me of our kitchen screen door during the summer months when I was a kid. We didn’t have air conditioning, so the only way to get air circulation in the house on a hot summer’s day was to open the inner door and let any precious breeze come through the outer screen door. But with five children in the family and a neighborhood that was chock full of rug rats, kids were constantly going in and out through the door, which had one of those spring devices that made it shut with a loud metallic clang. After putting up with a few dozen unsettling bangs, Mom would say, in exasperation: “In or out?”

And that notion applies equally to Harry. When it comes to celebrity status, you’re in or you’re out. If you want privacy, live privately. But if you crave some of that celebrity adulation, don’t come around whining when somebody makes a joke at your expense or raises questions about whether you are profiting from your family connections.

In deference to Harry’s tender sensibilities, I haven’t included a photo of him with this post, and because I’m writing this in America, where we don’t have titles–except for nicknames, like the Sultan of Swat or the Fresh Prince of Bel-Air–I’ll just call him Harry Windsor. And in further deference to Harry’s apparent wishes, I also promise that I will never write about him again.

Capturing COVID On The Cusp

Over the weekend I finished Michael Connelly’s The Law Of Innocence. It’s the latest in his series of books about Mickey Haller, the “Lincoln Lawyer” who represents all manner of criminal defendants and manages his law practice from the back seat of his Lincoln automobile.

It’s a good read. I like Connelly’s spare, reportorial writing style and plotting and could probably be entertained reading a grocery list so long as he wrote it. But what’s really interesting about this book, which is set in late 2019 and early 2020 and was published at the end of 2020, is how Connelly skillfully, and realistically, weaves in references to the looming COVID crisis. The reader, and the book’s characters, catch occasional glimpses of the coming pandemic in the far background of the main story, which involves Haller defending himself against a phony murder rap. Every once in a while there will be a reference to what was happening with sick people in China, or Seattle–and the reader remembers their own initial, occasional awareness of the COVID virus during that pre-lockdown period, and knows in the pit of their stomach what is coming, even if Mickey Haller and the book’s other characters don’t.

I don’t know how many other works of fiction have been published where the COVID pandemic plays a role; I suspect that with The Law Of Innocence Michael Connelly, who produces books regularly to the delight of his grateful fans, has published one of the first ones. I’m confident it won’t be the last. Fiction is shaped by what’s going on in the world, and the COVID pandemic is bound to produce a lot of books. Who knows? Perhaps one day literature professors will be debating which book should be viewed as the great COVID pandemic novel.

Rake

Normally I hate TV shows about lawyers. In the typical American TV show about lawyers, I just can’t get beyond the unreality of the plots and the outlandish depictions of our legal procedures and activities. But I’ll make an exception for shows about British lawyers, or in the case of Rake, Australian lawyers. I figure that any legal setting where barristers wear horsehair wigs and gowns is so far outside my experience that I can’t really object to the reality, or unreality, of any of the storylines or contrived courtroom drama.

And in fairness, Rake ends up not really being a show about law at all. Sure, Cleaver Greene–the “rake” of the title who is deftly played by Richard Roxburgh–has gone to law school and does his share of work in the courtroom, but the show is mostly about his train wreck of a life. We witness his countless bad decisions, his ego-centric interactions with his ex-wife, his ex-mistress, his son who has inherited some of Cleave’s tendencies, his friends, his steadfast paralegal/assistant, and his ever-changing dalliances, and we get to hear his often hilarious observations about life in general, all set against the backdrop of an Australian political and legal system that is amazingly corrupt and inept.

And if it sounds like the show is a slam on Australia, it doesn’t come off that way. Instead, Australia is presented as a kind of charming, friendly, out-of-the-way place where everyone knows everybody else and nobody takes anything too seriously. I’d like to pay a visit to Cleaver Greene’s Australia. It’s a place where a character whose life is going to hell can say, with perfect deadpan delivery, that everything is “tickety-boo” and you know exactly what he means even if you’ve never heard that phrase before. (“Tickety-boo” dates from the days of the British occupation of India and basically means “in good order.”)

As for the arc of the show, it becomes increasingly surreal as the seasons roll on. If you’re looking for realistic courtroom drama, even of the horsehair wig variety, you really should look elsewhere. But if you’re looking for a show that will give you an interesting taste of the Land Down Under, a show that introduces you to Australian language and culture, a show that delivers some laugh out loud moments, and a show that recognizes it’s just a lighthearted frolic, you might enjoy watching Rake. We certainly did.

New Words For New Times

Germany has a checkered history, to put it mildly, but you’ve got to give them credit in one area. As I’ve noted before, Germans have an uncanny knack for inventing useful words that capture very specific feelings or concepts.

So, it shouldn’t come as a surprise that Germany would be leading the way in inventing new words to deal with the COVID-19 world. In fact, the Leibniz Institute for the German Language did an analysis and determined that some 1,200 new words have been created during this pandemic period. One of the professors involved in the process of collecting the words concludes that the word-creation process helps Germans deal with pandemic anxiety, which is captured by one of the new words: Coronaangst.

Some of the 1,200 words are pretty useful, and I’m going to try to incorporate them into my daily vocabulary. For example:

Impfneid — vaccination envy

Hamsterkauf — panic buying and stockpiling food like a hamster (This one is bound to be used in the post-pandemic world, whenever a hurricane or some other hazardous impending event is forecast.)

Coronafrisur — corona hairstyle (Who doesn’t know at least one person who has grown a special coronafrisur? I’ll be using this one the next time I talk to the Red Sox Fan, who has grown a remarkable mane during the shutdown period.)

Alltagsmaske — everyday mask

Abstandsbier — distance beer

Maskentrottel — literally, “mask idiot,” referring to someone who wears a face covering that leaves the nose exposed

When you consider the choice words the Germans have come up with, I’m afraid we Americans are losing the Words Race. About the only new phrase I can think of is “social distancing”–which I think gets absolutely blown out of the water by hamsterkauf.

15 Years Of Goldbricking

According to the BBC, an Italian civil servant is being investigated for collecting his salary, but not working . . . for 15 years. If the suspected facts turn out to be true, the public employee at issue has taken goldbricking–the ability to shirk meaningful work on the job while still getting paid–to entirely new, heretofore unexplored levels.

According to the BBC story, the individual “worked” at a hospital in the Italian town of Catanzaro. He stopped showing up in 2005, and nevertheless received full pay for the next 15 years and was reportedly paid more than 500,000 Euros during that period. His case came to light as part of a police investigation into rampant absenteeism and payroll fraud in the Italian public sector. Six managers at the hospital also are subjects of the investigation.

So, how did this happen, exactly? It’s not entirely clear, but the BBC article indicates that the employee was going to be the subject of a disciplinary charge by his manager when he threatened the manager. She didn’t file the report and then retired, and her successor, and the hospital’s HR staff, never noticed the employee’s absence. In the meantime, he kept getting his paychecks.

This impressive goldbricking feat sounds like an episode from Seinfeld or The Sopranos, or the plot for Office Space II. One thing the BBC story doesn’t disclose is what, exactly, the employee’s job was supposed to be. The reader is left to wonder: what paying position could be deemed necessary to create in the first place, but could be so inconsequential that no one would notice it wasn’t being done?

Ancient Names

Every morning on my walk around Schiller Park I see this bright red Jeep Rubicon. It’s a cool looking car, but I scratch my head at the name.

Anybody who is interested in history knows about the Rubicon. It was a shallow river that marked the boundary between the Roman Republic and the province of Cisalpine Gaul. In 49 B.C. Julius Caesar violated Roman law and custom by leading a legion across the Rubicon, bringing his army into the Roman Republic. That made war inevitable. Caesar reportedly paused on the banks of the Rubicon to consider his fateful decision, then said “the die is cast” as he waded across and his soldiers followed.

Of course Caesar prevailed in the ensuing conflict; he didn’t meet his bloody fate until the ides of March some years later. But ever since Caesar made his choice, “crossing the Rubicon” has been used to describe doing something that reflects an irrevocable commitment to a pivotal and perilous course of action. It’s a very useful phrase.

What does any of this have to do with a red Jeep? That’s not clear to me. Did the Jeep namers just like the sound of the word, or did they have any appreciation for its historical significance?

Tread Lightly, Pranksters!

On this April Fool’s Day, here is some heartfelt advice for those who are scheming about practical jokes: tread lightly today.

Any capable prankster has to consider the setting, the nature of the prank, and the prankee. Any kid old enough to attempt an April Fool’s Day gag during his formative years intuitively understood this. You might try the “put salt in the sugar bowl” trick on your brother, but you were risking an explosion if you pulled it on your Dad as he was taking his first, wake-up sip of morning coffee. And doing anything permanently destructive, like sawing through the legs of a chair so your sister would crash to the ground when she sat down for her cereal, was clearly out of bounds.

This year, any practical jokers need to understand their audience and some reasonable boundaries, too. We’ve been pretty battered by the past year, and we’re more brittle than normal. So slipping somebody one of those dripping cups, or putting an obscene hat on the statue in Schiller Park, or sticking a “kick me” sign on Captain Kirk’s back might be funny, but nobody’s going to get much of a belly laugh out of a COVID-oriented gag. Let’s not mess around with vaccination needles, for example, or cut up vaccination cards. And I’m not sure how those who have been involuntarily housebound for more than a year now would react to a flaming bag on their doorstep, either.

The best April Fool’s Day jokes have a certain silly, timeless quality, anyway–like the 1957 BBC broadcast that convinced some gullible Brits that pasta was harvested from trees in Switzerland. If you’re interested in reading about legendary pranks of the past, take a look here and here. But if you’re going to actually try a prank, please–go easy on us!

Monty Python’s Almost The Truth

Netflix offers an awesome array of content — including documentaries. If, like me, you are a fan of Monty Python, I recommend tuning in to Monty Python’s Almost The Truth, a six-part documentary about the troupe that really bent the comedy arc.

Good documentaries answer your questions. In the case of Monty Python, there are lots of those questions. How did these guys get together in the first place? What caused them to develop such a hilarious, zany, irreverent, subversive view of the world? How did a lone American break into this supremely British group? Who came up with ideas like the fabled Parrot Sketch or the “bring out your dead” scene in Monty Python and the Holy Grail? Why did animation feature so prominently in what they did? Who came up with the great songs, like the ditty about Brave Sir Robin? And how and why did the group spin apart?

This documentary answers those questions. Made in 2009, it featured interviews with the then-surviving Pythons, as well as comments from other people who were involved and well-known fans of the group talking about what it was like to watch their work. (I recommend fast-forwarding through the comments by Russell Brand, who comes across as supremely self-absorbed and irritating.) I particularly enjoyed learning about the early days of the members of the group — including the important role now-forgotten figures like David Frost inadvertently played in the group coming together — as well as the TV and radio shows that influenced them. Later episodes drill down into the Flying Circus years, their battles with BBC censors, their creative process and some of the tensions that drove it, their legendary live performances at the Hollywood Bowl, the making of their films, and ultimately the untimely death of member Graham Chapman.

Influential cultural figures who touched the lives of millions and forever changed the way we think about their idiom — like the Beatles, or Monty Python, or the first cast of Saturday Night Live — deserve this kind of look back after years have passed and their true impact can be assessed with the perspective that only time can bring. Monty Python’s Almost The Truth gives you some of that perspective and a peek behind the curtain. It’s fascinating stuff.

Generational Monikers

I’ve never understood the silly urge to coin names for “generations” — which basically seems to exist because, once you name a “generation,” you can make grossly overbroad generalizations about the people who are members.

It started with the “Baby Boomers,” which in my view shows just how stupid generational naming is. “Baby Boomers” includes anyone born between the end of World War II and 1964. That’s my generation, although my personal experience as someone born in the late ’50s is a lot different from that of someone born in the late ’40s. I wasn’t at risk of serving in Vietnam, for example; I didn’t go to any Beatles concerts; and I didn’t participate in any anti-war protests. Nevertheless, I’m lumped into that “generation” that is supposed to be hopelessly narcissistic and self-absorbed and has now become the source of the “OK, Boomer” putdown that younger generations like to use.

I think the Boomers were the first example of a named “generation.” And because sociologists thought that was a good idea, they gave names to other generations–including the “Silent Generation” that came before the Boomers, with members who had somehow been able to live their lives without a generational name until somebody decided, post-Boom, to give them one. Then came “Generation X,” immediately after the Baby Boomers, followed by “Millennials” (also apparently known as “Generation Y”), then “Generation Z.”

Now CNN is suggesting that the little kids of today–as part of the as-yet-unnamed generation coming after “Generation Z”–should be called “Generation C,” because their outlook on life has been permanently transformed (and scarred) by the COVID pandemic. You can make the same argument about how stupid it is to generalize about an entire generation, some of whose members may well have been traumatized by COVID while others have simply accepted the changes and gone on with their kid lives without much concern. But the core point is how unfair it is to give a generation a name based on a disease. The coronavirus period has been tough, but it shouldn’t define a generation of little kids who will now be expected, going forward, to all be brittle and hyper-safety-conscious.

Can we please stop giving “generations” stupid names and generalizing about their members and their experiences?