When I was a kid, it seemed like every visit to the doctor’s office was an occasion for getting some kind of shot. Mom was a fiend for making sure that her kids had every form of inoculation and immunization known to medical science, and she kept careful track of each one on individualized cards that she took to our appointments.
Smallpox, polio, MMR — all were reason enough for a Webner kid to have to drop drawers and Fruit of the Looms and get stuck in the butt by the needle-wielding family doctor. Often, the shots were accompanied by the kind of brook-no-argument statement that only mothers can plausibly deliver. My favorite bit of motherly injection-rationalizing wisdom came when I got my first tetanus shot: “You don’t want to get bitten by a rabid dog and get lockjaw, do you?” It was phrased as a question, but it clearly wasn’t an honest inquiry that you could answer in the negative. I didn’t know exactly what “lockjaw” was, but it sure sounded bad–and if Mom thought I needed to get the shot to prevent it, that was good enough for me.
Then I reached adulthood, and the frequency of shots abated. I’m sure I received some stabs, but for the most part my 20s, 30s, 40s, and 50s seemed to be largely needle-free. But when the calendar told the doctor I had hit 60, the syringe impalements resumed with a childhood-like frequency. Flu shots, multiple COVID shots, and pneumonia shots have all come my way in recent years, and today my doctor–who uses reason rather than the flat assertions of a decisive mother–strongly suggested that I should get another COVID booster, scheduled me for a shingles shot, and told me that when the autumn appointment rolls around it will be time for another tetanus shot, just in case I encounter a rabid coyote or scrape my hand on a rusty nail and need that protection against the dreaded lockjaw.
Somewhere, I am sure that my mother nodded approvingly.
So, I’m back to assuming the pincushion perspective on medical appointments. The only difference, for which I am supremely grateful, is that I have enough muscle tissue in my upper arm to allow the shots to be administered to a less embarrassing location.
1972 was a banner year for rock albums. It also happened to be the year that I started my sophomore year in high school and, not coincidentally, really began to seriously focus on music. Armed with the generous, slightly above minimum wage proceeds of my bag boy job at Big Bear, I began buying albums rather than 45s and played them on the crappy turntable in my room. The fact that great musicians produced great albums in the year of my musical album awakening was a very happy coincidence.
To be sure, 1972 was an exceptional musical year. Consider, for example, Deep Purple’s Machine Head. I bought it and played it endlessly, enjoying songs like Lazy, Space Truckin’, Highway Star, and of course Smoke On The Water, which is one of the greatest driving songs ever recorded. Then there was Stevie Wonder’s Talking Book, with fantastic songs like You Are The Sunshine Of My Life, I Believe (When I Fall In Love It Will Be Forever), and Superstition, which became a kind of funky anthem for my sophomore year. And David Bowie’s The Rise And Fall of Ziggy Stardust And The Spiders From Mars, one of the greatest concept albums ever recorded and chock full of great music from beginning to end. And Steely Dan’s Can’t Buy A Thrill, which marked the band’s emergence into the dominant creative force that it would be for the rest of the ’70s, and included classic tunes like Do It Again, Dirty Work, Midnite Cruiser, and the epic Reelin’ In The Years. And we mustn’t forget the Rolling Stones’ Exile On Main Street, or Close To The Edge by Yes, or Elton John’s Honky Chateau (which features my favorite Elton John song, Mona Lisas And Mad Hatters), or Rod Stewart’s Never A Dull Moment, or Al Green’s Let’s Stay Together. And finally, arguably the finest album of all of 1972’s offerings: Neil Young’s awesome Harvest, which seamlessly blended folk rock and electric rock and put Young at the forefront of the American music scene, where he would stay for years to come.
There were other great albums released that year, of course, because it was just an extraordinary year for music. I owned all of these records, played all of them, and loved all of them, and I listen to them still. But what really strikes me about these superb albums is two things. First, the variety of musical styles they captured, and how correspondingly broad the listening habits and musical tastes of kids of the ’70s were; in those days, radio stations played all of the songs from these albums, and we listeners weren’t confined to a single genre.
Second, can these albums really be 50? They sure don’t feel like it when you listen to them today.
I’m pretty sure that Henry IV, Part I is the first Shakespeare play I read from cover to cover. Mr. Will, the enthusiastic teacher who presided over our Shakespeare Seminar class at Upper Arlington High School, wisely picked it to be the first play we read in that course. I suspect he knew that the insult humor and abusive banter between Prince Hal and Sir John Falstaff would appeal to the simple minds of teenaged boys—and it did.
For a time the lads of Shakespeare Seminar reveled in calling each other “whoreson knaves” and “vile standing tucks” and “fat kidneyed rascals.” We loved Falstaff and Hal just as patrons of the Globe Theater did in Shakespeare’s time, and as did English audiences for years thereafter–which is why Falstaff is generally regarded as the single most popular character ever to emerge from the Bard of Avon’s prolific pen. Thus introduced to the humor and “bawdy” side of Shakespeare, we high schoolers were willing to put up with romance, and tragedy, and Hamlet’s angst as we went on to read other plays.
Of course, age brings a different perspective. As I read Henry IV, Part I now, I still enjoy the sallies between Hal and Falstaff at Eastcheap taverns (although I realize, given the changed eddies and currents of slang that have occurred in the centuries since, that I will never understand or appreciate the humor as an Elizabethan audience did)–but I see a lot more in Falstaff than I did nearly 50 years ago.
Shakespeare’s construction of the play may be a sly exercise in misdirection. He explicitly raises the contrast between the wastrel Prince Hal and the rebellious Harry Hotspur in the very first scene, as Henry IV laments how his ne’er-do-well, tavern-haunting son measures up against the victorious warrior Hotspur:
But this contrast proves to be a bit of a false lead. To be sure, we see the irresponsible Hal at the outset, but ultimately Hal is not so different from Hotspur. Hal rallies to his father’s side, fights to defeat the rebellion, and ultimately kills Hotspur in the climactic battle that brings the play to a close. No, the real contrast is between Hotspur and Falstaff–and not simply because Hotspur is a hothead and Falstaff is perfectly content in playing the clown. Hotspur is unable to curb his own vanity and sense of honor, and it ends up costing him dearly, both in causing him to rebuff the King and bring on the conflict and in needlessly insulting Owen Glendower and losing a much-needed ally. Hotspur’s pride and self-regard prevent him from looking out for his own best interests. Falstaff, on the other hand, is able to swallow jibes and ridicule in the service of his ultimate goal of wine, women, and survival–which means maintaining his relationship with Prince Hal at all costs.
Falstaff’s character leaves a lot of room for interpretation by a skilled actor. He could be played as a buffoon, to be sure, but there is a certain genius in him, and a conniving nature, with ugliness and deviousness lurking just below the surface. He’s not harmless. For all of his surface jolliness, Falstaff is not above robbing innocent travelers, or trying to cheat an honest hostess out of what he owes–but he does it with a roguish charm and shrewdness. A classic example of Falstaff’s quick wit comes when he learns that Prince Hal and Poins set him up to take the money Falstaff had stolen from travelers and are well aware that he has been lying about facing an ever-growing number of brigands. Falstaff abruptly pivots to a different approach, claiming that he was well aware that it was Hal who pilfered the booty:
Falstaff and Prince Hal then act out a scene where Hal returns to talk to his father the king, with Falstaff initially playing Henry IV before he and the prince switch roles, so that Falstaff plays Hal and Hal the king. After Hal, as the king, describes Falstaff as the devil who has led Hal astray, Falstaff, as Hal, rises to his own defense:
Prince Hal’s chilling response–“I do, I will”–presages the coming pivot in their relationship.
What did Shakespeare think of Falstaff? For all of the Bard’s ability to portray the heights of British pride and patriotism, his treatment of Falstaff shows he well understood the underside of war and the cost of valor. Falstaff recruits a ragged band of soldiers, most of whom don’t survive the final battle. After presenting Hotspur as relentlessly driven by pursuit of “honor,” Shakespeare has Falstaff, in the king’s camp before the battle, give his jaded view of the concept of “honor”:
Is cowardice defensible? Of course, Shakespeare doesn’t say so–but Hotspur dies while Falstaff lives, and indeed goes on to claim that it was he, and not Prince Hal, who finally killed Hotspur, in hopes of gaining a rich reward. But while Falstaff lives on, his special relationship with Hal has not survived. The prince has become a prince and, as the play ends, he looks forward to a further, final battle that will help to quash the rebellion.
Happy Easter to those who follow the Christian faith, and Chag Pesach Sameach to my Jewish friends who are celebrating Passover.
For many of us whose families celebrated Easter, there are happy childhood memories associated with finding Easter baskets and getting a chance to dig into a treasure trove of candy, at just about the time that the Halloween and Christmas sugar rush had fully worn off. In our house, the Easter basket routine involved the thrill of the hunt for your basket and then the enjoyment of the candy. But of course, not all candies are created equal. The other day the B.A. Jersey Girl and I discussed Easter candy and our personal favorites as we returned from lunch–which caused me to compile this ranking, in inverse order, of the candy I would find in my Easter basket.
11. Circus peanut chicks and bunnies — One year the Easter bunny put chick- and bunny-shaped candies in our baskets that were made of the same mysterious substance as circus peanuts, and just like circus peanuts, they were disgusting–stiff, chewy, with that weird circus peanut shell and gummy, slightly stale-tasting interior. This revolting development simply demonstrated that the Easter bunny was fallible. Fortunately, the Easter bunny noticed our collective negative reaction to this ill-fated experiment, and the circus peanut candies were never again to find their way into our baskets.
10. Large jelly bean eggs — As this list will demonstrate, I was not a fan of jelly beans in the Easter basket, but the worst jelly bean-related candy was large jelly bean eggs. These had a kind of thick, coarse, granular shell of sugar and then a gluey, stick-to-your-teeth interior. I would try one of these to see if they had improved from the year before–which never happened, incidentally–and then would try to work out a trade of the remaining large jelly bean eggs with one of my younger, credulous sisters.
9. Regular jelly beans — I ranked regular jelly beans ahead of the large jelly bean eggs because at least they were smaller. In our baskets, the jelly beans would get snarled in the fake plastic grass, and it took time to find all of them and put them into the trading pile. The jelly beans were a throw-in, designed to entice my younger sisters with visions of quantity over quality. Some years they actually fell for it.
8. Plastic eggs with jelly beans — Our baskets usually featured a few brightly colored plastic eggs. You suspected they were filled with jelly beans, but you were never quite sure, and could hold out hope for some other form of candy until you had wrestled the eggs open and sent the jelly beans inside flying everywhere. Then you knew, of course, but I rate the plastic eggs with jelly beans higher than other jelly bean offerings because of that faint glimmer of hope that existed before the eggs were opened.
7. Fancy decorated chocolate eggs — On some Easters, our baskets would include a fancy hollow chocolate egg that was decorated with little flowers and ribbons. The flowers and ribbons were made of the same impenetrable, tooth-breaking candy that you could buy at the grocery store in number form to put on birthday cakes. The problem with these eggs is that they were impossible to eat without creating a mess. If you bit into the egg, all structural integrity was lost and the egg broke into pieces, and then you’d have to pick up and eat the pieces, with the hard candy attached, and end up smeared with chocolate, with a mouthful of chocolate and that unchewable hard candy. These often were trade fodder, too, in hopes that my younger sisters would be tempted by the gay decorations without thinking through the inevitable ramifications.
6. Foil-wrapped chocolate eggs — Finally, we’re starting to get to the good stuff. These little chocolate eggs provided a nice little wad of chocolate and a pleasant sugar rush, but the foil wrapping was the big problem. Foil wrapping simply is not designed for chubby fingers eager to get to the chocolate inside. Every year, you would bite into one of the little eggs only to realize that a shard of foil remained on the surface, and when the foil made contact with your teeth an extreme jolt of pain shot through your mouth. The foil-wrapped eggs were an effective way of forcing frantic kids to take their time and pay careful attention to detail, lest they suffer the excruciating consequences.
5. Chocolate bunny — No Easter basket would be complete without a chocolate bunny. Some years, our bunnies would be solid, and some years they were hollow. I preferred the hollow version, because it was easier to take off the ears with one large chomp, but either form was eagerly consumed. I didn’t even mind the small hard candy eye.
4. Peeps — Our baskets always included the bright yellow chick Peeps, and occasionally would have pink rabbit Peeps. Usually, we would get one Peep. Peeps were great because you only got them at Easter. Unlike chocolate candies, you didn’t eat Peeps at the movie theater or at Halloween or Christmas, so when you found them in your Easter basket you’d kind of forgotten about them and how they tasted. And then when you bit through the stiff outer shell into the softness beneath, you remembered. Few things taste as good as a bright yellow Peep on a clear spring morning.
3. Chocolate covered cream or peanut butter egg — These came in an easy-to-open wrapper, like a regular candy bar, and had a flat appearance with a ridged chocolate covering. The cream version had a runny, sugary interior that looked like an egg yolk, and the peanut butter version had a stiffer, more granular peanut butter than was found in the household Skippy jar. It was a good Easter indeed if you could trade dozens of jelly beans and the jelly bean eggs with one of your sisters in exchange for one of these delicious treats.
2. Chocolate marshmallow egg — We’re now getting to the point of true favorites, where it’s almost impossible to rank one above another–but difficult decisions must be made. The chocolate marshmallow eggs were like the cream or peanut butter eggs, but what nudged them into second place on the list is the quality of the marshmallow–which wasn’t like the marshmallow cylinder you’d put on a stick to roast in a campfire. No, this marshmallow was creamier, and sweeter, and delectable. When you got one of these chocolate marshmallow eggs, you knew intuitively you were enjoying some very high-end stuff.
And, number 1 is:
Speckled robin-sized malted milk eggs — These were my all-time favorite. The brittle outer shell, the thin coating of chocolate just underneath, and the crunchy malted milk interior that would melt in your mouth if you could resist chewing it up–this candy was the stuff of which childhood dreams were made. Back in the day, I probably could have eaten my weight in these little egg-shaped goodies. Much as I liked the marshmallow eggs, it is impossible not to put the malted milk eggs at the top of the Easter candy list.
I haven’t had any of these candies for decades, and it wouldn’t be good for my waistline to have any of them now, but it is fun to think about them and remember the simple pleasures of an Easter basket.
The sad tale of the Eastland Mall is another sign of the end of suburban American mall culture. Indoor malls were a phenomenon that swept the country in the ’60s and ’70s, putting many downtown stores out of business and shifting retail activity to the ‘burbs. Featuring “anchor stores,” countless smaller stores, food courts, and acres of parking spaces, indoor malls were generic places where people could shop, retirees could walk to the accompaniment of mall music, and kids who became known as “mall rats” could hang out with their friends.
No one who grew up in the ’60s and ’70s would have dreamed that their clean, antiseptic mall could turn into a crumbling eyesore, but the handwriting has been on the wall for years now. In Columbus, the travails of the downtown Columbus City Center mall were the canary in the coal mine that showed the indoor mall era was ending. City Center opened with great fanfare in 1989, struggled, and closed two decades later; it was then torn down and became the Columbus Commons greenspace and the location for mixed use developments. Other Columbus malls, like the once-thriving Northland Mall, also have been torn down, and the retail trends have shifted to open air shopping venues, like the colossal Easton Town Center development.
The American economy is vibrant, but ever-changing. The rise and fall of the indoor mall culture is a good sign of that reality.
For years, I’ve been classified as a part of the “Baby Boom” generation. In fact, the year of my birth has been described as the height of the Baby Boom, because it was the year of the greatest number of births in the United States.
But now people are starting to argue that those of us born between 1955 and 1964 shouldn’t be viewed as Boomers at all. Instead, we should be categorized as part of “Generation Jones.” The argument is that we just didn’t have shared experiences with the true Baby Boom generation, which was born between 1946 and 1955. We didn’t watch Howdy Doody or I Love Lucy when they were first broadcast. (Countless reruns apparently don’t count.) But it wasn’t just TV that was different. Music was different. We were too young to be true hippies during the ’60s, or to be at serious risk of fighting in Vietnam. So really, we don’t belong with the much-maligned Boomers, but should be off on our own. (“Generation Jones,” a pretty lame name, refers to our “generation’s” alleged “keeping up with the Joneses” yearning.)
To me, this seems like a dumb thing to argue about, but then I think trying to divide people into arbitrarily defined “generations” is stupid, too. People born in different years and in different places, even if they are born in the same 10-year span, are bound to have as many distinct experiences as they do common ones. Sure, the same TV shows were being broadcast on the same three channels, and the music played on pop radio was the same for everyone, but if you had a sibling who was a lot older than you, you probably had no choice but to watch different TV shows and listen to different radio stations than someone who lived in a house where they controlled the dial. If you had older siblings who were fighting in Vietnam, your experiences and childhood memories were different. The closest common cultural touchstones were probably shared by people who were in high school at the same time, but even then the experiences of kids in southern California, the Midwest, and Brooklyn were bound to be a lot different. So why try to shoehorn us into one “generation” and act like we all have the same approach to the world and the same perspective on life? It’s pointless and phony.
I don’t care whether I’m officially a Boomer, or not, but don’t now try to slide me over into “Generation Jones.” At this point, I guess I’d rather just be myself.
When I was a kid, Milk Duds were my favorite movie theater candy, without a doubt. I would buy a box and then, as the movie played, put those little chocolate-covered caramel nuggets on my tongue one by one and let them dissolve slowly until nothing remained. With proper discipline and the intestinal fortitude to resist chewing, you could make a box of Milk Duds last for the whole film, in contrast to people who bought a candy bar that was long since gone by the time the credits rolled.
Weinstein initially claimed that he had brought the Milk Duds with him when he came from New York to California in July, which would mean he made a single box of Milk Duds last from July to November–which is a heck of a lot longer than the length of one movie. Jail officials rejected that claim because Weinstein was thoroughly searched at that time and found to be Dud-free. It also seems to be directly contrary to Weinstein’s reported history of egregious self-indulgence and doing whatever he wanted to whomever he wanted.
I imagine the manufacturer of Milk Duds isn’t exactly thrilled that this classic movie candy is now associated with Harvey Weinstein. I know I’ll never look at a box of Milk Duds in the same way again.
I owe a debt of gratitude to P.J. O’Rourke and Doug Kenney, his cohort at the National Lampoon, because reading that magazine helped to shape my sense of humor and world view, too. And if there was one single publication that was more influential than any other in that regard, it was the Lampoon‘s legendary high school yearbook parody, the cover of which appears above. Supposedly the 1964 yearbook for C. Estes Kefauver High School in mythical Dacron, Ohio–and specifically, the copy owned by student Larry Kroger, with handwritten notes by Larry and his high school chums–the parody was a hysterical, pitch-perfect blast directed at everything pretentious and silly and weird about the super-heated, fishbowl world of high school life in small-town Ohio. Every page of the faux yearbook, from the student organization and sports team pages to the student photo pages to the principal’s message to the photos of faculty and staff, was laugh out loud funny and had the ring of truth that makes for the best satire. It was, in short, the work of a comedic genius.
The National Lampoon high school yearbook parody was published in 1974, when I was in the middle of my high school years. I devoured and loved it then and loved it again years later, when I bought an anniversary edition. After reading the yearbook for the Kefauver Kangaroos, I would never look at my own little high school world–or the world at large, for that matter–in quite the same, super-serious way again. Throw the National Lampoon yearbook parody, the Three Stooges shorts, Bugs Bunny cartoons, MAD magazine from the late ’60s and early ’70s, Jim Bouton’s Ball Four and any book or article by Hunter S. Thompson, and Blazing Saddles and early Saturday Night Live broadcasts and Richard Pryor and Cheech and Chong records into a blender, mix well, and you’d produce something like my adult (well, supposedly “adult”) sense of humor.
Thanks to P.J. O’Rourke and Doug Kenney for that. I didn’t really follow O’Rourke in his later years, but I really didn’t need to: he long ago had his impact.
The ’60s was when people first became concerned about television. Social scientists and commentators railed against the “idiot box” that was turning our brains to mush and converting formerly active, intelligent, inquisitive people into soft, slack-jawed shmoos soaking up whatever mind-numbing offering might appear on their TV set.
Those of us who lived through the ’60s somehow survived our constant exposure to the TV set that had a prominent place in our living rooms. But I’ve got news for you, folks: when it comes to TV, the ’60s was nothing compared to where we are right now. As The Hollywood Reporter noted yesterday, the number of English-language scripted TV shows that are available for viewing in the United States hit an all-time high last year. Across broadcast, cable, and streaming services, 559 English-language shows were available in 2021. That’s 13 percent more than in 2020 and 5 percent higher than the previous record in 2019. And consider this astonishing statistic reported in the THR article: “The total number of scripted shows has more than doubled in the last decade; in 2011 there were 266 scripted series.” What’s more, that 2021 record number doesn’t include any of the non-English-scripted shows that people are watching, like Squid Game or Money Heist.
In short, Americans are saturated with TV these days. Unlike the ’60s, when there were only three broadcast channels and one or two snowy UHF options, all of which terminated their broadcasts at some point in the early morning hours, you could now watch programming 24 hours a day, every day–and not even scratch the surface of what is available for viewing. And in the COVID era, it’s become increasingly easy to ditch the masks, slouch back on your couch, and immerse yourself in TV, rather than going out to do anything. I’m sure that part of what is driving the TV production boom is the fact that so many worried people are choosing to stay home rather than venture outside into the scary potential omicron infection zone. Rather than take that risk, why not just camp out and watch the latest hot streaming series?
As I mentioned, those of us who lived through the ’60s somehow defied the confident predictions that we would become a bunch of brain-dead zombies–at least, I think we did–and hopefully that will prove true, again, in the aftermath of the current TV-soaked period. But it is concerning that TV shows have become such a huge part of our lives, to the point where our voracious appetite for programming is driving the TV production industry to new heights. We’d all be better off if we decided to get off the couch now and then, turn off the TV or computer, and get outside to interact with other living human beings.
How much do sound effects add to movies? Consider the Three Stooges shorts. Those of us who had our sense of humor shaped (our mothers might say “warped”) by the antics of Larry, Moe, Curly, and Shemp understand the deft comedic impact of an apt sound effect. Whether it’s a horn beep sounding when a nose gets bonked, the coconut sound of two heads colliding thanks to Moe, ripped fabric when Larry’s hair gets pulled out, or one of many other sound effects used in the shorts (many of which are found in the video clips above), the sound effects unquestionably add to the hilarity.
My favorite Stooges sound effect is the violin string pluck used when eyes get gouged by Moe, which you can hear in the clip below. Why do plucked violin strings work as a sound effect for an eye gouge? I don’t know–they just do.
I think holiday baking is a lot of fun. You have to follow the recipes, and pay attention to time in the oven to make sure your cookies don’t get burned, but even a failure means you can just start over without terrible consequences. In the meantime, it’s a great time to listen to your favorite holiday music. And baking requires enough attention that it inevitably takes your mind off of your “work work,” and you get to do fun stuff like rolling out cookie dough and cutting it into shapes and then decorating what comes out of the oven.
In a lot of ways, baking Christmas cookies is kind of like an updated kindergarten class for adults. To be sure, you’re working with cookie dough, not Playdoh, but you’re still cutting stuff out, using rudimentary tools, and adding color to things. The main difference is that, at some point in the process, you don’t have a teacher instructing you to roll out your towel onto the floor and take a nap with the rest of the class–although that’s not a bad idea, come to think of it.
But for me the best thing about holiday baking is the aftermath, after you’ve cleaned up the kitchen and boxed your cookies and sent them off. It’s when you start to hear from your family and friends who received the cookies, telling you how much they enjoyed the cookies or–even better–asking for the recipes of their favorites. Knowing that you helped to make someone’s holiday season a bit more tasty and festive and merry is a baker’s best reward.
One of the tough things about getting older is seeing your childhood heroes fall by the wayside. For example, it was hard to read that Michael Nesmith, one of the Monkees, died yesterday at age 78. Michael Nesmith was the “smart Monkee” who always wore a stocking cap with the ball on top; he was the favorite of the cerebral kids. Davy Jones and Peter Tork have already gone to the great beyond, so Nesmith’s death means that Micky Dolenz, who was my favorite Monkee, is the only surviving member of the group. That just doesn’t seem possible. After all, the Monkees’ theme song said they were the young generation, and they had something to say. So how can they be dying of old age causes like heart failure?
The Monkees were an interesting phenomenon, and in some ways a precursor for a lot of what has happened in popular culture. They were the original “fake group”–put together to be on a Beatles-knockoff TV show and also serve as the faux front band for music produced by studio musicians. As a kid, I didn’t understand how weird and groundbreaking this was: the Monkees had a TV show that I thought was funny, they drove around in a cool car, and I liked their records. (We faithfully bought all of them.) And the first record said on the back that each of the Monkees played specific instruments and sang, and you could hear their voices on the records. That had to be true, right?
Later I realized that the Monkees were in fact different from groups like the Beatles, because the Beatles actually wrote their own songs and played their own instruments and were accomplished musicians. But the realization that the Monkees were faking it didn’t change my appreciation of the Monkees’ records. We played their songs when I was in college, and I still listen to them. In fact, in recognition of Michael Nesmith’s passing, we listened to some of the Monkees’ songs last night at a gathering with friends and enjoyed them.
The difference between the Monkees and the other fakes that followed was that the creators of the Monkees didn’t scrimp; they got real songwriters (like Neil Diamond, who wrote the classic Monkees’ hit I’m A Believer) and real musicians to play the instruments, and also experimented with some cutting edge sounds that fit right in with where popular music was going at the time. My all-time favorite Monkees tune, Tomorrow’s Gonna Be Another Day, is a good example of how bringing all of that together created something really good.
After the Monkees heyday ended, Michael Nesmith went on to have an interesting career and helped to usher in the era of MTV and music videos, but of course he was always identified with the Monkees, as his New York Times obituary linked above reflects. He seemed to be at peace with his role in the popular culture of the ’60s. Those of us who enjoyed the Monkees TV show and still love the music wish him well.
When I was a kid, our standard Christmas decorations included Santa cups for every member of the family. Each of the kids had his or her own mini-cup, suited to small child hands and carefully labeled in festive red ink with our names, and Mom and Dad had cups that were larger, about the size of a coffee cup. The Santa cups went out in a line on the dining room credenza and then were put in front of our place settings at holiday meals. Mom loved to put out M&Ms for birthdays and holiday occasions, and I think she may have filled the cups with those little chocolate candies.
Amazingly, the cups survived years of excited Webner family Christmas celebrations without being broken, although my Santa cup has its paint rubbed off here and there. When Mom moved out of the family house years ago, she distributed the labeled cups to each of the kids, and now mine is one of the Christmas decorations we put out in our house.
Of course, in those long-ago days I was called Bobby by everyone in our extended family. That was fine with me until I got to be 11 or 12, when I concluded that “Bobby” sounded childish and I asked everyone to start calling me “Bob” instead, which sounded a lot more grown up and adult. For some reason, it seemed very important to make that change at the time. Since then, I’ve gone by Bob, so there was a clear line of demarcation between the Bob and Bobby eras.
Now, looking at the Santa cup always makes me smile and reminds me of the long-lost Bobby days, when things were simpler and more innocent, and the appearance of a set of Santa cups on the dining room credenza was part of the build-up for the excitement and fun of a Christmas to come.
Russell and Betty are back up in Stonington. Winter comes early up there.
Stonington is located on the far eastern edge of the Eastern Time Zone, so the sun sets much earlier there than it does in Columbus, which is on the western edge of the same time zone. Once Daylight Saving Time ends, darkness falls over Stonington during the afternoon hours. For example, my weather app says the sun will set over Stonington at 3:56 p.m. today, whereas the sunset in Columbus won’t come until more than an hour later, at 5:06 p.m. During the winter months the sun’s daily path through the sky also nudges closer to the horizon, which makes for longer shadows and less direct overhead sunlight.
That means conditions are just about perfect for the natural ice art shown in the picture above, which Russell took after he arrived. That’s a photograph of ice covering some of the rocks in our down yard. Accumulated rainwater froze over, and then the water under the ice layer evaporated while the ice remained, unmelted by direct sunlight. The result looks like the kind of etched, frosted pane of glass you might see in a Victorian-era doorway.
If I recall my childhood winters accurately, that ice is just waiting for a bundled-up kid in a stocking cap to stomp through it and shatter it with a satisfying crunch. I kind of wish I was there to do it.
When I was a kid growing up in Akron, I really liked going to Bob’s Big Boy. Grandma and Grandpa Neal used to take UJ and me there for lunch. I was a roly-poly kid who enjoyed a good cheeseburger, so I identified with the statue of the jolly, tubby boy in checkered overalls with lacquered hair who obviously was overjoyed to be holding a cheeseburger.
The statue was great, and so were the cheeseburgers and french fries and Cokes, but what I liked most about the Big Boy was that it had one of those tabletop song selectors at every table. They were just about the coolest, most futuristic thing ever. The song selectors were highly polished, gleaming metal, like all futuristic objects such as rocket ships were supposed to be. You could use a dial to flip the pages of available songs back and forth–which was fun in and of itself–to find a song you liked, read the selection code, and punch in the number right at your table. Your song then played on the big jukebox in the corner, which meant everyone in the whole place was hearing your song. My favorite choice was Nat King Cole’s rendition of “Those Lazy-Hazy-Crazy Days of Summer.”
In those days, the tabletop song selector seemed like extremely impressive technology, as mysterious in its inner workings as TV sets with their rabbit ear antennas and transistor radios that somehow pulled images and music out of the very air around us. But that was all to be expected, because we were on a relentless march into the future, and the future was going to be a wondrous place, just like the New York World’s Fair and the Disneyland World of Tomorrow ride promised.
Now, almost 60 years later, the future that has come to be is a pretty wondrous place in some respects, when you reflect on it. I’m listening to music that I’ve selected using an app on a phone that also serves as a camera, calendar, newspaper, library, mailbox, and message sender, among countless other functions, and fits easily in my pocket, to be carried anywhere and everywhere. I’m typing this entry into a laptop computer that will transmit my musings into the ether, where they will be published for anyone in the whole world to see. I’m pretty sure the little kid who marveled at the song selector at Bob’s Big Boy would marvel at those devices, too–but of course we tend to grow out of our sense of wonder, and eventually take these things for granted. That doesn’t make them any less amazing.
Thinking about this, I’m glad my laptop has a gleaming metal finish, because my youthful self would have expected that of such a futuristic item. And the next time I buy a cell phone, I might check to see whether it comes in an aluminum case, too.