Don’t Let Them Eat Cake

In Great Britain, the chairwoman of the Food Standards Agency, Professor Susan Jebb of the University of Oxford, is mightily concerned about the nation’s health and the obesity epidemic affecting many Brits. Among the targets of her ire are people who bring cake into the office–something she considers to be as harmful as exposing your co-workers to secondhand smoke.

Professor Jebb’s basic point is that you simply can’t rely on the personal willpower of people who are exposed to the tantalizing prospect of free cake. The Times article linked above quotes her as follows: “’We all like to think we’re rational, intelligent, educated people who make informed choices the whole time and we undervalue the impact of the environment,’ she said. ‘If nobody brought in cakes into the office, I would not eat cakes in the day, but because people do bring cakes in, I eat them. Now, OK, I have made a choice, but people were making a choice to go into a smoky pub.’” She raised the smoking issue because passive smoking harms others, and “exactly the same is true of food.” The upshot, in her view, is that Great Britain needs to provide a “supportive environment” to help individuals avoid bad choices that lead to weight gain.

Although Professor Jebb specifically singled out cake at the office as an example of the prevalence of bad food options at every turn, the bottom line for her is that Great Britain needs to regulate food advertising. She notes: “At the moment we allow advertising for commercial gain with no health controls on it whatsoever and we’ve ended up with a complete market failure because what you get advertised is chocolate and not cauliflower.”

If Professor Jebb is hoping to get to a society where cauliflower is vigorously advertised, I predict her efforts are doomed to failure. I also predict that her fellow Brits won’t look kindly on any potential restrictions on a co-worker’s ability to bring cake into the office.

Putting aside time-honored employee birthday cake events, people who bring leftover cake to the office want to get it out of their homes so they won’t be tempted by it, and people who eat cake at the office like to have a treat now and then. I’m not sure that trying to regulate cake offerings is going to prevent obesity, if that cake is then consumed at home rather than at the office. I don’t think regulating TV or billboard or radio advertising is going to get there, either, so long as cake mix is sold in stores and candy and snacks are available at the point of purchase to tempt people into taking the road to perdition.

The bottom line on obesity is that we need to build up the willpower of individuals, and incentivize them to watch their weight. Restricting cake at the office isn’t really getting at the root cause.

Five Steps To Glory

My cellphone spies on me. The phone and its ever-increasing array of apps, evidently added whenever I engage in one of the required software updates, seem to be constantly monitoring my activities, conducting some kind of unknowable, algorithmic analysis, and then sending me unwanted messages to announce their conclusions. As a result, I get weird, random notices like “you’re using your phone less this week than last week.” Since I don’t personally log the time I spend on my phone, I have no way of knowing whether these reports are accurate or not. I guess I just have to take my phone’s word for it.

This week I got a new message, one that I think came from an “exercise” app that was added in a recent software/operating system update. The message said something like: “Hey, you’re using the stairs more than you usually do!” My initial reaction was that it’s creepy that my phone is tracking my stair usage and trying to function as a kind of clapping, enthusiastic personal trainer, urging me to get off my keister and continue to increase my daily count of steps. But then I wondered how in the world my stair count had increased, since I had not been making a conscious effort toward that goal.

After some careful consideration, I realized that the phone’s stairstep analysis had to relate to a domino-like series of events at work. The first domino was that the coffee maker on my floor stopped functioning. That meant I had to walk over to the nearest working coffee maker, which happened to be one building over–a journey that requires me to go up and down the five stairs shown above. Add in the fact that I guzzle a ridiculous number of cups of coffee each work day, so that I have been constantly ascending and descending these five steps, and you evidently end up with enough stair usage for my phone to take notice and send along some encouragement.

My initial reaction to this realization was to be surprised that even a few trips up and down five steps would make a difference to my phone. Then I thought that maybe, to keep my phone pal happy, I should continue to use the coffee maker in the next building, even after my coffee maker is fixed. And I also started to think that maybe there were other things I could do to add a few additional stair-climbing episodes to my workday, so that my phone and its apps will be even more thrilled at my efforts.

Why should I care whether my phone thinks I’m a lazy lard-ass? I don’t know, but I do. Having a Type A, get-a-good-report-card mindset in the cellphone age has its challenges.

Brain-Picking

Yesterday was a red-letter day of sorts. On two separate occasions, people stopped by to ask, specifically, if they could “pick my brain” about something. It was interesting that two different people used that same precise phrase, rather than saying, for example, that they wanted to ask a question or discuss an issue. It got me to thinking about the phrase–which doesn’t exactly conjure up pleasant mental images–and what we know about its origin.

On-line sources discussing the phrase focus on the “pick” part and its relevance to eating. In days when food was scarce and a roast item was a special treat, “picking” was an important part of the meal. The diners picked at the bones of the roast fowl or the joint of mutton to try to retrieve and consume every last morsel of meat–the way your Dad probably did with the Thanksgiving turkey. This use of “pick” probably was derived from the pickaxe, a tool used since prehistoric times. “Pick” then became a way of conveying a precise form of extraction, and other uses–like nitpicking, pickpocketing, and toothpick–followed. It was probably inevitable that someone would use “pick your brain” to describe the process of getting advice or information. The on-line sources cite a letter from the 1800s that featured the phrase–although I suspect that it was commonly used before then.

Interestingly, a search for the phrase also yields an article saying that some professional people don’t like that idiom, because it isn’t precise enough and describes a one-sided transaction, with one brain being excavated for information and the other brain enriched as a result. That reaction seems a little thin-skinned to me. Sometimes specific questions can miss lurking issues that will be uncovered by a general question, and in any case it’s nice to think that people believe your brain is a resource that can be mined for some useful information. Besides, any conversation, even a one-sided one, yields new intelligence to be incorporated into the memory banks, ready to be displayed the next time the brain is picked.

Fake Smiles And True Feelings

People have thought about fake smiles for a long time–probably for about as long as human beings have walked upright and the act of smiling became associated with happiness. They are curious about how to distinguish a fake smile from a real one, and why people fake smiles in the first place. Researchers have even examined whether working in a job where you are supposed to give a cheery smile to even unpleasant customers for your entire shift is likely to make you drink more at the end of the work day. (Spoiler: it looks like it does.)

But what about fake smiles outside the workplace, where you don’t have to give that grimace grin for eight hours while interacting with jerky customers? Does forcing a smile make you feel happier? This question has been the subject of scientific debate for so long that even Charles Darwin weighed in on the topic. In The Expression of the Emotions in Man and Animals, Darwin argued that “even the simulation of an emotion tends to arouse it in our minds”–but different studies over the years have produced different results.

Recently researchers decided to test the hypothesis, again, with a study of 3,800 people from 19 countries who were asked to respond to different prompts with a smile or a neutral expression, and then rate their happiness. The prompts were disguised, and mixed in with other facial expression requirements and even math problems, so participants presumably didn’t know that they were involved in testing whether a fake smile actually produced a happier perspective. The results suggest that faking a smile does, in fact, tend to make the fake smiler feel incrementally happier, at least in the short term.

So old Chuck Darwin apparently is right again, and forcing a grin will cause momentary changes in attitude–at least so long as keeping that fake smile on your face isn’t one of the requirements for your job at the neighborhood coffee shop.

Virtual Tact

The Harvard Business Review recently carried an article on how to tactfully interject in a virtual meeting. “Tact” is a quality that you don’t often see associated with computer-based communications. On social media, for example, the full-frontal attack often seems to be the preferred method of making a point, and one of the problems with email is that it’s far too easy to fire off a blast that you regret almost as soon as you hit the send button.

Virtual meetings, though, are a setting where the desire to avoid offending colleagues or coming across as a rude jerk makes proceeding with tact an important consideration. At the same time, however, the virtual format can make it difficult to politely interject and make your point (particularly if you forget you are on mute). In-person meetings always seem to present an opportunity to have your say before the meeting breaks up and people leave the conference room, but the virtual context can be a barrier to participation. Sometimes, acting with tact seems to be at war with the need to contribute to the discussion, even if it means interrupting the flow.

So, what to do? Obviously, the first step is to self-edit a bit, and consider whether your point is really all that important. But if you conclude that it is, the HBR article suggests “signaling your interest” by using the “raise hand” feature, unmuting, using the chat feature to indicate you’d like to say something, or “gently rais[ing] your physical hand if you’re on video.” (The “gentle” means you shouldn’t make a ridiculous spectacle out of raising your hand, like Horshack on Welcome Back, Kotter.) Other tactful techniques include reviewing the agenda in advance and letting the presenter know that you’d like to address some of the topics, or waiting until a natural break in the presentation to interject. The article even suggests some tactful phrases you can use as you are breaking in.

The last point in the article, however, is “be assertive when necessary.” Sometimes, visual signals don’t work–this is especially true when a PowerPoint is being presented, and the visual of you has been shrunk down to postage stamp size–and there simply might not be an obvious break where you can step in with your trenchant point. Tact is a valued quality, but you don’t want to have the meeting end without making your contribution, which could affect the next steps to be taken. Sometimes, tact and doing your job just don’t mix.

Understanding Mr. Green Jeans

When I was a kid, I enjoyed watching Captain Kangaroo. I liked the Captain, of course, and Dancing Bear and Mr. Moose and Bunny Rabbit, but my real favorite was Mr. Green Jeans. He would come on the show, wearing his trademark green jeans and usually a straw hat and flannel shirt, perhaps play a guitar or sing a song with the Captain, and maybe show you a plant or animal and talk about it. But Mr. Green Jeans was at his best in helping Mr. Moose and Bunny Rabbit play a gentle prank on the Captain–one that usually involved the Captain getting showered with dropped ping pong balls. It was a gentle prank for a gentle show.

I was thinking about Mr. Green Jeans the other day in connection with how the concept of people having jobs gradually dawns on a kid. As adults, we’ve lived with the concept of work for so long that we’ve forgotten that the notion of people getting paid to do something isn’t necessarily intuitive, and has to be learned like other lessons of the world. For me, at least, Mr. Green Jeans and Captain Kangaroo were part of that process.

At first, a very young watcher would take a show like Captain Kangaroo at face value, as if the broadcast somehow gave you a brief peek into the actual life of the Captain, Mr. Green Jeans, and their friends. At some later point, you came to understand, perhaps because your Mom patiently explained it to you, that the show wasn’t “real,” in the same way life in your home was real, that Mr. Moose and Bunny Rabbit were just puppets, and that Captain Kangaroo was a show put on for kids like you to watch and enjoy.

Later still came the realization that Captain Kangaroo and Mr. Green Jeans were actors, that being on the show was their job–hey, just like your Dad left every day to go to his job!–and that the Captain and Mr. Green Jeans were getting paid to be on the show. That last step in the understanding process was a big one, because it required you to get the concept of money, too, and why people needed to work, so they could eat and have a house and clothes and a car–and the fact that you would undoubtedly need to work, too, at some point. It was part of a bigger realization that the world was a complicated place, and there was a lot more to it than the Captain reading stories and pranks involving ping pong balls.

By then, as you watched Captain Kangaroo with your younger siblings, you thought that being Mr. Green Jeans would be fun. But your sights had changed a bit, and your friends were talking about being firemen or astronauts when they grew up.

“Quiet Quitting” And Labor Day

Happy Labor Day! On this day set aside to celebrate working people–and give them a day off, too–it’s worth spending a few minutes thinking about work and jobs and a supposedly recent development in the labor sector: “quiet quitting.”

“Quiet quitting” has been the subject of a lot of discussion recently, in articles like this one. It’s a seemingly elastic concept that can mean different things to different people. For some, the notion is all about setting boundaries; you will work hard during the normal workday but not take on additional responsibilities that would intrude into your private life and produce burnout. For others, it means doing the least amount of work needed to avoid getting fired by an employer who recognizes that, in the current labor market, it may not be able to find someone better to fill the position. “Quiet quitting” evidently got that name on TikTok, where “quiet quitters” have been posting videos about their decisions.

Of course, “quiet quitting” might have a modern brand, but the underlying idea is nothing new. Anyone who has worked for any length of time has had “quiet quitters” as co-workers. I remember some from my first job, as a “bag boy” at the Big Bear grocery store in Kingsdale Shopping Center circa 1973. They were the guys you didn’t want to get matched up with on a project, like retrieving abandoned carts from the parking lot so the in-store supply was fully stocked. You knew they would retrieve a few carts at a deliberate pace, but you would do most of the work so the two of you wouldn’t get reprimanded by the boss. I quickly decided that I didn’t want to be a “bare minimum” guy, always at risk of getting canned, but since then I’ve also been fortunate to have jobs in my working career that I found interesting and well worth the investment of some extra, “off the clock” time.

Is “quiet quitting” a bad thing? I don’t think it is, but in any event it is a reality. The labor market, like the rest of the economy, is subject to the law of supply and demand. “Quiet quitting” is a product of the invisible hand at work; it reflects the fact that the demand for workers right now exceeds the supply. There is nothing wrong with sending a message to an employer that employees won’t put up with having new responsibilities piled on their plate without fair compensation–that’s one of the signals that allows the invisible hand to work.

But “quiet quitting” also has a potential cost, and a potential risk. The cost might be the impact on your self-perception and your reputation among your co-workers, as well as the chance that you develop the habit of settling rather than going out and finding a new job better suited to your interests. The risk is that the balance of supply and demand in the labor market shifts–giving the employer the option of upgrading the workforce, and leaving the “quiet quitters” without a job and, perhaps, without a recommendation as they look for a new one.

Another Empty Spot On The Desk

Our IT staff came and took away my old office land-line phone recently, as I have now fully transitioned to communication through my computer. It leaves the empty spot on my desk shown above. That gleaming empty spot now joins other empty spots that have been created over the years, as once-essential workplace items have been pitched into the dustbin, their functionality entirely absorbed into the mighty, all-purpose desktop computer.

Once my desk held a dictaphone, a telephone, a speakerphone attachment, a hole punch gizmo, and a stapler. All are now gone. The flip-top calendar that I have had for years won’t be far behind; I’ve stopped using it in favor of relying entirely on my computer’s calendar. And the other essential purpose of a desk–to hold the piles of papers that I’m working on–also is falling by the wayside. I’m old school and still print out some documents to review in hard copy form, but the amount of paper in my office is a small fraction of what it once was, with most of the reviewing and editing work being done entirely on the computer. In short, there are a lot of empty spots on my desk these days.

Thanks to technology, I am finally within reach of “clean desk” status.

What’s the purpose of a desk, in an era when the computer reigns supreme? It’s a convenient place to stash the legal pads and pens that I still use, and I need its writing surface when I’m making a note. It’s a great platform for my collection of aging family photos, kid art, and things like little clocks or fancy penholders. And when people come into my office they can be pretty sure that it’s me sitting behind the desk, staring at the computer and tapping away at the keyboard.

But all of those empty spaces make you wonder how much longer people will be using large, impressive wooden desks. In the computer era, they’ve become almost an affectation, a power device, and a prop, and it’s fair to ask whether they will be part of the office of the future–that is, if offices as we know them will even exist.

Rereading Dune

Lately I’ve been taking a break from my Shakespeare Project–I’ve been on the road, and my Yale Collected Works of Shakespeare volume is massive and not exactly travel-friendly–so I’ve been reading other things. Most recently I picked up an old paperback edition of Frank Herbert’s Dune that was on one of our shelves and have read it for the first time since my college years.

I enjoy rereading favorite books, and Dune is a good example of why. When I read it as a youth, I was pulled in by the story and read it as fast as possible, wanting to find out what happened to Paul Atreides (aka Muad’Dib) and his mother Jessica and the evil, repulsive Baron Harkonnen. Reading it again, knowing how the story ends, allows for a much more leisurely journey, appreciating the really good writing and–especially–the monumental task of creating such a fully realized world, as Herbert did with the desert planet Arrakis, its melange, its sandworms, and its Fremen.

It’s an amazing accomplishment that, perhaps, isn’t as obvious to a young reader as it becomes to someone who has read a lot over the decades. There simply aren’t that many books out there that have captured an entire previously unknown civilization–its culture, its people, its ecology, its economy, its religion, its institutions, and its politics–so completely. Most fiction builds on the foundation of our existing world and its history and doesn’t have to create a civilization from the sand up, as Herbert did. George R.R. Martin’s Game of Thrones books are another example of that kind of accomplishment, one that shows just how rare such books are and how difficult they are to create.

And writing Dune clearly took a lot of work. The back story of Herbert’s creation of Dune should encourage unappreciated writers to keep at it. According to the Dune Novels website, it took Herbert six years to research and write Dune, and the book was rejected by 23 publishers before being accepted for publication. You can imagine how dispiriting it must have been to get those rejection letters after so much time and effort. Yet, according to one ranking, at least, Dune went on to become the best-selling science fiction book of all time and continues to hold that spot, nearly 60 years after it was published. Herbert’s years of labor produced a sci-fi classic that people will be enjoying for decades to come. I wonder how the publishers who casually rejected it feel about their decisions now.

The Headset Question

We’ve got a transition underway at our workplace. The phones on our desks are being removed, after decades of faithful service, and now we’ll be doing all of our calling through our computers. I’m okay with that. In the modern world, any technology that has been around for decades has done its job but almost certainly can be replaced by an improved approach. And getting rid of the desktop phone also means eliminating the annoying need to constantly untangle the cord connecting the handset to the rest of the phone.

With the elimination of the old phone, we’re being offered options. Apparently the sound quality when you simply talk into your computer on a phone call isn’t ideal for the person on the other end of the conversation. (And, in any event, you probably don’t want to encourage people to shout at their computers, anyway.) So we need to make a choice: do you go with a headset, or a speakerphone attachment?

Headsets probably make the most sense, but unfortunately I associate them with Ernestine, the snorting, cackling busybody character Lily Tomlin introduced on Laugh-In. There’s also a clear techno vibe to a headset, with a one-ear headset edging out the two-ear headset in the hip, technocool ranking. I frankly question whether I’m well-suited to either. So, I’m going for the speakerphone attachment as my first option, with one of the headsets a distant second in case the supply of speakerphones isn’t sufficient to meet demand.

It will be interesting to see whether speakerphones are a popular option, or whether my colleagues will go all-in on the headsets. I’m guessing that the choices will vary by age group, with the older set being more amenable to speakerphones–if only so they won’t hear “one ringy-dingy, two ringy-dingy” in that sniveling Ernestine voice whenever they use the headset to place a call.

Resume Building

I ran across an interesting article on CNBC about resumes–those printed, boiled-down summaries of a person’s educational and work life that job applicants fret about. The article said that, these days, 93 percent of employers want to see “soft skills” included on the resume, and eight of those attributes are in particular demand: “communication skills,” “customer service,” scheduling, “time management skills,” project management, analytical thinking, “ability to work independently,” and flexibility.

Resumes are always a product of the time period in which they are prepared, and some of the qualities identified by CNBC clearly reflect the recent COVID pandemic and the shift, for many employers, to remote or hybrid work. When people are working in different locations and connecting through technology, “communication skills” that help everyone keep track of the status of their joint project are a lot more important than they would be if all of the team members were working 9-5 Monday through Friday in offices just down the hallway. Similarly, the ability to work independently, time management, and flexibility have obvious value in a remote or hybrid work environment.

Many of the “soft skills” mentioned in the CNBC article, though, seem like characteristics that you would want to mention in describing your work experience, irrespective of COVID or remote work considerations, because they really all illuminate different facets of good employees and good supervisors. Good employees obviously care about customer service, manage their time efficiently, and exhibit flexibility and analytical thinking as they do their jobs. Good supervisors are good communicators, come up with rational schedules for the work their teams are doing, and display project management skills. None of these “soft skills” should come as a surprise to anyone.

Happily, I haven’t had to prepare a resume for decades, but it seems like the resume experts are always coming up with new approaches and techniques and emphases in response to changes in the workplace. When I last prepared a resume back in the ’80s, for example, the prevailing view was that you needed to include an “interests” section to show that you were a multi-faceted human being and not some soulless working automaton, and also to provide fodder for job interview conversation. The debate then raged about what kinds of interests would be appealing, yet safe, and which were too edgy. “Reading” and “travel” were viewed as prudent choices, I seem to recall, but you might not want to indicate that you were a professional wrestling fan or enjoyed attending comic book conventions.

In the arc of a person’s resume life, you go from stretching your education and work experience to fill a page–and trying to come up with a description of your summer job that makes it sound meaningful–to the point where you have more than enough material and are simply trying to hold the puffery to a minimum. But the point of the resume is the same: how do you put words on a page that show you would be a good member of the team, given the current circumstances? That ultimate goal really hasn’t changed.

Changing Of The Guard

If you’re not thrilled with your current job, at least recognize that you could be doing something worse–like swapping out chemical toilets at a downtown Columbus COTA bus stop on a hot summer afternoon. You can only imagine the delightful odors these poor guys were experiencing.

There might be worse jobs than that, but I really can’t think of any offhand. Can you?

Arty Party

Our firm had a party tonight at the Columbus Museum of Art. It’s a great venue for a party. We started outside in the garden, where we got to enjoy vistas like that shown in the photo above, then we moved inside for food, drinks and karaoke. Who would have thought that our law firm had so many singers? After midnight the staff had to kick us out.

Downtown Columbus has a lot of good party spots. The Art Museum is one of them.

When The Supply Chain Issues Hit The Office

Several years ago, our office went from the old-fashioned Bunn coffee maker that made entire pots of coffee to Flavia coffee machines that make one cup of joe. The Flavia machines use little packets of coffee, like those pictured above, that you insert into the machine to get your brew. My coffee of choice is the Pike Place roast. It’s a medium roast coffee that Starbucks describes as follows: “A smooth, well-rounded blend of Latin American coffees with subtly rich notes of cocoa and toasted nuts, it’s perfect for every day.”

And I do, in fact, drink it every day when I’m in the office. Multiple times every day, in fact.

Yesterday we ran out of the Pike Place, which caused me to experience a momentary flutter of disquiet. Later in the day, the guy who restocks our coffee stopped by to refill the supply of Flavia packets. I was relieved to see him and told him I was sorry I had guzzled so much of the Pike Place. He shook his head sadly and explained that there was no Pike Place to replenish the supply on our floor. Our firm was completely out of it, and when he called the warehouse to see why our order hadn’t been delivered, he was told that the local warehouse was out of it, too. He then put up a hand-lettered sign above the coffee machine to explain the situation, in hopes that it would prevent Pike Place drinkers from rioting in the hallways.

We’ve all heard of the supply chain issues that the country is experiencing, post-pandemic. I had not heard of coffee being affected, but apparently I wasn’t paying attention: there have been stories about the coffee supply being hit by bad weather and shipping delays, and snafus caused by congestion at ports have compounded the problem.

Of course, in the grand scheme of things a shortage in one particular coffee packet isn’t the end of the world; I can just shift to Cafe Verona or even (horrors!) decaf in a pinch. (There always seems to be a very ample supply of decaf, doesn’t there?) But the tale of Pike Place coffee packets in one office in one city shows just how precarious the supply chain can be.

Cutting The (Linguistic) Mustard

Recently I mentioned, with some asperity, that a particular effort didn’t “cut the mustard.” Two of my colleagues looked at me in bewilderment. They’d apparently never heard the phrase before, and had no idea that “cutting the mustard” meant meeting a desired standard of performance. To them, it was just another inexplicable saying that would have to be added to their growing list of quaint “Bobisms.”

Where does “cut the mustard” come from? Like many idioms, its lineage is disputed. Some sources contend it is British in origin and refers to the physical act of cutting down mustard plants, which requires sufficiently sharp tools; dull tools therefore would not “cut the mustard.” Others believe that it is an Americanism, perhaps originating in Texas, where a use of the phrase was found in a Galveston newspaper in the 1890s. O. Henry also used “cut the mustard” in some of his popular short stories in the early 1900s, which may have helped to spread the saying throughout the United States. One source argues that mustard has long been associated with being strong or sharp, and “cutting the mustard” relates to that notion.

I have a related, but slightly different, theory: I think that because mustard can be so powerfully flavored, the other ingredients of your sandwich or dinner must be sufficiently tasty to hold their own and make their presence known. I’m guessing that, out on the dusty plains of Texas, a cowboy took a bite into a sandwich and realized that the meat and other sandwich makings were so insubstantial and bland that they were overwhelmed by the pungent mustard. He then packed his saddlebags, spurred his horse, and ruefully concluded that the unsatisfying sandwich wouldn’t cut the mustard.

Can it really be that “cut the mustard” has passed totally out of usage by anyone under, say, 60? If so, that’s too bad. It’s one of those idioms that adds flavor–pun intended–to our language.