Gorilla Resilience Training

“Resilience”–generally defined as the ability to respond and adapt to challenging situations and to keep going in the face of trauma and adversity–is a prized commodity these days. Many businesses seek to encourage the development of enhanced resilience skills in their employees and offer training to help them become more resilient. Indeed, in many jobs where performance often has to occur in times of stress or under trying circumstances, resilience is a quality that may prove to be the difference between success and failure.

A recent study indicates that your next resilience training session might be taught by a gorilla, or at least draw some tips from their approach to life.

The study, undertaken by the University of Michigan, shows that gorillas are amazingly resilient–more so than humans and other animal species. The study focused on examining gorillas who had experienced trauma, such as the death of their mother, at an early age. In many species, such early life adversity is associated with shorter life spans and additional problems later in life. Gorillas apparently are different. The U of M research revealed that the more adversity gorillas experienced, the more likely they were to die young–but if they survived to the age of six, their lifespans were not shortened. In fact, gorillas who survived three or more early childhood traumas were more likely to live longer than other gorillas.

Why are gorillas more resilient than other species? The researchers who undertook the study believe that one reason is the tight-knit social structure of gorilla communities, where a young gorilla whose mother has died is not left alone, but instead is adopted and supported by the whole clan. They also suspect that the resource-rich environment in which gorillas live helps, by not adding additional stresses, like the need to constantly search for sufficient food, on top of the trauma. And, in some respects, the ability of certain gorillas to overcome devastating life-reversals may simply be an example of “survival of the fittest.”

We can learn from gorillas, and anyone who has worked under stressful circumstances will likely agree on one lesson: adversity and stress are more easily borne if they are shared, and it is a lot easier to be resolute and carry on if you are part of a good team.

The Birds’ Time

I got up early this morning, just before dawn, to do some work. I sat outside on the porch of our hotel room, cooled by a freshening breeze, and was serenaded by the calls of seemingly dozens of birds making their presence known from the mountainside out into the Oro Valley. The different bird sounds stand out dramatically in the pre-dawn stillness, uninterrupted by the sounds of passing cars or other human-generated noise.

Dawn obviously is the time for desert-dwelling birds to exercise their vocal cords. The sounds range from hollow-sounding, owl-like hoots to chittering, piping, warbling, and twittering. The different calls fit well together, producing a combination of sounds that is like a feathered symphony.

Sitting outside on a cool morning and listening to birdsong is a very peaceful way to start the work week.

The NYC Rat Czar

Job titles are important to many workers. At some banks, for example, it seems that virtually every employee is an assistant vice president. Many employees want to have important-sounding titles, and many businesses are perfectly happy to accommodate that desire–especially if the employer can offer title changes in order to hold down the size of raises.

New York City has now taken job titles to a whole new level. The Big Apple has its first-ever “rat czar.” Kathleen Corradi’s official job title is “Director of Rodent Mitigation,” but of course “rat czar” sounds a lot cooler. When the czar was introduced to the news media yesterday by NYC Mayor Eric Adams, Ms. Corradi promised: “You’ll be seeing a lot of me and a lot less rats.” Fans of ’60s TV shows no doubt will wonder whether her efforts to get out into the city and hunt down the filthy, disgusting creatures will be called the “Rat Patrol.”

What sort of resume do you need to become a “rat czar”? Ms. Corradi, who was one of 900 applicants for the job, has been an elementary school teacher, land use expert, and garden coordinator at the Brooklyn Botanic Garden and was an advocate for anti-rat measures in her neighborhood. She will be tasked with trying to rid New York City’s streets and alleys of the garbage and food waste that rats love and deciding which products the City should use to exterminate the existing rat population.

One recent estimate concluded that there are about 2 million rats in New York City–and the population has soared in recent decades. The “rat czar” has a big job ahead of her. I hope she gets a cool uniform to go with the title, too.

Waiting For The Axe To Fall

McDonald’s recently announced that it would be implementing layoffs. Earlier this week, it temporarily closed its U.S. corporate headquarters, in Chicago, and asked all HQ employees to cancel in-person meetings and work from home as the company distributed layoff notices. It isn’t clear how many employees will be laid off; world-wide, McDonald’s has some 150,000 employees in corporate roles and working in company-owned restaurants.

According to an internal email obtained by the news media, McDonald’s told employees it was distributing layoff notices remotely to ensure the “comfort and confidentiality” of its employees. In short, McDonald’s evidently believes that employees would rather wait for the axe to fall and then react privately at their homes, rather than hanging out for the possible visit from the Grim Reaper in the office, with their fellow corporate employees.

I’m not sure that is the case. If I knew my job was potentially on the chopping block, I would rather wait for the news coming down from the C suite in the office, with my co-workers. Being alone at home and wondering whether every ping announcing a new email was the one that would determine my fate would be too nerve-wracking. At least at the office, I would have some company while I bided my time, and as the news was distributed I’d have a better sense of who got the pink slip, and how widespread the distribution was in my area. That way, the Band-Aid would be ripped off, everyone would know the news, and people wouldn’t have to come into the office the following Monday and repeatedly, and sheepishly, announce their fate as they encountered their individual co-workers.

The McDonald’s news is interesting, and not just because of its decision to briefly close its corporate offices while layoffs are announced. McDonald’s net income has increased recently, but the company’s CEO thinks its structure and organization are outdated, and he says that “we will evaluate roles and staffing levels in parts of the organization and there will be difficult discussions and decisions ahead.” Some wonder whether the layoffs at McD’s are part of an ongoing “white-collar recession,” as companies that are pessimistic about the current economic outlook get rid of corporate jobs while retaining production-level workers.

Who knows? If we are in the midst of a “white-collar recession,” we may see a lot more companies having to make decisions, as McDonald’s did, about where employees should be as they wait to hear about their jobs.

Don’t Let Them Eat Cake

In Great Britain, the chairwoman of the Food Standards Agency, Professor Susan Jebb of the University of Oxford, is mightily concerned about the nation’s health and the obesity epidemic affecting many Brits. Among the targets of her ire are people who bring cake into the office–something she considers to be as harmful as exposing your co-workers to secondhand smoke.

Professor Jebb’s basic point is that you simply can’t rely on the personal willpower of people who are exposed to the tantalizing prospect of free cake. The Times article linked above quotes her as follows: “’We all like to think we’re rational, intelligent, educated people who make informed choices the whole time and we undervalue the impact of the environment,’ she said. ‘If nobody brought in cakes into the office, I would not eat cakes in the day, but because people do bring cakes in, I eat them. Now, OK, I have made a choice, but people were making a choice to go into a smoky pub.’” She raised the smoking issue because passive smoking harms others, and “exactly the same is true of food.” The upshot, in her view, is that Great Britain needs to provide a “supportive environment” to help individuals avoid bad choices that lead to weight gain.

Although Professor Jebb specifically singled out cake at the office as an example of the prevalence of bad food options at every turn, the bottom line for her is that Great Britain needs to regulate food advertising. She notes: “At the moment we allow advertising for commercial gain with no health controls on it whatsoever and we’ve ended up with a complete market failure because what you get advertised is chocolate and not cauliflower.”

If Professor Jebb is hoping to get to a society where cauliflower is vigorously advertised, I predict her efforts are doomed to failure. I also predict that her fellow Brits won’t look kindly on any potential restrictions on a co-worker’s ability to bring cake into the office.

Putting aside time-honored employee birthday cake events, people who bring leftover cake to the office want to get it out of their homes so they won’t be tempted by it, and people who eat cake at the office like to have a treat now and then. I’m not sure that trying to regulate cake offerings is going to prevent obesity, if that cake is then consumed at home rather than at the office. I don’t think regulating TV or billboard or radio advertising is going to get there, either, so long as cake mix is sold in stores and candy and snacks are available at the point of purchase to tempt people into taking the road to perdition.

The bottom line on obesity is that we need to build up the willpower of individuals, and incentivize them to watch their weight. Restricting cake at the office isn’t really getting at the root cause.

Five Steps To Glory

My cellphone spies on me. The phone and its ever-increasing array of apps, evidently added whenever I engage in one of the required software updates, seem to be constantly monitoring my activities, conducting some kind of unknowable, algorithmic analysis, and then sending me unwanted messages to announce their conclusions. As a result, I get weird, random notices like “you’re using your phone less this week than last week.” Since I don’t personally log the time I spend on my phone, I have no way of knowing whether these reports are accurate or not. I guess I just have to take my phone’s word for it.

This week I got a new message, one that I think came from an “exercise” app that was added in a recent software/operating system update. The message said something like: “Hey, you’re using the stairs more than you usually do!” My initial reaction was that it is creepy that my phone is tracking my stair usage and trying to function as a kind of clapping, enthusiastic personal trainer, urging me to get off my keister and continue to increase my daily count of steps. But then I wondered how in the world my stair count has increased, as I have not been making a conscious effort toward that goal.

After some careful consideration, I realized that the phone’s stairstep analysis had to relate to a domino-like series of events at work. The first domino was that the coffee maker on my floor stopped functioning. That meant that I had to walk over to the nearest working coffee maker, which happened to be one building over–a journey that requires me to go up and down the five stairs shown above. Add in the fact that I guzzle a ridiculous number of cups of coffee each work day, so that I have been constantly ascending and descending these five steps, and you evidently end up with enough stair usage for my phone to take notice and send along some encouragement.

My initial reaction to this realization was to be surprised that even a few trips up and down five steps would make a difference to my phone. Then I thought that maybe, to keep my phone pal happy, I should continue to use the coffee maker in the next building, even after my coffee maker is fixed. And I also started to think that maybe there were other things I could do to add a few additional stair-climbing episodes to my workday, so that my phone and its apps will be even more thrilled at my efforts.

Why should I care whether my phone thinks I’m a lazy lard-ass? I don’t know, but I do. Having a Type A, get-a-good-report-card mindset in the cell phone age has its challenges.

Brain-Picking

Yesterday was a red-letter day of sorts. On two separate occasions, people stopped by to ask, specifically, if they could “pick my brain” about something. It was interesting that two different people used that same precise phrase, rather than saying, for example, that they wanted to ask a question or discuss an issue. It got me to thinking about the phrase–which doesn’t exactly conjure up pleasant mental images–and what we know about its origin.

On-line sources discussing the phrase focus on the “pick” part and its relevance to eating. In days when food was scarce and a roast item was a special treat, “picking” was an important part of the meal. The diners picked at the bones of the roast fowl or the joint of mutton to try to retrieve and consume every last morsel of meat–the way your Dad probably did with the Thanksgiving turkey. This use of “pick” probably was derived from the pickaxe, a tool used since prehistoric times. “Pick” then became a way of conveying a precise form of extraction, and other uses–like nitpicking, pickpocketing, and toothpick–followed. It was probably inevitable that someone would use “pick your brain” to describe the process of getting advice or information. The on-line sources cite a letter from the 1800s that featured the phrase–although I suspect that it was commonly used before then.

Interestingly, a search for the phrase also yields an article saying that some professional people don’t like that idiom, because it isn’t precise enough and describes a one-sided transaction, with one brain being excavated for information and the other brain enriched as a result. That reaction seems a little thin-skinned to me. Sometimes specific questions can miss lurking issues that will be uncovered by a general question, and in any case it’s nice to think that people believe your brain is a resource that can be mined for some useful information. Besides, any conversation, even a one-sided one, yields new intelligence to be incorporated into the memory banks, ready to be displayed the next time the brain is picked.

Fake Smiles And True Feelings

People have thought about fake smiles for a long time–probably for about as long as human beings have walked upright and the act of smiling became associated with happiness. They are curious about how to distinguish a fake smile from a real one, and why people fake smiles in the first place. Researchers have even examined whether working in a job where you are supposed to give a cheery smile to even unpleasant customers for your entire shift is likely to make you drink more at the end of the work day. (Spoiler: it looks like it does.)

But what about fake smiles outside the workplace, where you don’t have to give that grimace grin for eight hours while interacting with jerky customers? Does forcing a smile make you feel happier? This question has been the subject of scientific debate for so long that even Charles Darwin weighed in on the topic. In The Expression of the Emotions in Man and Animals, Darwin argued that “even the simulation of an emotion tends to arouse it in our minds”–but different studies over the years have produced different results.

Recently researchers decided to test the hypothesis, again, with a study of 3,800 people from 19 countries who were asked to respond to different prompts with a smile or a neutral expression, and then rate their happiness. The prompts were disguised, and mixed in with other facial expression requirements and even math problems, so participants presumably didn’t know that they were involved in testing whether a fake smile actually produced a happier perspective. The results suggest that faking a smile does, in fact, tend to make the fake smiler feel incrementally happier, at least in the short term.

So old Chuck Darwin apparently is right again, and forcing a grin will cause momentary changes in attitude–at least so long as keeping that fake smile on your face isn’t one of the requirements for your job at the neighborhood coffee shop.

Virtual Tact

The Harvard Business Review recently carried an article on how to tactfully interject in a virtual meeting. “Tact” is a quality that you don’t see often associated with computer-based communications. On social media, for example, the full frontal attack often seems to be the preferred method of making a point, and one of the problems with email is that it’s far too easy to fire off a blast that you regret almost as soon as you hit the send button.

Virtual meetings, though, are a setting where trying to avoid offending colleagues and coming across as a rude jerk makes proceeding with tact an important consideration. At the same time, however, the virtuality can make it difficult to politely interject and make your point (particularly if you forget you are on mute). In-person meetings always seem to present an opportunity to have your say before the meeting breaks up and people leave the conference room, but the virtual context can be a barrier to participation. Sometimes, acting with tact seems to be at war with the need to contribute to the discussion, even if it means interrupting the flow.

So, what to do? Obviously, the first step is to self-edit a bit, and consider whether your point is really all that important. But if you conclude that it is, the HBR article suggests “signaling your interest” by using the “raise hand” feature, unmuting, using the chat feature to indicate you’d like to say something, or “gently rais[ing] your physical hand if you’re on video.” (The “gentle” means you shouldn’t make a ridiculous spectacle out of raising your hand, like Horshack on Welcome Back, Kotter.) Other tactful techniques include reviewing the agenda in advance and letting the presenter know that you’d like to address some of the topics, or waiting until a natural break in the presentation to interject. The article even suggests some tactful phrases you can use as you are breaking in.

The last point in the article, however, is “be assertive when necessary.” Sometimes, visual signals don’t work–this is especially true when a PowerPoint is being presented, and the visual of you has been shrunk down to postage stamp size–and there simply might not be an obvious break where you can step in with your trenchant point. Tact is a valued quality, but you don’t want to have the meeting end without making your contribution, which could affect the next steps to be taken. Sometimes, tact and doing your job just don’t mix.

Understanding Mr. Green Jeans

When I was a kid, I enjoyed watching Captain Kangaroo. I liked the Captain, of course, and Dancing Bear and Mr. Moose and Bunny Rabbit, but my real favorite was Mr. Green Jeans. He would come on the show, wearing his trademark green jeans and usually a straw hat and flannel shirt, perhaps play a guitar or sing a song with the Captain, and maybe show you a plant or animal and talk about it. But Mr. Green Jeans was at his best in helping Mr. Moose and Bunny Rabbit play a gentle prank on the Captain–one that usually involved the Captain getting showered with dropped ping pong balls. It was a gentle prank for a gentle show.

I was thinking about Mr. Green Jeans the other day in connection with the gradually dawning concept of people having jobs. As adults, we’ve lived with the concept of work for so long that we’ve forgotten that the notion of people getting paid to do something isn’t necessarily intuitive, and has to be learned like other lessons of the world. For me, at least, Mr. Green Jeans and Captain Kangaroo were part of that process.

At first, a very young watcher would take a show like Captain Kangaroo at face value, as if the broadcast somehow gave you a brief peek into the actual life of the Captain, Mr. Green Jeans, and their friends. At some later point, you come to understand, perhaps because your Mom patiently explained it to you, that the show wasn’t “real,” in the same way life in your home was real, and that Mr. Moose and Bunny Rabbit were just puppets, and that Captain Kangaroo was a show put on for kids like you to watch and enjoy.

Later still came the realization that Captain Kangaroo and Mr. Green Jeans were actors, that being on the show was their job–hey, just like your Dad left every day to go to his job!–and that the Captain and Mr. Green Jeans were getting paid to be on the show. That last step in the understanding process was a big one, because it required you to get the concept of money, too, and why people needed to work, so they could eat and have a house and clothes and a car–and the fact that you would undoubtedly need to work, too, at some point. It was part of a bigger realization that the world was a complicated place, and there was a lot more to it than the Captain reading stories and pranks involving ping pong balls.

By then, as you watched Captain Kangaroo with your younger siblings, you thought that being Mr. Green Jeans would be fun. But by then your sights had changed a bit, and your friends were talking about being firemen or astronauts when they grew up.

“Quiet Quitting” And Labor Day

Happy Labor Day! On this day set aside to celebrate working people–and give them a day off, too–it’s worth spending a few minutes thinking about work and jobs and a supposedly recent development in the labor sector: “quiet quitting.”

“Quiet quitting” has been the subject of a lot of discussion recently, in articles like this one. It’s a seemingly elastic concept that can mean different things to different people. For some, the notion is all about setting boundaries; you will work hard during the normal workday but not take on additional responsibilities that would intrude into your private life and produce burnout. For others, it means doing the least amount of work needed to avoid getting fired by an employer who recognizes that, in the current labor market, it may not be able to find someone better to fill the position. “Quiet quitting” evidently got that name on TikTok, where “quiet quitters” have been posting videos about their decisions.

Of course, “quiet quitting” might have a modern brand, but the underlying idea is nothing new. Anyone who has worked for any length of time has had “quiet quitters” as co-workers. I remember some from my first job, as a “bag boy” at the Big Bear grocery store in Kingsdale Shopping Center circa 1973. They were the guys you didn’t want to get matched up with on a project, like retrieving abandoned carts from the parking lot so the in-store supply was fully stocked. You knew they would retrieve a few carts at a deliberate pace, but you would do most of the work so the two of you wouldn’t get reprimanded by the boss. I quickly decided that I didn’t want to be a “bare minimum” guy, always at risk of getting canned, but since then I’ve also been fortunate to have jobs in my working career that I found interesting and well worth the investment of some extra, “off the clock” time.

Is “quiet quitting” a bad thing? I don’t think it is, but in any event it is a reality. The labor market, like the rest of the economy, is subject to the law of supply and demand. “Quiet quitting” is a product of the invisible hand at work; it reflects the fact that the demand for workers right now exceeds the supply. There is nothing wrong with sending a message to an employer that employees won’t put up with having new responsibilities piled on their plate without fair compensation–that’s one of the signals that allows the invisible hand to work.

But “quiet quitting” also has a potential cost, and a potential risk. The cost might be the impact on your self-perception and your reputation among your co-workers, as well as the chance you might be developing the habit of settling rather than going out and finding a new job that is better suited to your interests. The risk is that the balance of supply and demand in the labor market shifts–giving the employer the option of upgrading the workforce, leaving the “quiet quitters” without a job and, perhaps, without a recommendation as they look for a new one.

Another Empty Spot On The Desk

Our IT staff came and took away my old office land-line phone recently, as I have now fully transitioned to communication through my computer. It leaves the empty spot on my desk shown above. That gleaming empty spot now joins other empty spots that have been created over the years, as once-essential workplace items have been pitched into the dustbin, their functionality entirely absorbed into the mighty, all-purpose desktop computer.

Once my desk held a dictaphone, a telephone, a speakerphone attachment, a hole punch gizmo, and a stapler. All are now gone. The flip-top calendar that I have had for years won’t be far behind; I’ve stopped using it in favor of relying entirely on my computer’s calendar. And the other essential purpose of a desk–to hold the piles of papers that I’m working on–also is falling by the wayside. I’m old school and still print out some documents to review in hard copy form, but the amount of paper in my office is a small fraction of what it once was, with most of the reviewing and editing work being done entirely on the computer. In short, there are a lot of empty spots on my desk these days.

Thanks to technology, I am finally within reach of “clean desk” status.

What’s the purpose of a desk, in an era when the computer reigns supreme? It’s a convenient place to stash the legal pads and pens that I still use, and I need its writing surface when I’m making a note. It’s a great platform for my collection of aging family photos, kid art, and things like little clocks or fancy penholders. And when people come into my office they can be pretty sure that it’s me sitting behind the desk, staring at the computer and tapping away at the keyboard.

But all of those empty spaces make you wonder how much longer people will be using large, impressive wooden desks. In the computer era, they’ve become almost an affectation, a power device, and a prop, and you wonder if they will be part of the office of the future–that is, if offices as we know them will even exist.

Rereading Dune

Lately I’ve been taking a break from my Shakespeare Project–I’ve been on the road, and my Yale Collected Works of Shakespeare volume is massive and not exactly travel-friendly–so I’ve been reading other things. Most recently I picked up an old paperback edition of Frank Herbert’s Dune that was on one of our shelves and have read it for the first time since my college years.

I enjoy rereading favorite books, and Dune is a good example of why. When I read it as a youth, I was pulled in by the story and read it as fast as possible, wanting to find out what happened to Paul Atreides (aka Muad’Dib) and his mother Jessica and the evil, repulsive Baron Harkonnen. Reading it again, knowing how the story ends, allows for a much more leisurely journey, appreciating the really good writing and–especially–the monumental task of creating such a fully realized world, as Herbert did with the desert planet Arrakis, its melange, its sandworms, and its Fremen.

It’s an amazing accomplishment that, perhaps, isn’t as obvious to a young reader as it becomes to someone who has read a lot over the decades. There simply aren’t that many books out there that have captured an entire previously unknown civilization–its culture, its people, its ecology, its economy, its religion, its institutions, and its politics–so completely. Most fiction builds on the foundation of our existing world and its history and doesn’t have to create a civilization from the sand up, as Herbert did. George R.R. Martin’s Game of Thrones books are another example of that kind of accomplishment, one that shows just how rare such books are, and how difficult they are to create.

And writing Dune clearly took a lot of work. The back story of Herbert’s creation of Dune should encourage unappreciated writers to keep at it. According to the Dune Novels website, it took Herbert six years to research and write Dune, and the book was rejected by 23 publishers before being accepted for publication. You can imagine how dispiriting it must have been to get those rejection letters after so much time and effort. Yet, according to one ranking, at least, Dune went on to become the best-selling science fiction book of all time and continues to hold that spot, nearly 60 years after it was published. Herbert’s years of labor produced a sci-fi classic that people will be enjoying for decades to come. I wonder how the publishers who casually rejected it feel about their decisions now?

The Headset Question

We’ve got a transition underway at our workplace. The phones on our desks are being removed, after decades of faithful service, and now we’ll be doing all of our calling through our computers. I’m okay with that. In the modern world, any technology that has been around for decades has done its job but almost certainly can be replaced by an improved approach. And getting rid of the desktop phone also means eliminating the annoying need to constantly untangle the cord connecting the handset to the rest of the phone.

With the elimination of the old phone, we’re being offered options. Apparently the sound quality when you simply talk into your computer on a phone call isn’t ideal for the person on the other end of the conversation. (And, in any event, you probably don’t want to encourage people to shout at their computers, anyway.) So we need to make a choice: do you go with a headset, or a speakerphone attachment?

Headsets probably make the most sense, but unfortunately I associate them with Ernestine, the snorting, cackling busybody character Lily Tomlin introduced on Laugh-In. There’s also a clear techno vibe to a headset, with a one-ear headset edging out the two-ear headset in the hip, technocool ranking. I frankly question whether I’m well-suited to either. So, I’m going for the speakerphone attachment as my first option, with one of the headsets a distant second in case the supply of speakerphones isn’t sufficient to meet demand.

It will be interesting to see whether speakerphones are a popular option, or whether my colleagues will go all-in on the headsets. I’m guessing that the choices will vary by age group, with the older set being more amenable to speakerphones–if only so they won’t hear “one ringy-dingy, two ringy-dingy” in that sniveling Ernestine voice whenever they use the headset to place a call.

Resume Building

I ran across an interesting article on CNBC about resumes–those printed, boiled-down summaries of a person’s educational and work life that job applicants fret about. The article said that, these days, 93 percent of employers want to see “soft skills” included on the resume, and eight of those attributes are in particular demand: “communication skills,” “customer service,” scheduling, “time management skills,” project management, analytical thinking, “ability to work independently,” and flexibility.

Resumes are always a product of the time period in which they are prepared, and some of the qualities identified by CNBC clearly reflect the recent COVID pandemic and the shift, for many employers, to remote or hybrid work. When people are working in different locations and connecting through technology, “communication skills” that help everyone keep track of the status of their joint project are a lot more important than they would be if all of the team members were working 9-5 Monday through Friday in offices just down the hallway. Similarly, the ability to work independently, time management, and flexibility have obvious value in a remote or hybrid work environment.

Many of the “soft skills” mentioned in the CNBC article, though, seem like characteristics that you would want to mention in describing your work experience, irrespective of COVID or remote work considerations, because they really all illuminate different facets of good employees and good supervisors. Good employees obviously care about customer service, manage their time efficiently, and exhibit flexibility and analytical thinking as they do their jobs. Good supervisors are good communicators, come up with rational schedules for the work their teams are doing, and display project management skills. None of these “soft skills” should come as a surprise to anyone.

Happily, I haven’t had to prepare a resume for decades, but it seems like the resume experts are always coming up with new approaches and techniques and emphases in response to changes in the workplace. When I last prepared a resume back in the ’80s, for example, the prevailing view was that you needed to include an “interests” section in your resume to show that you were a multi-faceted human being and not some soulless working automaton, and also provide fodder for job interview conversation. The debate then raged about what kinds of interests would be appealing, yet safe, and what were too edgy. “Reading” and “travel” were viewed as prudent choices, I seem to recall, but you might not want to indicate that you were a professional wrestling fan or enjoyed attending comic book conventions.

In the arc of a person’s resume life, you go from having to stretch your education and work experience to fill a page, and trying to come up with a description of your summer job that makes it sound meaningful, to the point where you have more than enough material and are simply trying to hold the puffery down to a minimum. But the point of the resume is the same: how do you put words on a page that show you would be a good member of the team, given the current circumstances? That ultimate goal really hasn’t changed.