Tinkering With The “Work Week”

A New Zealand company called Perpetual Guardian, which manages trusts and estates, decided to experiment with moving its 250 employees to a four-day work week.  In the experiment, employees worked four eight-hour days rather than five, and researchers from Auckland University of Technology studied the results.

The experiment worked so well that Perpetual Guardian has decided to permanently implement a four-day work week option.  The researchers found that, during the trial period, there was less absenteeism; employees showed up on time, didn't leave early, and took fewer breaks.  The employees also reported increased productivity, more energy and focus, lower stress, and a better work-life balance under the new system.  The trial further indicated that workers at Perpetual Guardian identified where time was being wasted — such as in unnecessarily long meetings or office chatter — and changed their practices so they could get their work done in a shorter work week.

And, because the Perpetual Guardian workers are completing the same quantity of work under the new system, they’ll continue to be paid what they were being paid for working a five-day week.

It all sounds good, but would it work in the United States?  During my more than 40 years of working, changes to the standard 9-5 five-day work week — whether shorter working days or fewer of them — have been the Great White Whale of workplace reformers . . . and the five-day work week still generally prevails.  But over that same 40-year period, many standard practices have changed.  Leaves of absence and work-at-home arrangements are much more common.  Workplace attire rules are much more relaxed.  And employers generally seem to be a lot more flexible about letting workers take time off to pick up kids or take an aging parent to a doctor's appointment.

Of course, the morphing of the 9-5 five-day work week has worked in the opposite direction, too.  With the advent of smartphones and laptops, white-collar workers are no longer tied to their office desks — and many find themselves toiling after hours and on weekends to answer emails or finish reports.

Will the four-day work week catch on?  I'm skeptical — not because it's not workable, but because I think the old days of standard, across-the-board practices applying to all workplaces and all businesses are behind us.  Technology is allowing employers to shape their practices to their individual needs.  For some employers, it might be a four-day week; for others, an understanding that certain work needs to get done, without much concern about when or where that occurs; and for still others, something entirely different.  And employers seem to have a much better attitude about the need to keep productive, capable workers on the job, even if it means bending or changing rules to accommodate their needs.  I'm convinced that the American workplace will continue to morph.

The Peak Productivity Period

When, during the standard work day, work week, and work year, are employees at their most productive?  One company took a look at the issue and tried to come to some conclusions based on quantifiable data.

The study looked at Redbooth, a project management software company, and examined an anonymized data set of 1.8 million projects and 28 million discrete tasks.  It concluded that the peak productive time on any given day is 11 a.m., the most productive day is Monday, and the most productive month is October.  At the other end of the spectrum, the fewest tasks are completed after 4 p.m., the least productive day is Friday, and the lowest percentage of tasks is completed in January.

You can’t draw really meaningful conclusions from one study of one company in one industry, of course, and it would be interesting to know how those 28 million separate “tasks” were defined.  (Is logging on to your computer a “task”?  How about submitting your time records to your boss, or sending a quick status update email versus a full-blown report?)  Nevertheless, the study seems to confirm what should be obvious — productivity ebbs and flows during the work day, work week, and work year.

I’m also convinced, based on my own work history, that productivity is uniquely individualized and varies a lot with the circadian rhythms, personality types, and social mores of individual workers and individual workplaces.  I feel like I am at my most productive first thing in the morning, when I can get in early and immediately knuckle down to work, with fewer phone calls, workflow interruptions, and distractions; I’m not a big late-night worker except in emergencies.  Other people get to the office later, like to do some visiting to start their day, and seem to pick up steam as the day goes on and the night hours arrive.  Averages tend to smooth out the real, material differences between people’s work habits and practices.

The one conclusion from the study that most surprised me was the productivity variance between seasons and months.  I would have bet that the winter months were the most productive — in the Midwest, at least.  When your alternative is raw, cold weather, a bustling day at the office looks pretty good by comparison.

The Office Microwave Smell Zone

Yesterday I was walking down the office hall at about 11:30 when I encountered a sphere of odor so pungent it had an almost physical impact.  It had the kind of potency that made me think “Whoa!” and quicken my step to get away as quickly as possible.

Yes, I was passing the office microwave.  There’s a reason why, on virtually every floor in our firm, the office closest to the microwave is vacant.  Unless you’ve experienced a tragic childhood accident that cost you your sense of smell, you’re going to get away from the zone of noxiousness at the earliest possible opportunity.

In our office, around the lunch hour, the microwave area is a kind of no-go zone.  During the morning, the machine might be used for more innocent activities, like coffee warming or preparing a bowl of instant oatmeal.  But at lunchtime, the appalling aromas emerge.  Maybe it’s that kind of preservative-laden putrescence that inevitably accompanies bad takeout Chinese food or one of those ready-made diet meals.  Perhaps it’s that overcooked-to-the-edge-of-burnt aroma that you get from some home-cooked leftovers.  Or you might be treated to the thin, almost tinny taint of reheated tuna fish casserole that paints a firm mental image of a congealed mass of overdone noodles so hard you could break a tooth if you took a bite.

And then there’s reheated fish, which is easily the worst of all.  It’s quite possible that minor Balkan wars have been started over people on some new diet who insist on heating up fish in the microwave so they can stick to a strict regimen.  Microwaved fish is almost certainly the biggest cause of hysterical, pathetically pleading, exclamation-pointed, passive-aggressive signage in the office.  (“Will whoever is using the microwave to reheat fish please have mercy on us and stop!!!”)  And, when someone transgresses and uses the microwave for fishy purposes, the smell seemingly never fully vanishes.  It lingers, like the guest who wouldn’t leave, and ultimately sinks down into the carpeting so that it can always stay with us.

In fact, conducting interrogations in the same room where people are microwaving fish could be a very effective method to break the will of terrorism suspects, but that tactic probably would violate multiple provisions of the UN’s Universal Declaration of Human Rights.

Irrefutable Visual Evidence That Starburst Candy Sucks

We bought too much candy for the wet and rainy Beggars’ Night in New Albany.  Or, more precisely, we bought too much of the wrong candy — namely, Starburst.

On Beggars’ Night, we had our customary basket of multiple candy options to offer trick-or-treaters.  Only the youngest and most inexperienced ghosts and goblins grabbed Starbursts.  Every other Halloweener dug furiously through the contents of the basket, like a dog clawing the ground to uncover a bone, in a desperate attempt to find Butterfingers, Reese’s minis, or even Skittles.  When the last trick-or-treater had rung the doorbell, taken a sad look at what was left in the basket, and departed with a painful sigh, we were left with enough Starbursts to float a small battleship.

We didn’t want them around the house, obviously.  No problem! I thought.  I’ll just take them to the office, plop them next to the coffee station on our floor, and the perpetually hungry denizens of the fifth floor will feel the urge of their sweet tooth and consume all of the candy in the blink of an eye.  Donuts, other baked goods, and anything with chocolate have been known to disappear faster than the speed of light, and occasionally there are tense standoffs as secretaries, paralegals, and attorneys eye the last brownie or piece of birthday cake.  So I put the Starburst in a bag, took it to work, and left it to be rapidly consumed.

Imagine my surprise, then, when I found this bag of Starburst still half full as I was leaving for the day at 6 p.m. tonight.  It is an unheard-of development that speaks volumes about the quality of the candy.  So I decided to conduct the crucial acid test and leave the bag for the overnight cleaning crew to enjoy.  If any Starburst are left tomorrow morning, it can only mean one thing:  Starburst candy truly sucks.

A Recarpeting Realization

Our firm is being recarpeted.  Somewhere, someone decided that our old, bland office carpeting needed to be replaced by new, bland office carpeting.  It’s being done overnight, office by office and floor by floor.

This means that all of the furniture in each office must be moved out so new carpet can be laid down.  More importantly — from the standpoint of the officeholder, anyway — it also means that every book, picture frame, desk toy, and scrap of paper that has been resting comfortably atop every desk, credenza, chair, and table also must be moved out.  Last night was my recarpeting night, so I’ve been in the box-up zone for the last few days.

A few observations from this experience:  First, dust is an amazing thing.  Once you start exposing hard-to-reach areas, you realize how much dust there is in the world.  Where does it come from?  If we didn’t have a nightly cleaning crew, everything in my office would be covered by an inches-thick layer of dust.

Second, how much of the stuff in your office do you even use?  I have shelves and a credenza lined with books.  During the pack-up period, I found books, like my old law school texts, that were dust-covered and stuck to each other, cover to cover, because they hadn’t been moved or opened in years.  I obviously don’t need them and — unlike photos of Kish and the boys — they don’t add much to the office atmosphere.  They once seemed solid and impressive with their solemn, thick covers; now it seems silly to keep them around.

In short, you don’t realize how much debris you’ve accumulated until you have to move it — and then you wonder why in the world you have it in the first place.

Two-Monitor Cool

Lately I’ve noticed that more and more attorneys at our firm seem to have two monitors for their computers.  When you walk past their offices you do a double-take.  (Bad pun alert!)

Why would attorneys need two monitors for their computers?  I’m not sure.  If you ask them, they give you some song-and-dance about how the extra monitor makes it easier for them to work with spreadsheets, or review documents, or perform some other important function.  I’m guessing, however, that their decision to add a monitor was motivated, at least in part, by their belief that it would make their office cooler.  And for the most part, it does!  Working with that second screen makes them look hip and sharp, like Tom Cruise in Minority Report.

There is a risk in this, of course.  The number of monitors on office desks could become a competition that rapidly escalates out of control, like the arms race or the size of fins on American cars in the ’50s.  If two monitors look rad, what would three look like, or four?  Soon offices could be like the Taxi episode where the Reverend Jim becomes motivated for a mysterious reason, works like a dog, and the other characters ultimately learn that he was saving all his money to erect an entire wall of TV sets.

And there’s a limit to what adding computer hardware can accomplish.  As the photo accompanying this article indicates, not everyone with extra monitors will necessarily look cool.

The Horrors Of Recycled Paper Napkins

Our office has long tried to be “green.”  We recycle paper products and aluminum cans.  We don’t use Styrofoam coffee cups.  And, recently, we started using recycled paper napkins at our coffee stations.  The napkins are brown and are proudly stamped with the green recycling symbol and the messages “Made with 100% recycled material” and “Save the environment, one napkin at a time.”

The napkins are in a dispenser right next to the sink and the coffee brewer.  Their intended use is plain:  they are supposed to help you as you rinse out your coffee cup in the morning and scrub out the remaining coffee film.  At this simple chore, however, the recycled paper napkins are a complete, abject failure.  A single napkin is so flimsy that it dissolves and falls to pieces at the slightest touch of liquid.  So, you use three napkins together — thereby saving the environment, three napkins at a time — but as you clean out the cup you realize that you are leaving behind moist, rice-sized paper pellets adhered to the bottom and sides of the cup.  The brown paper residue then needs to be swept from the cup, by hand.

We used to have sturdy, presumably non-recycled, napkins for this purpose.  One napkin filled the bill admirably and left you with a spotless, shining cup, ready to accept the morning’s first touch of black magic.  Now, what used to be a simple, mindless part of the morning routine has become a source of grim frustration, all because the environment-saving recycled napkins suck at their job.  The first rule of napkin technology should be, “no residue left behind.”  Our gossamer recycled paper napkins are not only incapable of removing existing residue; they compound the problem by leaving their own trail of brown crud.

I’m all in favor of recycling — really I am — but we shouldn’t be guilt-tripped into buying inferior recycled products that fail to perform their intended function.  No amount of “green office” awards can make up for the horrors of trying to use recycled paper napkins.