The Impact Of Tax Cheats

The Internal Revenue Service estimates that, each year, about 16.3 percent of the nation’s federal taxes go unpaid — and that’s after the IRS takes whatever action it takes to try to achieve compliance.  This “compliance gap” leaves a pretty big hole in the federal budget.  In 2018, if all of the federal taxes that were owed were actually paid, it would have meant another $643 billion in revenue for the federal government — which would have covered about 83 percent of our ridiculously large federal budget deficit.
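Just to put those figures in perspective, here’s a quick back-of-the-envelope calculation using only the numbers above; the implied deficit figure is an inference from those numbers, not something taken from the article.

```python
# Back-of-the-envelope check of the 2018 figures: if $643 billion in unpaid
# taxes would have covered about 83 percent of the deficit, the implied
# deficit is roughly $643B / 0.83.
unpaid_taxes = 643e9       # estimated unpaid federal taxes, 2018
share_covered = 0.83       # portion of the deficit that amount would have covered
implied_deficit = unpaid_taxes / share_covered
print(f"implied 2018 deficit: ${implied_deficit / 1e9:.0f} billion")   # about $775 billion
```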

Why don’t people just suck it up and pay what they owe?  That’s not a self-answering question.  The Government Accountability Office says there are three main reasons for non-compliance:  third-party reporting issues, reduced IRS budgets and staffing, and the complexity of the Internal Revenue Code.  The first and third reasons involve mistakes — where third parties don’t correctly report what a taxpayer has earned or received in a taxable transaction, or where a taxpayer has legitimately tried to figure out what they owe and simply been wrong — but the second category clearly relates to the ability of the IRS to ferret out, audit, and penalize those who are knowingly cheating.  In short, if you had perfect compliance, reduced IRS budgets and staffing wouldn’t make a difference.  And the lines between the three categories may be blurry, too.  If a taxpayer professes confusion about how to treat a particular source of income but adopts a stretched reading that dramatically minimizes their taxes, is that cheating, or a product of tax code complexity?

So, what can we do to improve the compliance numbers, recognizing that perfect, 100 percent compliance is an unattainable goal?  The answer to that question seems to turn on political inclinations and your view of human nature.  Some people, like the author of the article linked above, think that simplifying the tax code would result in a higher compliance rate — an argument that presupposes that people honestly try to figure out, and pay, what they actually owe.  Those in the other camp argue that increasing the IRS budget for oversight and compliance is the best way to promote compliance.  In short, if more people fear they’re going to get caught, the deterrent effect will lead a wider group of taxpayers to simply pay their taxes rather than risk audits and penalties.

There’s undoubtedly merit in both arguments, although being somewhat cynical about human nature, I tend to agree more with the latter camp — but it’s also true that neither of these solutions has much promise in the short term.  Tax simplification has been the Great White Whale of politics for as long as I’ve been filling out 1040 forms, and it never quite happens.  And campaigning for office on a platform of increased IRS funding and more aggressive tax enforcement doesn’t seem like the ticket to political success.

So we’re likely to bump along as we have been, with many people accepting their federal tax burdens, a segment of the population consciously cheating on their tax obligations, and a continually growing deficit because we can’t actually do something about the “compliance gap.”  It makes you wonder:  at some point, is that “compliance gap” going to grow even larger?

“Burn-out” As A Medical Condition

Every few years, the World Health Organization produces a new version of the International Classification of Diseases, a catalog of acknowledged medical conditions that is used as a diagnostic guide by health care providers.  With every new version of the ICD, there seems to be some controversy about whether or not a particular ailment or complaint should be recognized.

This year, the “should it be included or not” controversy swirls around “burn-out.”  Apparently there has been a long, ongoing debate about whether “burn-out” should be recognized as a medical condition, and the WHO has now weighed in with a “yes”:  the ICD-11 lists “burn-out” and defines it as “a syndrome conceptualised as resulting from chronic workplace stress that has not been successfully managed.”  According to the WHO, “burn-out” syndrome is characterized by “1) feelings of energy depletion or exhaustion; 2) increased mental distance from one’s job, or feelings of negativism or cynicism related to one’s job; and 3) reduced professional efficacy.”  Notably, the ICD-11 tries to draw a kind of line in the sand by stating that “burn-out” “refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.”

My guess is that many — if not all — workers have, at some point or another in their careers, experienced “burn-out” as defined by the WHO.  Jobs typically involve stress, and it’s almost inevitable that there will be periods where multiple obligations pile on top of each other, leaving the worker feeling overwhelmed, exhausted, and dissatisfied.  But . . . should “burn-out” be viewed as a medical condition?  What, exactly, is a doctor supposed to do for a patient who presents with classic “burn-out” symptoms — prescribe a three-month vacation, or a new job, or new job responsibilities, or a change in the patient’s workplace manager?  Will employers be required to allow leaves of absence, beyond their designated vacation periods, for employees whose doctors diagnose them with “burn-out,” and will health insurers be required to pay for vacations as a form of treatment?  By classifying “burn-out” as a diagnosable health condition, aren’t we really going far down the road of “medicalizing” common aspects of our daily lives?

And can “burn-out” really be limited to the “occupational context,” as the ICD-11 instructs, or will the same concepts underlying workplace “burn-out” ultimately be recognized in other areas, like family or marital or college “burn-out”?  Here’s a possible answer to that question:  the ICD-11 now recognizes video gaming, along with cocaine, alcohol, and gambling, as a potential source of addiction.

Monkey Head On A Bridge

When you walk to work, moving to and from the office at a deliberate pace, you notice things that speeding drivers simply don’t see — like this curious, colorful monkey head that has recently appeared on the Third Street bridge over I-70.  It looks to be made of carefully painted clay, and it is affixed directly to the concrete on the walkway side of the bridge overpass.

What’s the significance of the purple monkey head?  I freely admit that I gave that issue some thought as I walked by, but my analysis hasn’t gotten very far.  The head has the telltale Xs over its eyes that have long been cartoon artists’ way of indicating death, drunkenness, or unconsciousness, but other than that, I found nothing to tell me the backstory of the monkey head, or why it was placed on the bridge.  Google searches for drunken monkey, dead monkey, and unconscious monkey didn’t turn up anything particularly helpful, either — although they did make me aware of two things:  the scientific theory that the human taste for alcohol has deep evolutionary roots going back to our primate ancestors, who ate overripe, fermented fruit as a primary food source, and the fact that the Caribbean island of St. Kitts is also known as the Island of Drunk Monkeys because of the alcoholic likings of the green vervet monkeys brought to the island in the 1700s.  Alas, there doesn’t seem to be any connection between these stories and the purple monkey head on Columbus’ Third Street bridge.

Perhaps the monkey head is the start of some artist’s project, a la Christo, or some clever marketing campaign, where similar heads have been positioned in other parts of town and, after some kind of buzz is generated by curious people like me, we’ll learn that the monkey heads are advertising the introduction of some new restaurant or bar or rock band in the Columbus area?  Or maybe the monkey head is a tribute to someone who met his maker on the bridge.

Whatever the backstory is, I’m intrigued by the monkey head on the Third Street bridge.  I’d be interested in any theories about what the monkey head means, and why it is there.

Deepfaking Mona Lisa

These days, it’s hard to tell the real from the fake.  You never know if a quote, or a photo, or a Facebook meme is truthful or manufactured as part of some scheme or for some deep political purpose.  Video footage seems more reliable, but we’ve all seen examples of how careful editing can change the context and the perception.

Now, it’s going to get even harder to distinguish the real from the fake.  Advances in artificial intelligence and facial recognition software are allowing for the creation of increasingly realistic, seemingly authentic video footage that is in fact totally fictional.  The new word to describe the result is “deepfake,” which refers to the use of AI technology to produce or alter video to present something that didn’t occur in reality.  And the use of rapidly improving technology to produce deepfake video is erasing the boundaries that used to allow humans to spot video frauds by focusing on gestures, subtle facial movements, and other “real” human behavior that computers just couldn’t effectively simulate.  The avatars in even the most advanced video games still look like, well, avatars.

But that is all changing.  A team of engineers from the Samsung AI Center and the Skolkovo Institute of Science and Technology in Moscow has developed new algorithms that are far more advanced and successful in replicating realistic human faces.  The software is the product of studying thousands of videos of celebrities and ordinary people talking to cameras.  It focuses on “landmark” facial features and uses a neural network to convert those landmark features into convincing moving video.  The new software also self-edits by critically scanning the individual video frames it produces, culling out those that seem unnatural, and substituting improved frames.
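To make the idea a little more concrete, here’s a rough, hypothetical sketch of how a landmark-driven face-animation pipeline of that general kind fits together.  Every function in it is a placeholder stub of my own (a stand-in for the trained neural networks a real system would use), not the Samsung/Skolkovo code itself.

```python
# Hypothetical sketch: animate a single source image by following the facial
# landmarks of a "driver" video, retrying frames that fail a realism check.
# All of the functions below are toy stand-ins, not a real implementation.

import numpy as np

def extract_landmarks(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a facial-landmark detector (e.g., 68 (x, y) points)."""
    return np.random.rand(68, 2)

def generate_frame(source_image: np.ndarray, landmarks: np.ndarray) -> np.ndarray:
    """Stand-in for a generator network that renders the identity from the
    single source image in the pose and expression implied by the landmarks."""
    jitter = np.random.normal(0.0, 0.01, source_image.shape)
    return np.clip(source_image + jitter, 0.0, 1.0)

def realism_score(frame: np.ndarray) -> float:
    """Stand-in for a discriminator or realism critic used to grade frames."""
    return float(np.random.rand())

def animate(source_image, driver_frames, threshold=0.5, retries=3):
    """Turn one static image into a sequence of frames mimicking a driver video."""
    output = []
    for driver in driver_frames:
        landmarks = extract_landmarks(driver)      # what the driver's face is doing
        candidate = generate_frame(source_image, landmarks)
        for _ in range(retries):                   # "self-editing": redo unnatural-looking frames
            if realism_score(candidate) >= threshold:
                break
            candidate = generate_frame(source_image, landmarks)
        output.append(candidate)
    return output

if __name__ == "__main__":
    source = np.random.rand(256, 256, 3)           # placeholder for the single source image
    driver = [np.random.rand(256, 256, 3) for _ in range(30)]
    frames = animate(source, driver)
    print(f"generated {len(frames)} frames from a single source image")
```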

As a result of all of this, the new software can produce realistic video from a single, static image.  Take a look at the video of a chatty Mona Lisa embedded in this article, created from the application of the new software to the single image in the famous portrait by Leonardo da Vinci, and then tell yourself that it doesn’t look astonishingly, and disturbingly, realistic.  If Mona Lisa can talk, it sure seems like we’ve crossed a new boundary in the ongoing battle of real versus fake.

Like any new technology, the AI that allows for the creation of realistic video footage from a single image could have both positive and negative applications.  It’s just hard not to focus on the negative possibilities in the current era of fakery and fraud, and wonder how this new technology might be used for political dirty tricks or other chicanery.  We’re all just going to have to be increasingly skeptical about what is real and what is false, and realize that passing the “eye test” might not be much of a test any more.

A Taste of Grilling History

Memorial Day is probably more identified with outdoor grilling than any other day on the modern American calendar.  So . . . exactly when and how did Americans become so enamored with outdoor cooking, anyway?

Of course, humans have been cooking outdoors since our primitive ancestors first mastered fire hundreds of thousands of years ago, long before the dawn of recorded history.  But in the ensuing millennia, outdoor cooking didn’t advance much beyond the basics of skewering a piece of meat on a metal spit and turning it over flames or coals until the fat dripped off — which wasn’t exactly well-suited to people cooking for their families.

Charcoal has been made since the early days of human civilization, and was long used for smelting, blacksmithing, and other industrial processes.  After the individual charcoal briquet was invented in the 1890s, people tried cooking outside on various flimsy devices, but the traditional problem familiar to any outdoor cook — food that is burnt on the outside and undercooked on the inside, thanks to poor temperature control — proved constant, and as a result outdoor cooking remained unpopular.

In 1952, George Stephen, a welder at the Weber Brothers Metal Works in Chicago, came up with the idea for the first modern outdoor grill.  Apparently inspired by marine buoys, he devised a sturdy, stand-alone kettle grill with a lid for temperature control.  Later, the Weber grill design with the familiar dome was introduced, and the gas grill was invented in 1954.  Those inventions coincided with the development of the American suburb, the Baby Boom, and the rapidly growing American economy in the years after World War II and the Korean War, and soon every American household had its own outdoor grill on the patio of its suburban home.  It was only natural that the first big grilling weekend would be the Memorial Day weekend, when the improving weather marked the start of the outdoor grilling months.

Any kid who grew up in the ‘burbs in the ’50s or ’60s remembers sitting at a picnic table eating cheeseburgers and hot dogs cooked by the Dads in the neighborhood who were clustered around their grills — typically while they wore embarrassing cooking outfits and swigged Budweisers — while the Moms brought out the potato salad and buns and condiments and sported brightly colored cat-eye sunglasses.  There’s a reason why the Monkees sang about “charcoal burning everywhere” in their ode to the generic American suburb, Pleasant Valley Sunday.

Of course, grilling has advanced since then, but the association of Memorial Day with outdoor cooking remains strong.  On this Memorial Day, grill on, America!

The Coming Storm

It was clear it was going to rain this morning. Knowing that, you can go inside, shut the door, and watch TV.

Or, you can sit outside on your porch, drinking your morning coffee and listening to the thunderstorm approach from the west. You watch the sky over the neighboring houses grow dark and roiled, illuminated by the occasional flash of distant lightning, and listen to the booms and cracks grow steadily louder.

I prefer the latter course. Thunderstorms make a lot of interesting sounds — ultimately ending in the patter of sheets of rain striking roofs and patio umbrellas and the leaves on overhanging trees. And it’s interesting, too, how the birds respect the storm — they hold their chirps when the growling sky puts on its performance, fit a little snippet of song in between the rolls of thunder, and then find a quiet, sheltered spot when the rain ultimately comes.

I find the sounds of thunderstorms comforting. It’s a set of sounds that really hasn’t changed much since I was a kid. To this native Midwesterner, thunderstorms mean . . . Summer is finally here!

Working For The Three-Day Weekend

In the distant, early days of Homo sapiens, there was no concept of “work” in the modern sense, and thus there were no holidays, either. Every day involved its many toils, from hunting and gathering to working to find shelter and water and protection against predators.

Then, as civilization developed, designated jobs became an inevitable part of the process. No city could exist without people charged with performing essential functions like laboring in the fields to bring in the crops, delivering food from the countryside, serving as scribe for Pharaoh, or building the new pyramid or ziggurat.  The concept of holidays came later still. First, there were only religious holidays or seasonal holidays, to mark the Feast Day of Set or commemorate the harvest with a day of celebration. In the medieval era, when a saint’s day arrived, the duties of the job were replaced by lengthy religious obligations and, perhaps, fasting and the ritual wearing of a hair shirt.  It wasn’t exactly a laugh riot.

As humanity advanced even more, the concept of a work week was introduced and, then, secular holidays. When some brilliant soul realized that secular holidays really didn’t have to be tied to a specific date on the calendar and instead could float — so that the holiday could combine with a normal weekend to create a three-day weekend — it was a huge step forward in human development. And when an even more enlightened individual realized that we could use those three-day weekends to bookend the summer months, so that the joys of summer could begin with a glorious three-day revel in the warmth, it marked a true pinnacle in the annals of human achievement.

As we celebrate the joys of this three-day Memorial Day weekend, let’s remember those forgotten figures of human history who came up with the ideas that led us here — and be grateful that wearing sweaty hair shirts isn’t part of the equation.

Drinking The Beer The Monks Drank

Important news from Belgium for beer lovers — the monks of Grimbergen Abbey have managed to piece together long-lost information about the ingredients and methods used to brew their different beers going back to the Middle Ages, and have started to brew beer again.  The rediscovery of the recipes is a kind of historical detective story in which language plays a key role.

The story starts with the monks of the abbey, who, like other monks of the Middle Ages, brewed, and enjoyed, beer.  (In fact, some monks fasted during Lent and drank only specially brewed beer that was a kind of liquid bread during that period — which probably made for an interesting Lenten season.)  The Grimbergen Abbey brews were known far and wide, and their ingredients and the methods used by the monks were set down in books first written in the 12th century.  The monks continued to brew their beer, changing their recipes periodically, until 1798, when French Revolutionaries, who were no friends to religion, burned the monastery to the ground.  The 1798 fire was one of three times that the monastery has burned down.

But the monks of Grimbergen Abbey are resolute.  Fortunately, some of the monks rescued the 12th-century books and stored them, but the recipes and methods were thought to be lost because no one could read the writing, which was in a mixture of old Latin and old Dutch.  Four years ago, the monks at the monastery decided to tackle the problem and invited volunteers from the community to help them in trying to decipher the writings.  Together they were able to identify ingredient lists, the types of hops and bottles and barrels that were used, and even the names of the different beers the monks brewed over the centuries.

Now the monks, in partnership with Carlsberg, which offers a number of the Abbey’s previously known beers for sale, have built a new microbrewery on the site of the original brewery and have started to brew a beer based on some of the old recipes and methods.  It’s a heady brew — 10.8% alcohol by volume — and will be sold by the glass in Belgium and France.

A toast to the indomitable beer-loving monks of Grimbergen Abbey, and the volunteers who helped them to recover a bit of liquid history!

The Golden Age Of Pizza

I’m in Boise for work. Last night I felt like pizza for dinner, so I asked the desk clerk for a recommendation. She raved about the pizza at The Wylder, and particularly the Honey Badger. The Wylder had the advantage of being only a block away from my hotel on a rainy night, so my choice was an easy one.

The Wylder offers lots of interesting pizza options, but I felt I needed to go with the clerk’s enthusiastic endorsement, and I’m very glad I did. The Honey Badger is one of the best pizzas I’ve ever tasted. It’s a white pizza that starts with a crunchy sourdough crust and is topped with fennel Italian sausage, caramelized onion, ricotta cheese, and some kind of chili-infused honey and garlic oil. Taking a bite results in a flavor explosion in your mouth, and the combination of tastes and textures is incredible.

As I sat, happily munching away on slice after slice, I reflected on the development of pizza cuisine in my lifetime. My first pizza was a red sauce mozzarella cheese pizza, where the sauce probably came out of a can and the crust and cheese had zero flavor. Over the intervening years pizza has moved from a convenience food to a chance for chefs to strut their culinary skills with great concoctions like the Honey Badger.

I think we’re living in the Golden Age of Pizza.

In Fear Of Facial Recognition

One of the features that was added to the technology mix during the period between the purchase of my old phone and the purchase of my new iPhone is facial recognition software.  During the set-up process at the Verizon store, I held the iPhone as if I were looking at messages, moved my head from side to side and up and down until the phone had acquired about a 270-degree look at my head and indicated that it had seen enough, and the facial recognition feature was activated.

Now, whenever I pick up the phone, the software kicks in automatically and substitutes for the entry of passcodes.  It’s pretty amazing technology, really, and it’s a lot faster and less clumsy than the passcode-entry process.  I really like the convenience element.
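For the curious, here’s a generic, hypothetical sketch of how face-unlock schemes of this sort commonly work:  enrollment distills several captures into a stored numeric template, and each later attempt is compared against it.  It’s an illustration of the general idea only, not Apple’s actual Face ID implementation, and the embedding function below is a toy stand-in for a real face-recognition model.

```python
# Toy sketch of a generic face-unlock flow: enroll a template, then compare
# each new capture against it.  Not Apple's implementation; the "embedding"
# here is a placeholder for a real face-recognition model.

import numpy as np

def face_embedding(image: np.ndarray) -> np.ndarray:
    """Toy stand-in: reduce an image to a fixed-length, unit-norm feature vector."""
    vec = image.mean(axis=2).flatten()[:128]
    return vec / (np.linalg.norm(vec) + 1e-9)

def enroll(captures: list) -> np.ndarray:
    """Average the embeddings from several angles (the side-to-side head turns)."""
    return np.mean([face_embedding(img) for img in captures], axis=0)

def try_unlock(template: np.ndarray, attempt: np.ndarray, threshold: float = 0.9) -> bool:
    """Unlock only if the fresh capture is close enough to the stored template.
    With a real embedding model, only the enrolled face would clear the threshold."""
    similarity = float(np.dot(template, face_embedding(attempt)))
    return similarity >= threshold

if __name__ == "__main__":
    enrollment_captures = [np.random.rand(240, 240, 3) for _ in range(5)]   # placeholder camera frames
    template = enroll(enrollment_captures)
    new_capture = np.random.rand(240, 240, 3)                               # a later unlock attempt
    print("unlocked:", try_unlock(template, new_capture))
```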

But . . . as a result of this, Apple has got my face memorized and digitized and stored somewhere.  And, the modern tech sector world of information-selling and data-trading being what it is, who knows who else now has the capability to instantaneously identify my less-than-noble features.  My cell phone service provider?  Every Apple subsidiary and affiliate and technology partner?  The FBI, the CIA, the Department of Homeland Security, or some Russian or Chinese hackers?

Recently San Francisco passed a ban on the use of facial recognition software by police and other agencies, and other cities are considering similar legislation.  The proponents of such measures tout them as a victory for privacy and a safeguard against governmental overreach that could conceivably allow governmental agencies to track citizens as they go about their daily lives.  Opponents note that facial recognition software can help the authorities solve crimes — as the article notes, the technology was used to identify a mass shooting suspect last year — and that it can help to secure our borders and airports.

I’ve long since concluded that while privacy is nice, in the modern world you have to make countless choices that can affect your privacy in different ways.  Do you pay with a credit card that tracks your purchases, or cash?  Do you use a cell phone that keeps track of your location?  Do you participate in social media and share some of your life through Facebook, Twitter, and the countless other outlets?  Have you traveled outside of the U.S. recently and returned to the country using one of those passport and facial scanning re-entry terminals?  It’s hard to argue, too, that a face that you show to the world each day, that appears on your driver’s license, and that is captured regularly by the various surveillance cameras positioned throughout American society, is something that is extraordinarily private.

All things considered, I’m not too troubled by the use of facial recognition software.  It’s the protection of other highly personal information — such as health information and financial information — that is of much more concern to me.

Good Neighbor

This sign appeared recently on the telephone pole at the corner of Livingston Avenue and Third Street, on my walking route to work.  At first I didn’t notice it, but when I read it I thought about what a nice, neighborly thing it was for a dental office to give up one day of paid work in order to offer a free filling, a tooth extraction, or a cleaning to someone who just couldn’t afford dental care otherwise.  And the people offering this free benefit were serious about letting people know about their effort to give back to the community — Kimberly Parkway, where the dental office is located, is miles to the east of the German Village location of this particular sign.  I imagine that similar signs could be found at many locations in our city.

In the hurly-burly of our lives in modern America, we sometimes tend to forget, or take for granted, the nice things that people do for each other.  We really shouldn’t.  There are still a lot of nice people in the world who are willing to help others and donate some of their time in doing so.

Grading The “Experts”

In our modern world, we’re bombarded with the opinions of “experts.”  Virtually every news story about a development or an incident features a quote from an “expert” who interprets the matter for us and, typically, makes a prediction about what will happen.  “Experts” freely offer their forecasts on specific things — like the contents and results of the Mueller Report, for example — and on big-picture things, like the direction of the economy or geopolitical trends.

There are so many “experts” giving so many predictions about so many things that it’s reasonable to wonder whether anyone checks how good the “experts” ultimately turn out to be at making their predictions.

The Atlantic has a fascinating article about this topic that concludes that so-called “experts” are, in fact, dismally bad at predicting the future.  That’s not a surprising conclusion for those of us who’ve been alive, paying attention, and recalling some of the confident forecasts of days gone by.  Whether it’s the “population bomb” forecasts noted in The Atlantic article, or the predictions in the ’80s that Japan would soon own the world, or the prognostications about how elections will end up or whether one party or another has that elusive “permanent majority,” recent history is littered with failed expert predictions.

Why are would-be “experts” so bad at their predictions?  The article notes that academics and others who focus on one field tend to be especially wrong in their foretelling because they typically ignore other forces at work.  They also are often so invested in their specialty, and their belief in their own evaluations, that they react to failure by doubling down on their predictions — like doomsday cult leaders who tweak their calculations after a deadline has passed to come up with a new day the world will end.  People who are less invested in the belief in their own infallibility, and who are less focused on one discipline or area of study, tend to be much better at making predictions about the future than the “experts.”

Does the consistent thread of “expert” predictive failure mean that we shouldn’t try to see ahead at what the future may bring?  Of course not.  But it does mean that we should take the dire forecasts of “experts” with a healthy dose of skepticism.  Keep that in mind the next time a talking head says we need to make some dramatic change in order to avoid certain doom.