There is no doubt that — for some people, at least, including me — there has been an inverse relationship between age and receptivity to new technology.
Once, as a callow youth, I was dazzled by new technology. Of course, the Mercury and Gemini and Apollo space programs made everyone interested in new advancements in computers and technology as a point of national pride, and that carried over into home life and school. When our family signed up for a crude early version of premium cable TV called Qube, I wanted to know how it worked and what it offered. And I was excited when Dad brought home one of the first Atari game systems, so UJ and I could play Pong in our family room. I even took a computer course in high school and learned some of the basics of FORTRAN programming using punch cards, and thought it was fantastic that the computer did what my carefully arranged stack of punch cards commanded.
This happily open-minded approach to new technology continued through college and into law school. In college, I learned how to use video display terminals (“VDTs”), one of the first forays into stand-alone word processing units, and spent hours in front of a VDT, with its greenish glow. I believe that is where I first learned the word “cursor.” And my friends and I happily enjoyed every cool new video game that appeared in our favorite bars, whether it was Tron or Pac-Man or Ms. Pac-Man or Asteroids or Galaga. And in law school I learned to use computerized legal search engines, and even took on a job where I had to “back up” the hard disks on a computer system that crashed regularly.
But at some point, my receptivity to new technology changed, and I can’t quite put my finger on when it happened, or exactly why. Some of it might have been repeated experience with technology that overpromised and underdelivered, or that focused on bells and whistles — “look, you can program your own individualized message to appear as your screen saver!” — without meaningfully improving the basics, like word processing capabilities, that were the meat-and-potatoes uses of the system. Some of it no doubt was brutal experience, where one false keystroke, or one ill-timed system crash, caused hours of work to maddeningly vanish and have to be recreated. And with each glitch and crash, skepticism began to replace receptivity, and fear of disaster began to replace eager interest.
The pace of technological change didn’t help things, either. With new computers, search engines, phone systems, cell phone systems, remote access fobs, security systems, constant annoying password changes, and other developments being introduced all the time, it seemed like things were never really settled — at least, not for long — and there was always some overwhelming new training to take. And that reality caused another reaction to enter into the mix: “why can’t things just stay the same for a while?”
I had that kind of jaded reaction to new technology — but then the coronavirus pandemic hit the world, and everything changed again. For many of us, technology saved our butts and allowed us to keep working remotely in a way that really wouldn’t have been possible even five years earlier — much less 20 or 30. I’ve learned how to use a number of new programs and applications, and have been grateful for the opportunity. And whenever I talk to one of our IT people these days, I thank them and acknowledge just how crucial this wonderful technology has been.
I wouldn’t say I’ve quite returned to the wide-eyed fascination I had as a kid, but this current experience has definitely moved the needle back by a lot of years. The next time you hear me fulminating crotchety old man views about technological advancements, just remind me of COVID-19 and 2020, and I’ll shut up.