When IBM’s “Watson” supercomputer defeated two human champions on TV’s Jeopardy! in 2011, news reports highlighted cries of discomfort from people like me who grew up watching science fiction movies like 2001: A Space Odyssey, The Terminator, and The Matrix. In all of those popular films, heroic and flawed human beings found themselves pitted against the implacable digital intelligence of machines bent on destroying them.
Watson’s designers playfully tweaked 2001 by having the computer appear on Jeopardy! in the form of a monolithic monitor that recalled the mysterious otherworldly slabs of the movie. Turnabout is fair play: according to a durable piece of movie lore (one that Arthur C. Clarke always denied), the name of 2001’s HAL was derived by starting with the initials “IBM” and shifting each one place back in the alphabet (H⇐I, A⇐B, and L⇐M), a not-so-subtle dig at the computer company’s vision of a clean, technological future. Watson’s dispatch of the human champions on television was almost as bloodlessly efficient as HAL’s attempt to kill off his carbon-based crewmates on the way to Jupiter.
Most of us, though, took the news of the computer’s triumph pretty calmly. After all, we’d been getting our butts kicked by computers in games regularly, for years. And wasn’t this merely another game?
Computer games arrived just as my generation was getting ready to enter college. The first hit was Pong, which allowed two players to compete against each other in a table-tennis match that volleyed a glowing blip across the home TV screen rather than a bouncing ball. If you were playing solo, you could compete against the computer. My first real introduction to video-game culture came during a 1978 summer trip to Japan, where the kids from our host family took me to a Kyoto arcade where everyone was crouched over tables playing Space Invaders half a year before the craze caught on big in the States. Most of the Japanese kids were expert warriors; the Invaders slaughtered me. Next thing I knew, when I got back home, the pinball machines at my college were being replaced by Pac-Man cabinets, and electronic gaming was here to stay.
I never really got into it in the way that a lot of my friends did — and certainly not in the way that our younger brothers and sisters and cousins did, many of whom grew up with Atari and Nintendo consoles in their homes. They enjoyed the flow and rush of the games, and the ways in which they could immerse themselves in the unfolding stories that the games told. I always had trouble getting past the notion that no matter how much I practiced or how skilled I became, I was still playing a game by the machine’s rules, and that it would ultimately overwhelm me with its logic and relentlessness.
Dedicated gamers, by contrast, loved pitting themselves against the machines: The Matrix is, in essence, a gamer’s paranoid fantasy in which reality is revealed as artificiality — a video-game illusion with real-world stakes — and the human hero is a free-spirited savant who both refuses to play the game by the machine’s rules and at the same time is willing to enter its virtual world in order to defeat it.
Like the Matrix, computer games ask us to accept the rules by which their imaginary world operates and give ourselves over to the story that the game spins for us — a story in which we seem to have free will and the ability to act independently, but are in fact playing out a role that has been scripted for us by the game’s designers. The essential difference, of course, is that those designers are humans rather than machines, and the game is just a game, not a ruse by which we become a natural resource to be exploited for energy. If gamers sometimes get carried away by their game-worlds, even the most fanatical gamer knows, at a certain level, that it’s just play.