Saturday, September 10, 2011

Another September 11

I thought I was over September 11, but this article in Friday’s Washington Post brought it all back.
In it, an Air Force pilot recalls how she and her flight commander scrambled to intercept United Airlines Flight 93, which had been hijacked by al-Qaeda fanatics intent on crashing it into the Capitol Building. Neither of the Air Force jets was armed. The fliers intended to knock down the airliner by turning their own planes into guided missiles aimed at Flight 93: A suicide mission. To save our country.
The passengers on Flight 93 made the sacrifice instead, bringing the airliner down in a Pennsylvania field rather than let the hijackers have their way. And as I read about it, I found myself all teary again, the way I was on September 11 when I heard the story of how the Republican activist Barbara Olson, wife of the U.S. Solicitor General, had been among those killed when American Airlines Flight 77 crashed into the Pentagon. Her husband reported that she’d called him while the plane was in the air, and asked what she should do, and then the phone went dead.
Barbara Olson had been one of my least favorite people. She was a smart, pugnacious and (I felt) mean-spirited advocate for conservative politics—one of those people you loved to hate, if you were on the other side politically. She’d been all over the television during the 2000 election cycle, and was a frequent guest on Bill Maher’s TV show, “Politically Incorrect,” snarking about Al Gore and liberalism. But all of a sudden she was dead, along with the nearly three thousand others who departed this world in the wreckage of the Twin Towers, at the Pentagon, and in the Pennsylvania field.
At that moment, what had come before didn’t matter, and I could see my bitterness and irritation at her for what it was—something essentially false, and trivial, and unworthy. We were all Americans, and we were all standing together against a great evil. At a service in a small Episcopal church the next night, I found myself crying and praying for Barbara Olson, repenting of my sins, ready to do my part—whatever that was—in the fight that was to come.
For the next few months, that amazing feeling of solidarity of purpose with other shocked Americans remained strongly with me:
—As I played volleyball with friends a day later, not far from Dulles Airport, under a deep blue sky surreally empty of air traffic, trying to act normally.
—As I waited stuck in traffic on the Washington Beltway, where the drivers were suddenly calm and courteous, and no one cut anyone else off.
—When I watched the congressmen from both political parties singing together on the Capitol steps.
—When I watched President Bush take a megaphone and climb on the smoking ruins of Lower Manhattan to defy al-Qaeda.
That feeling was still with me months later when I visited New York for a convention, and made the pilgrimage down to Ground Zero, where the wreckage had mostly been cleared away but the site still smelled of smoke and ruin. And I felt it too the following spring, on a hiking trip in England, when the sympathy and kindness of the English people I met helped me see that we were not alone, that much of the world shared our shock and sympathized with our grief.
That was ten years ago.
Today, in some ways, it is as if it never happened. It is as if our nation, which woke up for a few months there in 2001 and 2002 with a shared sense of purpose and connection, has slipped back into the same old dream we were dreaming before the planes hit. The political battles are more vicious than ever, hypocrisy and cynicism exacerbated by a severe economic downturn in which those who have been backed up to the edge of a cliff kick at those who are already falling off of it.
It turns out that there were few clear-cut enemies to mobilize against. Instead, we mobilized against ourselves. 
The so-called War on Terror mostly set us against one another. Those who argued against new restrictions on civil liberties or against ill-conceived military action in Iraq the following year were branded “unpatriotic” and vilified. Who were the true patriots? Who were the true Americans? Our readiness to share in sacrifice for our country was manipulated and twisted and used for political gain. Those of us who had opened ourselves up to it felt like we’d been played for suckers: victims of friendly fire, like the heroic Pat Tillman in Afghanistan.
As this tenth anniversary of the September 11 attacks nears, the retrospectives on TV and radio refer knowingly to that day as a day when “everything changed.” Several of them have asked viewers and listeners, “How did it change you?”
I don’t feel that the attacks changed me. If anything, they brought out something in me that was already there: my best self, willing to believe, sacrifice, and act selflessly. They brought out the most hopeful, patriotic, naively optimistic parts of me that I’d previously walled off from a world where suckers get taken advantage of, and no good deed goes unpunished. 
But what came after September 11, with its internecine culture wars—red America against blue America, Tea Party against New Deal, ideology against pragmatism—was far worse than al-Qaeda’s attacks. What came after September 11 changed me.
Today I feel beaten down emotionally, and pessimistic about the direction in which the country is moving. I see fear, and greed, and revenge, and mistrust ruling our lives. I see people shutting down and lashing out. Even though I no longer commute on the Washington Beltway, road rage is worse than ever. The political divisions seem even more intractable. A President whose campaign asked us to hope and work together is increasingly trammeled into a neat political box. The class anxieties feel even more pronounced.
Reading about the pilots brought it all back. Everything, including the tears. For a moment—just for a moment—I again recognized the part of me that truly believes in America and the principles that it was founded on. It is still there, if buried deeply by all the rubble of the decade since September 11, 2001.
I read in the Post story that the pilots eventually returned to base on that day. They were not required to make a dramatic sacrifice for their country. Later, they would go on to fly and fight in Iraq, armed this time with all the high-tech weaponry that could be packed onto their planes. It was a fight that was less clear, and a cause that was harder to define, but they were ready to do their duty.
I feel the same way. If only I knew what my duty was.

Saturday, May 28, 2011

Thor in 3-d (lowercase)

I was an avid comic book reader and collector throughout high school, and one of my favorites was Marvel's The Mighty Thor. At one point I had a nearly complete run of the series. I think what I liked best was the epic quality of the stories—which tended to be an amalgam of science fiction, mythology and fantasy, with occasional connections to the "Marvel Universe" in which dwelt other heroes, like Spider-Man and the Hulk. Even the language, a cheesy faux-Shakespeare pastiche of dosts and thous and verilys, was fun. So, how could I resist going to see the new Thor movie and watching the long-tressed one swing his Uru Hammer in the multiplex
. . . in 3-D, no less?

Movies can convey epic scale effectively, as did Lawrence of Arabia, and grand, sweeping vistas of destruction and chaos have become expected parts of CGI battle scenes in movies such as The Lord of the Rings. The cinematic Thor contains its share. But watching it through my 3-D glasses was a strange experience. Rather than immersing me in the spectacle, and expanding the scale of the show, the 3-D effects shrank and contained it. It was like watching Thor in a puppet theatre, or . . . (more to the point) . . . reading it in a comic book. At one point, for instance, during a grand battle scene the camera blasts through a crumbling wall to reveal a vast hall containing an army of warriors. The moment should have been spectacular. Instead, it was as if the camera had taken the top off an anthill and revealed all the tiny creatures scurrying around.

Roger Ebert has written at length about how much he dislikes 3-D movies. His points are hard to argue with: it darkens the movie, the effects are often distracting, and too often they are not used in service of the story. The only other 3-D movie that I've seen recently, James Cameron's Avatar, seemed to use the process more effectively: I did get a feeling of depth and beauty from that film that seemed connected to the 3-D format. Thor is one of those movies that was filmed in 2-D and then converted, using a digital process, to 3-D for the screen. (Perhaps I'll go back and see Thor in 2-D, and see whether it seems more epic in its scope and imagination.)

But, strangely enough, watching Thor in 3-D brought me back powerfully to my days as a teenage comic-book collector. By shrinking the scope of the film and, in effect, putting it back in the comic-book frame in which I first encountered the characters and stories, the 3-D presentation was truer to the original—more so than a full-scale movie epic, where vast expanses and teeming masses of characters fill the wide screen in a way that the eye can't process, overwhelming the senses. If I'd seen it in two dimensions, I'd probably have just found it to be an overly grandiose rendering of a comic book story that doesn't really stand up as a story for grown-ups. Instead, it took me back to a time when I didn't expect that of the books I read.

Seeing Thor in 3-D made it more two-dimensional. I feel sure that wasn't what the filmmaker intended, but I kind of liked it.

Saturday, May 21, 2011

Jellybean . . . boom! (I feel fine.)

So the world didn't end today, with either a bang or a whimper. We can all feel smug and laugh at the poor deluded clowns who were talked into getting rid of all their worldly possessions in anticipation of the Rapture. But are we any less risible in thinking that it began with a bang?
The Big Bang theory, by which most scientists today would explain the origins of the universe, was first proposed in 1927 by Georges Lemaître, a Catholic priest who taught physics.
That seems more ironic today, when religion and science have become, in the words of Stephen Jay Gould, “non-overlapping magisteria,” than it would have seemed a century ago. But it really shouldn’t come as such a surprise.
According to current scientific consensus, some 13 billion years ago all of existence consisted of what physicists call a singularity—a state of infinite density, pressure, and temperature in which laws of time and space as we know them did not operate. From that singularity, the matter that we call the universe expanded with incredible force and rapidity, sending what would become the stars, the galaxies, and other stuff that we can’t see or detect whirling outward, faster and faster. It’s a hard concept to wrap your brain around, because answers to the common-sense questions that it occasions seem more like philosophy than physics. (What existed before the singularity? If nothing existed, where did the something come from? If something existed, where did that come from? If the universe is now expanding faster and faster, what lies beyond the zone of expansion? How can matter emitted from a “bang” actually speed up?) The answers call into question basic human concepts of reality such as existence, space, before-and-after, and so forth.
Most contemporary cosmologists, like Stephen Hawking, scoff at the idea of a divine cause for all this, but one can see why a Roman Catholic priest might find it compelling. When you get past the mathematical proofs, the theory appears to leave room for an act of Creation that defies the natural rules we live by. Whether it does or doesn’t is as much a matter of faith for scientific cosmologists as it is for religious believers.
I like to imagine that those who pooh-pooh the idea of any sort of supernatural agency would prefer to rewrite Genesis as a sort of self-executing computer algorithm:
In the beginning when the Program rendered the heaven and the earth, there was a singularity (a One) where before there had been nothing (a Zero). Now since the One was undifferentiated, and the Zero was void, an “on” and “off,” therefore darkness was upon the face of the deep and the face of the waters, which were variables derived from the Zero and the One but had yet to be defined. And calculating that the creation required further definition, the program caused ones and zeros to propagate, and there was light. This was version 1.0, the first day.
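If you wanted to make the conceit literal, a toy version of that self-executing program might look something like the following Python sketch. Every name and propagation rule in it is my own invention, offered purely as illustration, not as anything a physicist (or a theologian) would endorse.

# A whimsical, purely illustrative sketch of the "self-executing Genesis"
# imagined above. Nothing here is physics; the propagation rule is invented.

def let_there_be(days: int = 7) -> list[str]:
    """Starting from a One where there was a Zero, propagate bits day by day."""
    state = "1"      # the singularity: a One, as yet undifferentiated
    versions = []
    for day in range(1, days + 1):
        # Each "day," every bit begets its opposite beside itself,
        # dividing the light from the darkness and multiplying both.
        state = "".join(bit + ("0" if bit == "1" else "1") for bit in state)
        versions.append(f"version {day}.0: {state[:16]} ({len(state)} bits)")
    return versions

if __name__ == "__main__":
    for line in let_there_be():
        print(line)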
Such a digital account of Creation leaves open the question of where the One comes from, as well as the origins of the Program itself and the reason why it self-executed. Its main difference from the mythological account in Genesis is that it lacks an analog God created in Man’s image. We have a lot more data than did the author of Genesis. I'm not sure we have more answers.

Monday, May 16, 2011

Why I'm not worried about Watson . . . yet

When IBM's Watson beat the human champions at Jeopardy!, what made its victory a little unexpected was that instead of us playing a game within the world created by the computer, the computer competed in a game that we ordinarily play against each other, in our world. In a sense, it beat us at our own game.
But was that really so frightening? Unlike us, Watson did not know it was playing a game, in the sense that we view games as differing from “real life.” No one is yet claiming that Watson is sentient: it remains a logic engine with access to a vast database of factual information, and instructions about how to respond to real-world ambiguities; its answers depend on probabilities stemming from how well or poorly it understands the questions asked.
The real challenge for Watson was not so much in knowing the answer as it was in understanding the question: once its programmers had taught it how to make sense of natural language, and the computer could translate that input into the sort of logical query required to analyze the vast amounts of data available to it, it became a matter of processing power to make it practical to compete in real time against human opponents. And its power was such that within the limited universe defined by the rules of Jeopardy! it became very hard for a human being to beat it.
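To make the idea that the machine's answers rest on probabilities a bit more concrete, here is a toy sketch in Python. The candidates, the scores, and the threshold are all invented for the example; this is a cartoon of confidence-ranked answering, not IBM's actual system.

# A toy sketch of confidence-ranked answering, loosely in the spirit of the
# description above. All names, scores, and the threshold are invented;
# this is not IBM's DeepQA pipeline.

from dataclasses import dataclass

@dataclass
class Candidate:
    answer: str
    confidence: float   # 0.0 to 1.0, as produced by some scoring model

def best_answer(candidates: list[Candidate], buzz_threshold: float = 0.5):
    """Pick the top-ranked answer, but only 'buzz in' if confident enough."""
    if not candidates:
        return None
    top = max(candidates, key=lambda c: c.confidence)
    return top if top.confidence >= buzz_threshold else None

# Hypothetical candidates for a clue about a hammer-wielding Norse god:
clue_candidates = [
    Candidate("Who is Thor?", 0.91),
    Candidate("Who is Odin?", 0.34),
    Candidate("What is Mjolnir?", 0.22),
]
print(best_answer(clue_candidates))   # the Thor answer, at 0.91 confidence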
We’ve yet to create, or encounter, self-conscious artificial intelligence of the sort Hollywood likes to portray. Watson plays Jeopardy! well because the game exists within a carefully limited set of rules that constrain what is possible, not because it finds the challenge consciously stimulating, or fun.
We play our games for different reasons than computers do. Yes, we like rules: Basketball courts are ninety-four feet long, with ten-foot goals, and five players on each side who score by putting the ball in the hoop; there are no forward passes in rugby; chessboards have sixty-four squares; two cards are dealt face-down in Texas Hold ’em. For human beings, the challenge is playing within the rules. That’s not a challenge for Watson, which ultimately depends on rules. Within them, it uses raw computing power to explore all possible solutions before settling on the most probable or efficient: its real challenge would be playing where there are no rules, or making them up as you go. Humans do that all the time.
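As an aside, that brute-force style of play, exhausting every line that a fixed set of rules allows, is easy to sketch for a small game. Here is a minimal Python example using tic-tac-toe; the game and the scoring are my own illustrative choices, not anything drawn from Watson itself.

# A minimal sketch of exploring every possible line of play within fixed
# rules, using tic-tac-toe. Purely illustrative; not Watson's method.

from functools import lru_cache

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def winner(board: str):
    for a, b, c in WIN_LINES:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def best_outcome(board: str, player: str) -> int:
    """+1 if X can force a win, -1 if O can, 0 if best play is a draw."""
    w = winner(board)
    if w is not None:
        return 1 if w == "X" else -1
    if "." not in board:
        return 0
    other = "O" if player == "X" else "X"
    scores = [best_outcome(board[:i] + player + board[i + 1:], other)
              for i, cell in enumerate(board) if cell == "."]
    return max(scores) if player == "X" else min(scores)

# Exhaustive search confirms that, with perfect play on both sides,
# tic-tac-toe is a draw:
print(best_outcome("." * 9, "X"))   # prints 0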

Saturday, May 14, 2011

Watson and "Pong"

When IBM’s “Watson” supercomputer defeated two human champions on TV’s Jeopardy! in 2011, news reports highlighted cries of discomfort from people like me who grew up watching science fiction movies like 2001: A Space Odyssey, The Terminator, and The Matrix. In all of those popular films, heroic and flawed human beings found themselves pitted against the implacable digital intelligence of machines bent on destroying them.
Watson’s designers playfully tweaked 2001 by having the computer appear on Jeopardy! in the form of a monolithic monitor that recalled the mysterious otherworldly slabs of the movie. Turnabout is fair play: the name of 2001’s HAL is just one letter removed from the initials “IBM,” each letter shifted back one place in the alphabet (H⇐I, A⇐B, and L⇐M), a resemblance long read as a not-so-subtle dig at the computer company’s vision of a clean, technological future, though Arthur C. Clarke always insisted it was a coincidence. Watson’s dispatch of the human champions on television was almost as bloodlessly efficient as HAL’s attempt to kill off his carbon-based spacemates on the way to Jupiter.
Most of us, though, took the news of the computer’s triumph pretty calmly. After all, we’d been getting our butts kicked by computers in games regularly, for years. And wasn’t this merely another game?
Computer games arrived just as my generation was getting ready to enter college. The first hit was Pong, which allowed two players to compete against each other in a tennis-like match that employed a glowing cursor on the home TV rather than a bouncing ball. If you were playing solo, you could compete against the computer. My first real introduction to video-game culture came during a 1978 summer trip to Japan, where the kids from our host family took me to a Kyoto arcade where everyone was crouched over tables playing Space Invaders half a year before the craze caught on big in the States. Most of the Japanese kids were expert warriors; the Invaders slaughtered me. Next thing I knew, when I got back home, the pinball machines at my college were being replaced by Pac-Man consoles, and electronic gaming was here to stay.
I never really got into it in the way that a lot of my friends did — and certainly not in the way that our younger brothers and sisters and cousins did, many of whom grew up with Atari and Nintendo consoles in their homes. They enjoyed the flow and rush of the games, and the ways in which they could immerse themselves in the unfolding stories that the games told. I always had trouble getting past the notion that no matter how much I practiced, and how skilled I became, I was still playing a game by the machine’s rules, and that it would ultimately overwhelm me with its logic and relentlessness.
Dedicated gamers, by contrast, loved pitting themselves against the machines: The Matrix is, in essence, a gamer’s paranoid fantasy in which reality is revealed as artificiality — a video-game illusion with real-world stakes — and the human hero is a free-spirited savant who refuses to play the game by the machine’s rules yet is willing to enter its virtual world in order to defeat it.
Like the Matrix, computer games ask us to accept the rules by which their imaginary world operates and give ourselves over to the story that the game spins for us — a story in which we seem to have free will and the ability to act independently, but are in fact playing out a role that has been scripted for us by the game’s designers. The essential difference, of course, is that those designers are humans rather than machines and the game is just a game, not a ruse by which we become a natural resource to be exploited for energy. If gamers sometimes get carried away by their game-worlds, even the most fanatical knows, at a certain level, that it’s just play.