Art and War at the Digital Frontier 
Bruce Jackson
 

(A talk given at “The Digital Summit” Sponsors’ Dinner, 1 November 2001)
 

In calendar terms, centuries begin at midnight on the first day of years ending in zero-one: January 1, 1901 was the first day of the 20th century and January 1, 2001 was the first day of the 21st.

But human sensibility doesn’t really work like that. Calendar facts are far less important to us than historical events, which is why dates figure far less in our sense of the world than do those singular moments at which the world seems to pivot, moments after which nothing is ever quite what it was before. Sometimes we know those moments immediately; sometimes it’s years before we know them for what they were.

Most historians I know would say that the 20th century began on June 28, 1914, in Sarajevo, with the assassination by seven tubercular Serbian terrorists of Archduke Franz Ferdinand, heir to the Hapsburg throne. That started a war at the end of which the United States was something it had never been before, which it had not wanted to be, and which it has remained ever since: a major world power. I think the 21st century began in Manhattan at 8:45 a.m. on Tuesday, September 11, 2001, when the U.S. was attacked for being the world power it had become a century before.

We entered the 20th century trying to deal with three ideas purporting to define or describe or explain three spheres of action, development and conflict: Darwin on the natural world, Freud on the internal world, Marx on the economic world.

We’ve entered the 21st century not so much on a wave of new ideas, but rather trying to cope with some very difficult old ones—most notably religion and nationalism—and doing it in a time when our ability to transmit, store and utilize information is changing with a velocity so great that the only thing we can say for sure is this: the way we transmit, store and utilize information tomorrow will not be the way we transmit, store and utilize information today.

In a famous 1893 essay, the historian Frederick Jackson Turner argued that the American character changed forever with the closing of the frontier in 1890. Whatever we were before that, he said, we would have to be something else from then on. Turner’s frontier was one that got smaller and smaller until it disappeared entirely. The digital frontier is exactly the opposite: every step we take into the digital frontier opens up new territories we hadn’t dreamt were there.

When I was 13 years old and a student at Stuyvesant High School in New York, I got to hanging out at the physics lab at NYU where, in exchange for letting me play with oscilloscopes and signal generators, they set me to replacing burnt-out electronic tubes—what the Brits call “valves”—in double-triode flip-flop circuits, each of them about half the size of a cigar box. Those assemblies of two triodes and a few resistors did two things: they either let current flow or they blocked the flow of current, and they made sure whatever current did flow flowed in one direction only. That pair of tasks is the core operation of every electronic computer in the world. The monstrous machine I puttered at the edges of more than fifty years ago used about sixty thousand of those triodes, and they generated such huge quantities of heat that enormous ventilation and cooling devices were needed to keep them from burning the building down.
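What those circuits did can be put in a few lines of modern code. Here is a minimal sketch in Python, an illustration of the logic only, nothing that ever ran in that NYU lab: a latch that holds a single bit, conducting or blocked, until something switches it.

```python
class FlipFlop:
    """Logical model of a double-triode flip-flop: it sits in one of
    two stable states, conducting (1) or blocked (0), until switched."""

    def __init__(self):
        self.state = 0       # starts blocked: holding a 0

    def set(self):
        self.state = 1       # current flows: now holding a 1

    def reset(self):
        self.state = 0       # current blocked: back to a 0

bit = FlipFlop()
bit.set()
print(bit.state)             # 1 -- remembered until reset
bit.reset()
print(bit.state)             # 0
```

At two triodes per bit, that machine’s sixty thousand tubes could hold at most a few thousand bytes of state.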

I never saw the whole thing actually at work, though they assured me that on occasion it did work. I don’t know how much or what kind of work it did. I suspect it didn’t do much of either. Programming and inputting data took forever. Constantly and randomly in that vast array, triodes died, and had to be found and replaced.

Everything that went on in those huge hot rooms takes place a thousand times faster in the ten-dollar handheld solar-powered HP calculators used by every high school student in the United States. They don’t even need batteries. And in my Palm Pilot, which also fetches mail, serves as a memo pad, and plays chess better than I do.

Some of my technical friends say spectacular progress results from interesting inventions, and some say that interesting inventions come along and then people think of spectacular things to do with them. The famous Bell Labs team of John Bardeen, Walter Brattain and William Shockley invented the transistor in 1947 to replace exactly the kind of triode control circuit I was just telling you about; the name contracts “transfer resistor,” hence “transistor.” I doubt that they could in their wildest dreams have imagined the kind and magnitude of memory and computational capability we’ll be hearing about in The Digital Summit tomorrow and Saturday, or that every one of us in this room uses in the course of our professional and personal lives every single day. Once the transistor existed, ever more complex uses were invented for it, and those uses in turn invoked ever more sophisticated and efficient ways and devices for managing bits and bytes and pixels.

Digital imaging now permits detailed explorations of parts of the body that only a few years ago required painful and often dangerous physical incursions. It permits exploration of the surface of Mars that otherwise couldn’t have been done at all. Digital information management and imaging have altered the way bankers, generals, surgeons, criminals, educators, photographers, politicians, and publishers work.

Visit the web site of the Electronic Poetry Center and you’ll find words and sounds and images and links. What goes on there couldn’t happen in a book or even on a CD-ROM. My colleague Robert Creeley, one of the great literary correspondents (his letters to and from the poet Charles Olson are available in nine published volumes from Stanford University Press), now does his correspondence and poetry writing on the computer. Creeley is famous for his collaborations with visual and plastic artists; he says his current digital explorations are letting him engage in collaborations that were heretofore unthinkable.

Perhaps you saw the beautiful photograph in Tuesday’s New York Times of the new Hearst Corporation headquarters on Eighth Avenue in New York. It’s a gorgeous 42-story crystal and steel building designed by Norman Foster that uses for its base a landmark six-story building completed in 1928. But don’t drive by the site expecting to see it. The building doesn’t exist. It was created electronically, using CAD programs and Photoshop. If you were in Foster’s office, you could see a virtual display of that building from any angle at any time of day, in any kind of light.

The Buffalo and Fort Erie Public Bridge Authority recently began using a virtual traffic model in their planning operations. What happens if there’s a breakdown in one of the lanes at 3 a.m. or 3 p.m.? What happens if you rotate this planned access ramp 15 degrees? Before, the answers came in numbers and words. Now the answers come in numbers, words, and moving images. They say it works.

Movie effects that previously required the physical marriage of strips of film, and film editing that used to be done with a sharp blade and tape, are now done with a system called Avid on a Mac. The big studios and the kids in our film classes use exactly the same technology. You don’t like the way two shots abut? Change them: click click. You want to see how another juxtaposition works, or how that shot would look if it were a bit more red or two seconds longer or shorter? Click click. You want to go back to where you were before you started meddling? Click click.

That fine technology doesn’t mean movies are any better, by the way; it just means they’re made differently. The devices in themselves are nothing. They’re like salt or yeast or rubber or iron or pencils. They are as useful and as meaningful, or as useless and devoid of meaning, as human imagination and intelligence can make them.

You don’t have to look far for lousy digital work. Perhaps you’ve noticed CNN’s recent MTV makeover in pursuit of a yuppie audience. Sometimes there are so many horizontal scrolling and bragging bars and sidebar boxes that the anchorperson and the remote shot are pushed up into two-thirds of the upper third of the screen.

Sometimes they are so desperate to run images that they run images with no content whatsoever. The day the Afghanistan bombing started, the anchorperson, formerly an actress on “NYPD Blue,” told us we were seeing “live pictures from a nightscope camera near the front lines.” She was on the left side of the screen. The right side was a dark green background with flecks of light. It was, as anyone with normal vision and a modicum of intelligence had to know, just green noise, no information whatsoever. It’s what you get if you point a nightscope at the night sky. They showed that screen off and on until morning light came to Afghanistan, with not an iota of difference in what we saw.

And how many times have you sat through a PowerPoint presentation that, had the speaker not had PowerPoint, would have taken half as long, been a quarter as boring, and been ten times as useful, because there would have been time for discussion? Nearly all the PowerPoint presentations I go to are pretty, but the great majority of them are computer graphics for morons, because what’s on the screen is exactly what the speaker is saying. And on the way out, someone usually hands me a stapled bundle with printouts of all the slides. When they do that I wonder, “Why did I have to come here? Why didn’t they just send it as an email attachment and save us all that time and xerox paper?”

There are dangers at the digital frontier far more troublesome than bad show-and-tell.

All that information that is so easy to get and manipulate can, in the wrong hands, contribute to profound mischief. Thieves who once had to go somewhere to steal big now, if they have the right access codes, need only go to a telephone jack. It’s comforting to know that the government can use electronic records to track villains, but I’m not happy—and neither, I suspect, is anyone who remembers the witch hunts of the McCarthy days—that anyone with the proper knowhow can find out what time my car entered and exited any toll road or bridge in the northeast, what calls I made to whom from just about anywhere, what URLs I visit most, what my book-buying preferences are. I’m not happy about guys I don’t know who wear thick eyeglass frames with clear glass in them transmitting videos and sound recordings of conversations I hadn’t meant to share with strangers.

There are often prices paid when we replace older mechanical technologies with newer digital technologies. At present, no digital projection medium comes close to film for projected image quality, and no computer memory medium or system has the longevity of silver film or of graphite pencil or particulate ink on acid-free paper. You can go to George Eastman House in Rochester and see movies made a century ago. I don’t mean videotape or DVD copies: I mean the original films. I can go to my darkroom and make excellent prints from glass plate negatives made before the Civil War. You can go to the Library of Congress and see the Declaration of Independence. Not a copy. The one those guys wrote out. And you can see the letters they wrote one another while they were figuring out how to do it. The ceiling of the Sistine Chapel, the walls of the Prado, the Louvre, the Met are full of extraordinary singular objects that have survived hundreds and even thousands of years.

Does anyone in this room have a computer that can read a 5¼-inch disk written in CP/M, the prime competitor to MS-DOS fifteen years ago? Or even a 5¼-inch disk written in MS-DOS, the predecessor to Windows? How much of what is presented at the Digital Summit conference this weekend will be readable by any functioning machine fifteen years from now? How long will the information on CD-ROMs and hard drives last?

I have on a shelf in my study a Seagate five-megabyte drive, the awesome capacity of which I delighted in when it came out in the 1980s. It’s more of a historical curiosity than a mule-pulled plow, because that old plow still works if you hitch it to a mule, but I have no computer to which I could hitch that beast. It weighs five pounds, about the same as my Dell laptop, which has 41 gigabytes of storage memory, the equivalent of more than eight thousand of those Seagate five-megabyte drives. Those 41 gigabytes of memory are stored and managed in a drive smaller than half a deck of cards. In 1956, a megabyte of memory cost $10,000. Twenty-five years later, in 1981, that megabyte cost $340. A decade later, the same megabyte of memory cost $7, and a decade after that, in July of this year, it cost four-tenths of a cent.
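The arithmetic behind those figures is easy to check. Here is a quick sketch in Python using only the numbers above; the drive count assumes decimal megabytes, and the code is merely an illustration:

```python
# Price of one megabyte of memory at the dates cited above.
prices = {1956: 10_000.0, 1981: 340.0, 1991: 7.0, 2001: 0.004}

years = sorted(prices)
for a, b in zip(years, years[1:]):
    # Compound annual rate of decline between successive data points.
    rate = 1 - (prices[b] / prices[a]) ** (1 / (b - a))
    print(f"{a}-{b}: about {rate:.0%} cheaper each year")

# The 41-gigabyte laptop drive measured in five-megabyte Seagate units.
print(41_000 // 5, "of the old drives")   # 8200
```

Run it and the decline turns out to be not merely steady but accelerating: roughly 13 percent a year from 1956 to 1981, 32 percent a year in the decade after, and 53 percent a year in the decade after that.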

And if the claims made by Harvard chemistry professor Charles Lieber are true, the best we’ve got now is chump change. Lieber is working with logic and memory circuits next to which all current logic and memory circuits are elephantine. He says he has produced a transistor 10 atoms wide. This promises machines that are not just smaller, but different in kind from anything we now have. We can no more envision what will be done with this new technology than John Bardeen and his Bell Labs colleagues back in 1947, or the designers of that double-triode behemoth in NYU’s physics lab in 1949, could have envisioned what we’ll see in the Center for the Arts tomorrow and Saturday.

Frederick Jackson Turner’s 19th-century frontier shut down because the land was all owned; there was no more of it for the getting. The only thing that will shut down the digital frontier is if minds get closed, if we get stupid or smug, if we let politics overwhelm intelligence, if we let the noise of rhetoric drown out the music of knowledge. That conflict was depicted seventeen years ago by filmmaker Ridley Scott (director of Alien, Blade Runner, Thelma and Louise, Gladiator) in the commercial that Advertising Age called "the best commercial of all time." It was shown only once, at halftime during the 1984 Super Bowl. It’s one minute long, and with it I shall end these remarks:
 

[At that point, I played a VHS tape of Scott's Apple Macintosh commercial. I cannot provide that here, but the commercial now resides online, so you can find and view it easily.]


 
copyright 2001 Bruce Jackson