At last, a week late, my notes.
Bruce Sterling does not worry about a Vingean Singularity that renders humankind a powerless annoyance to transcendent artificial intelligences. Instead he worries about plain old human-driven technological change and nasty WMDs.
Cynthia and I drove up to the City to hear Bruce Sterling’s lecture for the Long Now Foundation at the Fort Mason Center. We had planned a quiet evening at home watching Brian De Palma and Robert Altman, but Bruce gives great lectures, and after his recent talk at Microsoft I didn’t want to miss this one. Thanks, Cyn.
The talk can be found as an audio stream in Ogg and MP3 formats.
Stewart Brand of the Long Now Foundation introduced Sterling. The topic was The Singularity: Your Future as a Black Hole. Brand observed that discontinuities are potholes for a group planning for the next 10,000 years of human history.
Sterling starts with two definitions of The Singularity:
- Von Neumann to Ulam:
An unpublished speculation on the condition where the rate of change exceeds human control and comprehension.
- Vernor Vinge, a professor of mathematics at San Diego State:
Vinge’s 1993 paper on the singularity is the canonical definition:
Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.
Fred Moulton reminds me that there have been some philosophical papers in the past few years questioning whether an emergent AI would decide to murder us all. Sterling didn’t mention these in his presentation.
However, Sterling’s not impressed with AI’s track record so far. He is not convinced that we’ll see ‘emergent’ AI.
He detours before heading to his next topic to discuss how hard the idea of the ‘singularity’ is for SF writers to grapple with. The technological singularity is impossible to communicate across, which gives us the first way to read the title of his talk.
Sterling then puts up one of Vinge’s slides from his stump speech on the singularity. I can’t find these on Google or on Vinge’s site at San Diego State.
The slides are trend lines for the computational power of machines compared to biological entities. A late-1990s Mac is akin to a nematode. But while the VAX is a museum piece, the bacterium it supposedly superannuates still thrives.
He also has questions about Vinge’s definitions. Vinge talks about machines becoming self-aware, or ‘waking up.’ Biology does not, as of 2004, have an answer to what self-awareness is, so we cannot say whether networked computers, ants, or a forest can or will have ‘woken up’.
There’s the matter of enhancing human intelligence, and Sterling’s open to that as being plausible. He then lays out alternatives to being super-smart. For example, the psychologist Howard Gardner suggests we have multiple intelligences: cognitive, emotional, physical, etc.
So instead of becoming some sort of human computing machine like the Mentats in Dune, enhancements might make us more mindful, more empathic, and more aware of what horribly rude people we are. One would hope they have good Prozac after that singularity.
Sterling lists three events that have a singularity-like nature:
- The Atomic Bomb
- Computer Viruses
All three have changed the world, but only briefly.
Obsolescence and the Singularity
Sterling suggests the future will be a glut of undigested technical riches.
He continues with a new slide, Gartner Research’s Hype Cycle, a five-phase life cycle of technology adoption: how ‘grown-ups’ think about technology.
Of course, he adds, Gartner won’t tell you your business is dead as long as you have a budget for consultants.
The ‘S’ or logistic curve was the earlier form of Gartner’s hype cycle.
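For readers who haven’t seen it, the logistic (‘S’) curve Sterling refers to can be sketched in a few lines. This is a minimal illustration of the standard logistic function; the parameter names (growth rate, midpoint, ceiling) are my own labels, not Sterling’s or Gartner’s:

```python
import math

def logistic(t, k=1.0, t0=0.0, ceiling=1.0):
    """Logistic ("S") curve: slow start, steep middle, saturating finish.

    k is the growth rate, t0 the inflection midpoint, and ceiling the
    saturation level (e.g. total adoption of a technology).
    """
    return ceiling / (1.0 + math.exp(-k * (t - t0)))

# Adoption is negligible early, hits 50% at the midpoint, then saturates:
early = logistic(-6)    # well under 1% of the ceiling
midpoint = logistic(0)  # exactly 50%
late = logistic(6)      # over 99% of the ceiling
```

The point of the shape is the story it tells: every technology looks exponential on the way up, then flattens as it saturates its niche.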
Returning to obsolescence, he asks the audience if we’d bother to pick up a copy of Windows 3.0 we found at the curb.
“The street didn’t pick up on the singularity.”
“There aren’t factions in the singularity movement.”
“The singularity has no end users.”
Schools of Thought
- Just No Way
- Superbian Transhumans
- Rapture of the Nerds
- Singularity Resisters
Science Fiction and Singularity
It’s a great way to make plot, “we had a singularity blow through”.
Ken MacLeod’s Engines of Light novels, The Stone Canal, and The Sky Road are all about people living in the ruins of singularities.
We May Be on the Edge of Nothing Important
But we may be edging towards something important.
Like virus writers, the singularity makers’ infrastructure is well-contained. If you lock them up, bomb them, or take away their funding, they go away long before they produce anything self-sustaining.
There’s a large following for the singularity, but that crowd does not actively try to bring it about. This is the “geek rapture” crowd for whom Vinge is the equivalent of Left Behind.
They don’t imagine that a singularity could be monopolized, like the Biblical Fundamentalist version, or that it may be short-lived: “And you have burned so brightly, Roy.”
Science doesn’t reward thinking through consequences. We reward scary science that gives us things like hydrogen bombs; even the moral titans of science, Einstein and Sakharov, did their heavy lifting in the WMD arena.
He suggests commercialization and broad patenting might stop a future singularity, but technologies with the biggest threat potential may pay off well in the market.
He suggests that two NGO superpowers may emerge who will attempt to marginalize the “kooks” on either side.
The conservative/religious opposition to stem-cell research may be an example of one of these new ‘superpowers’. The President’s Council on Bioethics’ report Reproduction and Responsibility talks about a biological singularity and opposes it.
Then there’s force. Sterling asks why a government ready to wage endless war on terrorism wouldn’t declare endless war on the singularity. The ‘nowhere to hide’ rhetoric of President Bush may extend from caves in Afghanistan to labs in China.
At this point, I must give a shout-out to the Global Frequency, the sort of NGO one might want to have in this circumstance.
What Can We Say, Pace the Singularity
- Posthuman is a soundbite.
- Not just one singularity.
- The posthuman condition is banal from a posthuman’s point of view.
- Messy, embarrassing, reversible singularities are preferable to the alternative.
- It’s hard to be just a little bit dead.
Citing Judith Berman, Sterling closes with the observation that the most adept political actors in the world right now are people who blow themselves up.
To get past that, we must go back to treating the future as process and not a destination.