Prey

Prey is the first fiction book I’ve read in a while; a welcome change from having to take notes on every page of similarly-themed scientific works. It explores a nanotechnological disaster, with self-replicating machines working as swarms to overwhelm their creators. Crichton, as always, has done his homework on this one, with several pages of references contained in the back of the book.


The overwhelming theme of the book is man’s hubris and our headlong rush toward technological achievement. I was reminded several times of Jeff Goldblum’s character in Jurassic Park saying, “Your scientists were so preoccupied with whether or not they could, they never stopped to think if they should.” It was hard to believe the lengths the characters went to to protect their precious achievement, though the book dubiously explains this by having the machines invade human bodies and influence their behavior biologically.

Many signs point to “Emergence” (the book by Steven Johnson) being the next thing I need to understand. In a nutshell, it’s the idea that you can predict the actions and responses of the individual agents in a system, but not the behavior of those agents acting together. As Crichton puts it on page 173:

The results of these interactions could not be programmed. It just emerged, with often surprising outcomes…For the first time, a program could produce results that absolutely could not be predicted by the programmer. These programs behaved more like living organisms than man-made automatons.

The scientists’ solution to this unpredictability was to program the agents as predators whose single desire is to feed; that consistent goal keeps the pack working together.
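To make the idea concrete for myself, here is a toy sketch in Python, entirely my own illustration rather than anything from the novel: each agent follows only two local rules (drift toward neighbors it can see, back away from any that get too close), and a crude measure of how scattered the group is gets printed before and after.

```
# Toy illustration of emergence: each agent follows only local rules
# (move toward visible neighbors, back away from ones that crowd it),
# yet the group as a whole clusters into a swarm that no single rule describes.
import math
import random

NUM_AGENTS = 30
NEIGHBOR_RADIUS = 40.0   # how far an agent can "see"
TOO_CLOSE = 3.0          # personal-space radius
STEP_SIZE = 0.5

random.seed(1)
agents = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(NUM_AGENTS)]

def spread(points):
    """Average distance of agents from the group's center: a crude
    measure of how scattered the swarm is."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sum(math.hypot(p[0] - cx, p[1] - cy) for p in points) / len(points)

def step(points):
    """Advance every agent one step using only its local neighborhood."""
    new_points = []
    for x, y in points:
        move_x = move_y = 0.0
        for ox, oy in points:
            dx, dy = ox - x, oy - y
            dist = math.hypot(dx, dy)
            if dist == 0 or dist > NEIGHBOR_RADIUS:
                continue  # ignore self and anything out of sight
            if dist < TOO_CLOSE:
                # separation: back away from agents that are too close
                move_x -= dx / dist
                move_y -= dy / dist
            else:
                # cohesion: drift toward visible neighbors
                move_x += dx / dist
                move_y += dy / dist
        mag = math.hypot(move_x, move_y)
        if mag > 0:
            x += STEP_SIZE * move_x / mag
            y += STEP_SIZE * move_y / mag
        new_points.append([x, y])
    return new_points

print(f"initial spread: {spread(agents):.1f}")
for _ in range(200):
    agents = step(agents)
print(f"final spread:   {spread(agents):.1f}")  # typically much smaller
```

Nothing in those rules says “form a cluster,” yet the spread shrinks as the agents bunch together; that gap between what each agent is told to do and what the group ends up doing is the emergence Johnson and Crichton are describing.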

However, I enjoy the idea of machines reacting unpredictably, at least in a controlled environment. Working in the computer industry has made me both reliant upon and bored with the utter predictability of machines. If my systems weren’t consistent, I’d be out of a job. On the other hand, great breakthroughs always seem to happen via mutations, and mutations never happen when you control the outcome of every action.

It reminds me of Dean Kamen in Code Name Ginger saying, “If I gave you objectives, you might reach them, and that would be terrible, because it might keep you from doing something really great.” I take comfort in the fact that, for now, only humans are endowed with what we call “creativity”; it gives me career security in a time when most tasks are becoming automated. But I know that succeeding once even creativity can be done by machines will require being even further ahead of the automation curve.

As the authors admit in The Experience Economy, “Those who decried previous economic shifts…failed to stop the progression of economic value to higher-echelon offerings. It happened despite their protestations.” The question is not whether machines will think for us, but how we can benefit when they do.
