Monday, July 18, 2011

Shortcuts in World-Building

Austin science fiction writer Marshall Maresca wrote a number of posts about world-building recently. I liked this series because I've become more fascinated by the patterns and problems encountered by writers doing this, and by how they solve them, than by the details of the worlds they build.

In particular, it's very difficult to extrapolate the impact of certain technologies far into the future, so writers often explicitly state why development of that technology stopped. The interruption of trends in science fiction typically takes the form of a war, a treaty or other deliberate moratorium, or, most ambitiously, a violation of the principle of mediocrity. Usually these devices seem to have more to do with the writer's realizing it's too hard to extrapolate that far than with any belief that some event really will stop the technology from progressing. This would seem to dodge Asimov's definition of science fiction: "that branch of literature which is concerned with the impact of scientific advance upon human beings." But optimistically, writers may just be trying to avoid a story that would be hard to read. Cynically, they may be trying to avoid one that's even harder to write.

For example:

Frank Herbert's Dune - The Butlerian Jihad. In Herbert's future history, there was a war that resulted in the destruction of all artificial intelligence, and a religious edict not to re-create it. Useful, when you're writing about people in the year 25000 or so and you need their lives to be recognizable to people in the year 2000 or so.

See, if these guys are around in the year 12000, why aren't they all over the place in 25000? Because the story would be too weird and difficult, that's why, so you need a jihad. (From the cover of The Machine Crusade.)

Vernor Vinge's A Fire Upon the Deep - the Slow Zone. Vinge posits a region of the galaxy where physical law has been manipulated to disallow superluminal motion and superintelligence, limiting travel and technology but also protecting against the predatory intelligences that apparently fill the rest of the universe and would certainly be central to the experience of any human who encountered them. (See the Warhammer 40k/Event Horizon discussion; even Warhammer 40k has tricks for this problem. Incidentally, this raises the question of why people seem so much less concerned with continuity problems in video games than in prose fiction.)

Asimov's Foundation series - the lack of robots. Since the Foundation series turned out to share a history with the Spacer era, Asimov needed a reason why robots weren't critical to the fabric of life in the Galactic Empire, a presence that would surely have induced changes taking a long time to think through and produced a far stranger cultural setting. The given reason was that the robots concluded their presence was harmful to humans and agreed to disappear from history, with a few remaining to manipulate events behind the scenes (a conceit that he and his appointed successors used more in the later novels).

(Note that all three of these workarounds involve the development of artificial intelligence, something Singularitarians should find interesting. Singularitarians should also feel a need to explain where the alien AIs are in the real world, if they really believe intelligence explosions are an inevitable outcome for any tool-using species. Note also that Vinge was the first to name the concept of the Singularity.)

Asimov's Foundation series - the lack of aliens. Asimov famously omitted aliens from most of his work, most conspicuously in the Foundation series (discussed further here). A galactic empire with aliens is much harder to create, and much harder to claim as predictable by a science of psychohistory. It must be said that there are also no intelligent aliens in the best-selling science fiction novel of all time (Dune), yet this seems to have escaped the same degree of discussion. (At other points Asimov also felt the need to explain why nuclear weapons were never used in all of galactic history; it seems humans were just too civilized to consider it. Would that it were true.)

Kingsbury's Psychohistorical Crisis - genetic standards. This outstanding but non-canonical addition to the Asimov universe is discussed in more depth here. The idea is that although humans are capable of genetic engineering, a genetic standard is adopted to keep the species from losing coherence and to keep everyone capable of mating with everyone else. The effect for the writer is that the characters and the social universe in which they live remain more comprehensible. (Reflecting this in the political writing of the real world, Francis Fukuyama has observed that history is predictable only to the extent that human nature remains constant.) One novel by Kingsbury may seem a strange place to hang this concept, but it appears scattered through many other science fiction writers' work.

Non-climate-altered futures from Kim Stanley Robinson. I once saw a talk by KSR in which he made the following interesting point: anyone writing science fiction today that takes place on Earth in the next few centuries, and does not represent climate change, is writing what is already guaranteed to be alternative history or just outright fantasy. Although I've been writing so far about conscious choices made by writers to avoid having to extrapolate technology to the far future, KSR is criticizing writers for avoiding our real future for no reason other than wishful thinking.

There are other approaches besides coming up with an explicit workaround. Some writers just don't address obvious plot or setting holes (if time travel is possible, why isn't everybody fighting back at the causal high ground of the Big Bang?), whether through deliberate editorial control - that's just not what their story is about - or through omission. Another popular technique is just to wipe out civilization between now and your future history. You know this one: there's an apocalyptic war that serves as a future-historical reset button, and now the world can be any way the author wants it to be. Except culturally it will somehow remain remarkably similar to the U.S. in the year the author wrote the story. (Works from the Buck Rogers TV series through Brave New World have fallen prey to this; of course you can use an apocalyptic war for more than a convenient culture-eraser, but most don't.) The Butlerian Jihad above might seem cheesy, but it's more interesting and less predictable than "The Great Cataclysm of 2096". [Added later, a more minor version of this: Vernor Vinge, at an appearance promoting The Children of the Sky in October 2011, said he corrected some architectural inaccuracies in his description of the UCSD campus in Rainbows End by referring to the Great Rose Canyon earthquake of 2017.] The Past Apocalypse is the world-building equivalent of quantum flux.

Of course, there are writers who work hard to squeeze every last drop of implication from the technology in their stories, although they (perhaps wisely) tend to focus tightly on one theme. For my money, the near-future work of Robert Sawyer and Larry Niven provides the best exemplars. Charles Stross also drives trends unapologetically to their conclusions, although he's very ambitious and sometimes seems to neglect a consequence that would undermine his plots. Because I like him, I assume that's writerly editorial control rather than omission.

1 comment:

Anonymous said...

If you keep reading the post-Herbert Dune books, you find out where the robots went.

I haven't read Herbert's notes. It's either a plotline that was in the making for a very long time, or a somewhat neat extrapolation of this sort of worldbuilding convenience.

A writer who wants an audience in his lifetime writes about 'now,' even if he writes about the future. Writing only for the future is pretty much like putting a message in a bottle.