Monday, September 26, 2011

Exploits in the Human Biocomputer: Buffet Edition

Exploits in the Human Biocomputer (NLP Edition)


I suspect that NLPers have their own name for the Doctor Fox effect. That said, I don't see the point, propaganda-wise, of saying something so precisely meaningless except when preaching to the choir. There is, however, something to be said for the confidence of knowing that if you are ever caught with nothing to say, you can say nothing quite eloquently and convince everyone you said exactly what they wanted to hear. Perhaps I will try it the next time I give a presentation.

Sunday, September 4, 2011

Redefining scifi, again

Science fiction has been defined and redefined fairly frequently, both by fans and by authors. Since I have some fairly strong feelings on the subject, I figured I might as well cram them into your ears before they get too clogged.

I'm defining here what I consider to be the core of what makes something science fiction, but it differs enough from other people's idea of science fiction that we might as well call it something else. If you have a different conception of science fiction, perhaps you can agree to call this 'Fiction A is Fiction A', after its primary attribute.

First, I'd like to go over previous categorization systems and why they fail.

It should be clear to us, first of all, that setting something in outer space or having it involve computers is neither necessary nor sufficient to make it sci-fi. There was a time when the latter seemed less ridiculous. Let us settle on the idea that science fiction is not predicated upon the existence within the story of unfamiliar technology -- otherwise, a story full of technology that has since become familiar would cease to be science fiction (as Terminal Man may soon, since precisely the technology discussed in that novel is now used to treat epilepsy). Science fiction should not be vulnerable to the equivalent of the old chestnut about AI: as soon as what it describes exists, it no longer counts.

An old standard, though, is the idea that sci-fi stays with known science as much as possible. I do not consider this a defining element, because it invalidates nearly all science fiction, past and present. 1984 is built in large part around a naive interpretation of the Sapir-Whorf hypothesis; if that interpretation turns out to be provably false, does 1984 cease to be science fiction? Dune (and many other science fiction novels from the 1950s through the 1970s) rested in large part on the assumption that precognition and remote viewing are possible. Are they no longer science fiction? Even if we limit ourselves to the body of knowledge clearly established at the time of writing, we have problems. Anathem uses the Everett-Wheeler 'many-worlds interpretation', but (like every other use of it in science fiction) ignores the fact that in order for MWI to hold, no continuity is allowed to interact with any other; this is built into the formalism of MWI. Is Anathem not science fiction, since it selectively ignores science for the sake of a story?
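
As an aside, the non-interaction of branches is usually cashed out via decoherence rather than an explicit prohibition. Here is a minimal sketch in standard bra-ket notation; the labels are illustrative, mine rather than anything drawn from Anathem or a particular textbook. After a measurement-like interaction, the universal state is a sum of branches, each pairing an outcome with an environment record:

\[
|\Psi\rangle = \sum_i c_i\, |s_i\rangle \otimes |E_i\rangle, \qquad \langle E_i | E_j \rangle \approx \delta_{ij},
\]

and because records of macroscopically distinct outcomes are (nearly) orthogonal, any observable confined to one branch loses its cross terms:

\[
\langle \Psi |\, \hat{O} \otimes \mathbb{1} \,| \Psi \rangle \approx \sum_i |c_i|^2\, \langle s_i |\, \hat{O} \,| s_i \rangle .
\]

No experiment performed inside one branch can register the others, which is the sense in which the formalism forbids continuities from interacting.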

So, keeping with known science is neither necessary nor sufficient, and neither is operating in the domain of not-yet-distributed futures. Placement in time itself is also no good: 2001: A Space Odyssey remained science fiction after 2001 came and went, as did 1984 after its titular year. So, how do we delineate what falls into the domain of science fiction?

My answer is consistency. Science fiction need not be consistent with known science, but it should be self-consistent. Its capacity for self-consistency, like that of mathematics, will likely be a boon to it in the future. It could also be claimed that by being consistent, science fiction diverges even further from reality, which is famously inconsistent (and, when it does behave consistently, does so in baroque and mysterious ways).

So, what cornerstones of science fiction do we eliminate this way? Star Wars, Star Trek, and Doctor Who do not retain consistency, though they also fail other tests of science fiction purity. They fall into the domain of fantasy, which is perhaps where they belong.

Perhaps more interestingly, what falls into science fiction under this definition that would not otherwise? The Age of Unreason series does, despite being set in the eighteenth century and focusing on alchemy, because the rules it sets down for alchemy are never contradicted later on. The Laundry Files series also qualifies, though the Dresden Files series does not, owing to some flukes in its early books. Hackers, while ludicrously and hilariously at odds with reality, never breaks its own rules, and thus falls under the aegis of sci-fi.

Thursday, September 1, 2011

Rule 34 and sidestepping superorganisms

Before I start this rant, I should mention that I absolutely love Charlie Stross's books and his blog, and that Rule 34 was no exception. I say this because the rest of this post will be fairly critical of the book, and may be somewhat critical of the man himself in passing. I should also note that everything I know of Stross I have learned from his blog and his various talks, which are posted there; as a result, I probably have a skewed view of what he does and doesn't know or think.

So, if you haven't read Rule 34 yet (and you should; I had to order it from Amazon, but if you don't live in the sticks your local bookstore probably has it), the basic idea behind it is this: in the future, Scotland is its own country, and the Edinburgh police force has a special operations squad studying internet memes and keeping an eye out for dangerous ones (ranging from things like planking, which are dangerous due to stupidity rather than malice, to things like copycat suicides, which are dangerous due to the autotoxic nature of the meme itself). Great setup. Then a series of strange murders turns up, framed as bizarre suicides. If you keep up with this blog, you know where this is going -- but you'd be wrong.

While I won't spoil the twist ending (which is interesting in and of itself, for all kinds of reasons, even ones tangential to the plot), the perp is not in fact a meme, nor is it some superorganism. Memetic perps are hard to write, but there are several minor ones in this book in particular, and memetic perps of the Young Werther mould have been a dime a dozen since Goethe's time (though they probably have about the same audience familiarity as laughing plagues or the Boston Molasses Disaster, and are considered odder than Strange Rains). As for superorganism characters, they are still rare, but they have been handled convincingly as far back as Count Zero.

So, why did Stross sidestep this idea?

He's aware of it. I mentioned it in a comment on his blog, which he responded to. Perhaps he doesn't feel like he can handle it as well? Perhaps he thinks it won't make such a ripping yarn?

Unfortunately, despite his excellent record of legitimately new ideas, Rule 34 is innovative only by 1982 standards. This is not a big deal -- Neuromancer has certainly kept its flavour. However, Stross's books have consistently sat twenty minutes into the future of science fiction authorship, leading the way for all sorts of new twists on old genres that haven't even come up from the underground yet. I might even be tempted to blame The Atrocity Archives for the popularity of Magic-A-is-Magic-A urban fantasy thrillers, if not for the distinct lack of vampires and the distinct Lovecraft-nerd flavour.

Now, if you haven't read Rule 34, I still recommend reading it. Read Halting State first. Then read everything else he's ever written. He's impressive, and his adherence to hardness even in things like the Laundry series borders on the obsessive; in Iron Sunrise he walks the reader through Fermi estimates of the physics on a fairly regular basis. I just feel like he missed an opportunity to really wow his readers with the resolution of Rule 34, and I hope he will approach the subject of superorganisms with agency in some later book.