Monday, December 17, 2012

A good social engineering article from the BBC

Alan Woodward wrote a fairly good article for the BBC on the vulnerabilities of the human biocomputer. Ignore the pseudo-religious language in the introduction; this is good stuff.

The phenomenon of "social engineering" is behind the vast majority of successful hacking.
This isn't the high tech wizardry of Hollywood but is a good, old-fashioned confidence trick.
It's been updated for the modern age, and although modern terms such as "phishing" and "smishing" are used to describe the specific tricks used, they all rely upon a set of human characteristics which, with due respect to Hieronymus Bosch, you might picture as the "seven deadly sins" of social engineering.
Apathy:
We often fall for a confidence trick, or worse, because we assume others "must" have taken the necessary steps to keep us secure.
Sadly this leads to a lack of awareness, and in the world of the hacker that is fatal. When we stay in a hotel and we programme our random number into the room safe to keep our belongings secure, how many of us check to see if the manufacturer's override code has been left in the safe?
It's nearly always 0000 or 1234 so try it next time.
Curiosity:
Humans are curious by nature. However, naive and uninformed curiosity has caused many casualties. Criminals know we're curious and they will try to lure us in. If we see an unfamiliar door appear in a building we frequent, we all wonder where it leads.
We might be tempted to open it and find out, but in the online world that might just be a trap waiting for an innocent user to spring it. A colleague built a website that contained a button that said Do Not Press, and was astonished to find that the majority of people actually pressed it.
Be curious, but exercise a healthy degree of suspicion.
Gullibility:
It is often thought of as a derogatory term, but we all suffer from this sin. We make assumptions.
We take others at face value, especially outside of our areas of expertise. Put a uniform on someone and we assume they have authority. 
Give an email an official appearance by using the correct logo and apparently coming from the correct email address, and we might just assume it's real, regardless of how silly its instructions might be.
All of this can be easily forged online, so make no assumptions.
Courtesy:
We quite rightly all teach our children to be polite. However, politeness does not mean you should not be discriminating.
If you do not know something, or you feel something doesn't feel quite right, ask. This principle is truer than ever in the online world, where we are asked to interact with people and systems in ways with which we are quite unfamiliar.
If someone phones you out of the blue and says they are from your bank, do you believe them? 
No. Phone them back.
And by the way, use a mobile phone, as landlines can remain connected to the person who made the call; while you might think you're phoning the bank on a valid number, you could just be talking to the person who called you.
Greed:
Despite what we'd like to think, we are all susceptible to greed, even though it might not feel like greed.
Since its inception, the very culture of the web has been to share items for free.
Initially this was academic research, but as the internet was commercialised in the mid-1990s, we were left with the impression that we could still find something for nothing.
Nothing is ever truly free online. You have to remember that if you're not the paying customer, you're very likely to be the product. In the worst case, you might find that you have taken something onto your machine that is far from what you bargained for.
Many pieces of malware are actively downloaded by owners unaware that the "free" product contains a nasty payload, even if it also appears to do what you expected of it.
Diffidence:
People are reluctant to ask strangers for ID, and in the online world it is more important than ever to establish the credentials of those whom you entrust with your sensitive information.
Do not let circumstances lead you to make assumptions about ID.
For example, if someone from "IT support" calls you and asks for your password so they can help fix your problem, how do you know they haven't called everyone else in the building first until they found someone who really does have a problem?
This is a well-known attack. If someone has a problem with proving who they are, you should immediately be suspicious.
Thoughtlessness:
Thinking before you act is possibly the most effective means of protecting yourself online. It is all too easy to click that link.
Stop.
How many of us, when reading an apparently valid link in an email, bother to check whether the link really goes where it claims, or whether it instead takes us to a malicious site?
It's horribly easy to make links look valid, so try hovering your cursor over the link for a few seconds before clicking: the true destination pops up if you give it a moment.
As cynical as it may sound, the only answer is to practise your A-B-C:
  • Assume nothing
  • Believe no one
  • Check everything
With more Christmas shopping expected to be done online this year than ever before, you should watch out for those who would exploit the deadly sins.
Don't give criminals the chance to ruin your holiday season, and remember that a little bit of paranoia goes a long way online.
Alan Woodward is a visiting professor at the University of Surrey's department of computing. He has worked for the UK government and consults on issues including cyber-security, covert communications and forensic computing. 

Source

Now, I wish there had been a bit more cog-sci in this article and a bit less of the forced comparisons to medieval eschatological literature, but all in all it's pretty good.

Too bad nobody will follow it. 
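
Incidentally, if you want to automate a sliver of the "check everything" step, Woodward's hover-the-cursor trick is easy to mechanize. Below is a minimal sketch in Python (standard library only) that flags links in an HTML email whose visible text claims one host while the href points at another; the email snippet and domain names are invented for illustration, and real phishing detection needs far more than this.

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class LinkAuditor(HTMLParser):
        """Collect (visible text, actual href) pairs from an HTML email body."""
        def __init__(self):
            super().__init__()
            self._href = None
            self._text = []
            self.links = []  # list of (text, href) tuples

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href", "")
                self._text = []

        def handle_data(self, data):
            if self._href is not None:
                self._text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self._href is not None:
                self.links.append(("".join(self._text).strip(), self._href))
                self._href = None

    def looks_like_url(text):
        return text.startswith(("http://", "https://", "www.")) or "://" in text

    def suspicious(text, href):
        """Flag links whose visible text names a different host than the href."""
        if not looks_like_url(text):
            return False  # plain 'click here' text: nothing to compare against
        shown = urlparse(text if "://" in text else "//" + text).hostname
        actual = urlparse(href).hostname
        return shown is not None and actual is not None and shown != actual

    # Hypothetical phishing snippet: the text names one host, the href another.
    body = '<p>Verify: <a href="http://evil.example.net/login">www.mybank.com</a></p>'
    auditor = LinkAuditor()
    auditor.feed(body)
    for text, href in auditor.links:
        if suspicious(text, href):
            print(f"Shown as {text!r}, actually goes to {href!r}")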

Friday, September 14, 2012

Speculative fiction and novelty

I've wrestled for a while with the idea that "SF is the genre of ideas". I have argued on both sides, and eventually come to the conclusion that SF really refers to a plethora of very different genres with very little in common. Recently, I've grown fond of Neal Stephenson's statement that SF is defined by the fact that the characters behave competently and intelligently (even if this is lacking in verisimilitude -- after all, the real world is arguably better-defined by accidents and screw-ups than by intelligent and informed choices). Nevertheless, the plain fact is, some SF is truly mind-bending and the rest of it simply isn't.

What I think it all comes down to is novelty, and novelty concentrations.

We've talked about novelty on this blog before. Novelty is Shannon-information. It's the unexpected. It's the data point that changes the model. It's the punchline of a funny joke, or the reveal that retcons all of continuity. The only thing that is mind-bending or mind-expanding in and of itself is novelty, because everything else you've seen before.
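
To make the Shannon connection concrete: the self-information (or "surprisal") of an event is -log2 of its probability, so the less expected the punchline, the more bits it delivers. A toy sketch in Python, with made-up subjective probabilities:

    import math

    def surprisal_bits(p):
        """Shannon self-information: rarer events carry more bits."""
        return -math.log2(p)

    # Invented subjective probabilities for the ending of a mystery novel:
    endings = {
        "the butler did it":          0.50,  # cliche: you saw it coming
        "the detective did it":       0.10,
        "the narrator never existed": 0.01,  # the reveal that retcons everything
    }
    for outcome, p in endings.items():
        print(f"{outcome:28s} p={p:.2f} -> {surprisal_bits(p):5.2f} bits")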

Novelty is not a renewable resource. Not within a single skull, without some major memory problems. Novelty is exhausted immediately upon consumption. Luckily, novelty is not the same as originality: a patent need only be novel to the clerk who approves it, and many patents are for improvements that are obvious in retrospect. Good novelty (or useful novelty) has that funny quirk: it's obvious in retrospect and entirely unexpected beforehand.

Novelty can be gotten cheaply, in relatively small quantities, under circumstances of isolation of social groups. The character of the yokel shocked by the big city is representative of this, as is the culture clash between European explorers and indigenous Americans during the sixteenth century. But, now that the internet makes large quantities of information easily transferred, the only large caches of naturally (which is to say unintentionally) produced novelty are those things completely undocumented and those things intentionally blocked. People who subscribe to their 'Daily Me' and limit their information intake to their filter bubble can still have their mind blown by the ideas of people in different filter bubbles, but such people are rarely habitual novelty-seekers.

For habitual novelty-seekers, the reality tunnels of other people are interesting until exhausted. Speculative fiction comes in at this point, and in this sense it often is the genre of ideas.

Novelty can be generated in different ways. Some of them are fairly mechanical. Burroughs had his cutups, and the Surrealists had their exquisite corpse. Various mind-altering drugs and habits of thought like Dali's paranoiac-critical method can be used to do to the mind what cutups and the exquisite corpse do to text. These methods are much like taking a computer and banging it with a hammer in the hope that it will become an automobile, and they are successful only to the extent that the brain is a wonderfully adaptable machine capable of making sense out of any sort of noise. A more directed and less wasteful but still fairly mechanical method is to take an existing memeplex, find a likely crux, and reconnect or recombine it. This is more like what glitch musicians do, or what gametes do in sexual reproduction. This happens to be the core of speculative fiction, and what many of the better writers start out with.
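
To show just how mechanical the hammer-banging methods are, here is a cutup generator in a few lines of Python; the source sentences are placeholders, and Burroughs, of course, worked with scissors rather than a random number generator.

    import random

    def cutup(texts, fragment_len=3, n_fragments=12, seed=None):
        """Burroughs-style cutup: slice texts into short word-fragments,
        shuffle them, and rejoin. The brain, being wonderfully adaptable,
        will find meaning in the noise."""
        rng = random.Random(seed)
        words = [w for t in texts for w in t.split()]
        fragments = [words[i:i + fragment_len]
                     for i in range(0, len(words), fragment_len)]
        rng.shuffle(fragments)
        return " ".join(w for frag in fragments[:n_fragments] for w in frag)

    sources = [
        "the soft machine hums beneath the city and nobody listens",
        "market forecasts predict modest growth in the third quarter",
    ]
    print(cutup(sources, seed=23))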

Take a model of the world. Zoom in on one piece. Twist until mind blown. Spew the resulting model onto paper, with a narrative glued to it in a bag on the side. Rinse and repeat.

What you get is a synthetic reality tunnel. Speculative fiction authors don't just write about robots and space ships; they mutate their view of reality systematically to manufacture new ways of modeling the world that are internally consistent but that nobody really subscribes to yet.

The important part is 'yet'.

Highly popular pieces of speculative fiction get widely read. The model of the world in these stories gets incorporated into the world-models of other people. The novelty seeps away as cultural osmosis sets in.

Many people, for instance, have a view of evolution as guided progress toward a pre-ordained goal. This forms the basis of works as far-flung as 2001: A Space Odyssey, The Starseed Transmissions, Childhood's End, Altered States, and The X-Men. Of course, this model is bullshit, and does not describe evolution at all. Few people realize that it came from a handful of science fiction stories in pulp magazines during the early decades of the twentieth century, most notably The Man Who Evolved.

Certain ideas are no longer novel at all, purely because they were initially considered extremely novel and therefore reached maximum saturation much more quickly.

Sunday, May 6, 2012

Another rant about science fiction


The following is my comment to this post on Charlie Stross's blog.
I'm going to play devil's advocate here and try to argue against all your points -- not because they aren't justified, but because I've had far too much caffeine today. Hopefully the result will be coherent and not too ranty. 
1) The power-chords of sci-fi are in many ways a bad thing. It doesn't make sense to classify everything with prominently featured space ships or robots as science fiction (as I think you said before in a rant about Star Trek). Action movies with robots are just action movies. While it has historically been useful to have these bits of shorthand, the fact that people could easily turn these things into hieroglyphs and expect a stylized silhouette of a rocket on the spine of a book to be almost universally understood as code for "shelve this in the science fiction section" is an indicator of just how little depth these things have. 
So far as I can tell, the flooding of the market by crass commercializations with all the apparent symbols of the genre but none of the guts is not new. Maybe it's become a bit more common now that action movies in science fiction drag (and romance novels in fantasy-horror drag, and action movies in cyberpunk drag) have become so profitable, but looking through the science fiction section of a used book store yields an enormous quantity of slim volumes by (deservedly) unknown authors whose prominent placement of space ships on the cover and use of terms like "groundcar" and "zeerust" instead of "car" and "rust" is the sole saving grace that puts them into the science fiction section rather than the slush pile of a much more selective general-audience-fiction section. 
Perhaps this is a foolish and egotistical position to take, but I've always associated science fiction with a kind of intellectual daring and experimentation. Naked Lunch had no space ships, but was classed as science fiction because it was too weird to be placed next to this week's new best-seller in the general section. While a lot of the groundbreaking science fiction of the golden age focused on space exploration (or at least had it as a major background element), both earlier and later science fiction did not. Is it justified to stick a starship on Neuromancer? What about on Odd John?
Using these power chords as shorthand for the entirety of the genre is occasionally useful, but I would argue that it's gotten to the point where it is far more misleading. Someone who has a strong interest in Halting State may have little to no interest in Hyperion; they are very different books with very different styles set in very different worlds, and it's questionable whether they should even be classified as the same genre. 
This brings me to my next point. 
2) The fragmentation of genres into overlapping tags is good for authors, for readers, for booksellers -- for everybody except the people who are shelving books by hand in brick-and-mortar stores. 
While the science fiction section is a ghetto to be sure, it's a ghetto of ludicrous diversity. Someone who buys science fiction off a list of science fiction in order to stock the shelves of a science fiction section will get a handful of paranormal romance, a handful of action/adventure with a nominally science-fictional setting, a handful of cyberpunk, a couple books like those of Butcher's Dresden Files series (which is a cross between two genres, neither of them science fiction, but often gets filed under science fiction anyway), and -- if they're lucky -- a couple books that get filed as science fiction purely for the weirdness factor (like Lethem's Amnesia Moon or anything by Pynchon or Ballard or WSB). Chances are, anyone who goes into the science fiction section because they like a particular kind of science fiction will find nothing of particular interest, unless they are very open-minded or very easily amused. 
Part of this may be because the fragmentation of the genre, when first it gained legs, was extremely successful. Cyberpunk took off and showed science fiction that a book could be successful without a space ship shoehorned into it or a raygun-wielding pinup on the cover. The tendency for science fiction books to include every standard element as though working from a checklist took a bit of a nosedive. (I am speculating a little here, since I cannot easily perform quantitative analysis of trends in the frequency of unnecessary instances of starships, rayguns, and robots versus properly justified instances of the same!) If fragmentation is encouraged even more, we'll still have the power chords: they just won't be used so often for the sake of classifying a work into a particular genre. 
PKD is perhaps an early example of the trend I see as a whole. Early PKD stories invariably had space travel and robots, regardless of whether or not their existence in the story was justified. But, a sequence of what is often considered his best work follows a very noticeable trend. 
  • The Three Stigmata of Palmer Eldritch had space travel quite prominently, and much of it was justified. 
  • Ubik had space travel prominently in the very beginning, and none at all for the entire remainder of the book, because bringing it in would not be justifiable. 
  • The Man in the High Castle had space travel as a background element, but it was not prominent because it could not be justified. 
  • A Scanner Darkly, despite being set in the far-off year of 1992, had no space travel and no robots. It was clearly science fiction, but limited the science fiction elements to those that were necessary for storytelling -- very different from earlier novels, in which robot cab drivers were everywhere and cars flew and things took place on Mars for no clear reason (sometimes breaking fairly well-defined physical laws in the process). 
But, if science fiction as a genre is not cohesive enough to merit being listed in the same section, how is a science fiction fan going to become exposed to authors they have never heard of? 
3) There is a very good reason, from the consumer perspective, to go to a brick-and-mortar bookstore and look through their poorly-curated and half-assedly-assembled science fiction section. It is the same reason, paradoxically, that it is desirable that Amazon's recommendation engine not be perfectly accurate. 
I will again point out my assumption that science fiction is the domain of daring neophiles with strange ideas. Perhaps people disagree with me on this and would instead have me call this by some other name -- after all, science fiction has a long history of referring to two completely different genres with occasional overlaps between them: the genre of daring neophilic explorations of strange ideas, and the genre of people with robots and space ships and laser guns. I'll call the latter 'raygun adventure' instead, for the sake of avoiding confusion. 
Someone whose primary interest is mystery novels (or whose primary interest is raygun adventure novels) may not have any particular problem with reading many takes on what amounts to the same book. A police procedural can be fairly formulaic, and many popular police procedurals are: the interest comes from the emotional drama, from dramatic tension, from not knowing the exact details of the ending, or from knowing the exact details of the ending. 
A book that plays primarily with the emotions of its readers can be extremely successful without giving the reader any new information, or even having any kind of consistent internal logic. Porn remains arousing even to people who realize that pizza delivery boys rarely manage to seduce lonely housewives, and The Kite Runner managed to be disgustingly disturbing despite the problems inherent in the idea that a former Nazi could be a small child in the 1970s and want to join the Taliban in the 1990s.  
Science fiction is different. The one defining factor in science fiction is that it attempts to be information-rich. The characters and plot can (and often do) hang on the world-building or the conceit.
  • Dune drags on endlessly, has numerous minor inconsistencies, bases itself around ideas like the existence of a secret sorority of NLP-masters who seed myths and plot to control the universe... but, it's just so damned interesting when the implications are explored that the predictable and recycled plot, the dull language, the all-pervasive humorlessness, and the occasional forays into incomprehensible non sequitur are excused. 
  • Altered Carbon has many instances of weak writing, and a plot that's fairly tangled. It has enormous numbers of scenes of gratuitous sex and violence, and many of them seem entirely out of place. But, as a thought experiment it's incredible. The fact that it features what amounts to body transplants does not make it science fiction. Altered Carbon is science fiction because it points out that the existence of body transplant technology would fundamentally change the insurance industry, make it possible for the extremely wealthy to be essentially completely immortal, almost completely wipe out the Catholic church as a political and religious power, and still wouldn't solve problems like mood swings and menstruation. 
The solution may well not be to make recommendation engines more accurate. In most domains a recommendation engine is more useful the more accurate it is, to be sure; but within the domain of science fiction, a recommendation engine is more useful the less accurate it is (up to a point). 
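
"Less accurate up to a point" is easy to operationalize, for what it's worth: it is the exploration knob found in any epsilon-greedy recommender. A minimal sketch in Python, with an invented catalog and invented affinity scores:

    import random

    def recommend(user_scores, catalog, epsilon=0.3, rng=random):
        """Epsilon-greedy: usually serve the best predicted match, but with
        probability epsilon pick something off-profile on purpose."""
        if rng.random() < epsilon:
            return rng.choice(catalog)  # deliberate 'inaccuracy' = serendipity
        return max(catalog, key=lambda book: user_scores.get(book, 0.0))

    catalog = ["space opera", "cyberpunk heist", "automobile accident fetish novel"]
    user_scores = {"space opera": 0.9, "cyberpunk heist": 0.6}  # invented
    for _ in range(5):
        print(recommend(user_scores, catalog))

Tune epsilon up and the engine gets "worse" at prediction and better at what science fiction readers actually want.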
Science fiction as a genre only ever worked because there were hardcore fans who would read books about space ships *and* read books about automobile accident fetishes. A regular genre reader would not be able to push their way through Rucker's Software were they only interested in robot books or space ship books, because they'd get stuck on the parts about cannibalistic cults and recreational drug use and anarchism. 
While other genres may be syncretic by accident, science fiction is defined by its syncretism. It's defined by its mashups. Anything that looks like science fiction, by definition, isn't. 
If it has space ships and robots, it's probably an action movie. Don't trust anyone over 30.

Wednesday, March 7, 2012

A rant

It's 2012. Our cars do not fly, are not self-driving, and are powered by fire.

Even though there were personal robots on the American consumer market in the 1980s capable of:
1) Speech recognition (including voice authentication *and* voice command)
2) Goal-oriented programming
3) Real-time procedural generation of original narratives
4) Fairly nuanced navigation and area-mapping, including self-charging, without vision or navigation beacons
the most advanced personal robot on the American consumer market at the moment (barring kit robots like PINO and the PR-2) is the Pleo, which supports only #4.

Our computers almost universally depend upon a user interface invented in the early 1970s at Xerox PARC and popularized in 1984 -- a user interface now so common that people feel as though no other user interface is possible.

No one has been on the moon since 1972.

The space shuttle was limited to the technology that existed when it was designed in the early 1970s (with some exceptions, where interoperability could be maintained). As a result, the space shuttle computers ran on core memory for much of the program.

The general design adhered to by tablet computers was sketched out by Alan Kay's Dynabook project in 1968, and variants on it can be seen in Star Trek: The Next Generation.

The much-advertised Siri is based loosely on AI work at SRI (from which it takes its name). However, Apple's version is not significantly more complex than ELIZA or ALICE.

In 1997, the World Wide Web was clearly a simulation of paper under glass, with the occasional hyperlink or animated GIF for distraction. In 2012, most of the web is still clearly a simulation of paper under glass, but elaborate hacks involving abuse of self-modifying code and incredible wastes of bandwidth trick some fraction of the web into instead poorly simulating the 1970s XEROX user interface inside of itself.

In the 1980s, Steve Mann invented a wearable computer that could detect billboards and replace their content with content of his choice. In the early 1990s, the MIT Media Lab worked on a wearable computer that was context-aware and selectively gave reminders and suggestions via subliminal text display when a complete focus shift would not be advisable. In 2012, Google announced that it would work on a wearable computer project, hiring people from the 1990s MIT Media Lab wearable computer projects. Judging from other Android devices and Google's other products, this augmented reality project seems like it will probably involve inserting advertisements where none previously existed, rather than removing them (as Steve Mann's device did).

Whether or not a hyperlink on the web is broken still depends entirely upon the maintenance of the page pointed to, despite hypertext projects prior to Berners-Lee's early-1990s project having solved this problem. Hyperlinks on the web still point to whole pages, or at best single points within pages (given the foresight of the original author to place labels at appropriate points), whereas some pre-web hypertext projects supported bidirectional links between spans of content (including multiple overlapping links).
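
For concreteness, "bidirectional links between spans" means something like the following Python sketch (a toy, Xanadu-flavored; the document names and offsets are invented): links live in a store outside the documents, can be traversed from either end, and several may overlap the same text.

    from collections import defaultdict
    from typing import NamedTuple

    class Span(NamedTuple):
        doc: str    # document identifier
        start: int  # character offset, inclusive
        end: int    # character offset, exclusive

    class LinkStore:
        """Links live outside the documents, so neither side can silently
        'break' the other, and links may overlap."""
        def __init__(self):
            self._by_doc = defaultdict(list)  # doc id -> [(span, other_span)]

        def link(self, a: Span, b: Span):
            self._by_doc[a.doc].append((a, b))
            self._by_doc[b.doc].append((b, a))  # traversable from either end

        def links_at(self, doc: str, offset: int):
            """Every link whose span covers this offset, overlaps included."""
            return [other for span, other in self._by_doc[doc]
                    if span.start <= offset < span.end]

    store = LinkStore()
    store.link(Span("essay.txt", 10, 42), Span("rebuttal.txt", 0, 15))
    store.link(Span("essay.txt", 30, 60), Span("notes.txt", 5, 9))  # overlapping
    print(store.links_at("essay.txt", 35))  # offset 35 falls under both links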

The portable digital music player was invented in the mid-70s. Aside from major storage increases and the occasional feature like video support, portable digital music players have remained largely unchanged in design.

In 1978, a brain-computer interface allowed a blind man to see with the use of a low-resolution digital camera. In 2002, sixteen other subjects had the treatment. It is still not commercially available.

Between 1966 and 1972, SRI developed a mobile robot named Shakey that was capable of goal-based reasoning, route planning, environment mapping, obstacle avoidance, object detection, and limited forms of object manipulation.

In 1982, a patent was filed on the use of optical fibers as bend sensors. This was used in a number of 'wired gloves', which use bend sensors and accelerometers to report the state of hand movements to a computer. In 1989, Mattel commercialized a similar product for the Nintendo Entertainment System as the Power Glove. In the late 1980s and early 1990s, there was a device used by people who could not afford a Power Glove (or its more technically advanced brethren), which took the form of a hollowed-out 8-ball full of accelerometers with several buttons on the bottom and a couple of infrared beacons. This technology was revived recently as the WiiMote (with the infrared beacons moved to the 'sensor bar' and the cable removed).

In 1991, HP released the HP 95LX. It was the size of a modern PDA, had the capabilities of a (low-end) stock PC, and ran a stock version of MS-DOS from ROM. It is possible to run Windows 3.x on these.

In 1990, NewTek released the Video Toaster, a piece of hardware that (when attached to a Commodore Amiga) allowed consumers to perform linear video editing and real-time video effects, as well as 3D animation via the bundled LightWave 3D.

In short, there are a lot of very cool ideas that have not been implemented, have not been commercialized, or have not been commercially successful. Some of them are very old but still very cool. All of them are still possible. Remember this the next time you are trying to choose between Drupal and Ruby on Rails for your social networking site for iguanas.

Monday, February 13, 2012

Two unexpected varieties of synthesis

OWS has been analyzed a lot recently. I hope to apply some fresh ideas to it, by comparing it to Maker culture (a comparison that, so far as I am aware, has not been made).

First off, a bit of background. For those not familiar with Marxist theory, a synthesis is the end result of two opposing cultural forces (thesis and antithesis). Each synthesis becomes the thesis or antithesis for some later synthesis. It is this model of a cultural dialectic that led Marx to believe that his conception of communism was inevitable -- something that is demonstrably untrue (his conception of communism has, as yet, not even been attempted -- though plenty of people and groups apply the term 'Marxist' to their own pet projects). The antithesis Marx was concerned with was the alienated workers: specifically, the working poor who could no longer comprehend nor afford the products of their labor, as creeping Taylorism chopped their assigned tasks into smaller and smaller super-specialized subroutines.

In a sense, both OWS and Maker culture are syntheses of worker alienation and capitalism. OWS is the more traditional of the two: a protest movement with a hint of stale Situationism, it is at its core a kind of solidarity-centric replay of medieval ascetic movements, and dangerous for the same reasons. Maker culture is arguably more constructive: the old anarcho-punk DIY culture with all the external trappings of political discontent and blood-in-tooth-and-nail radicalism stripped away, it is in a sense even more dangerous to the establishment, because while retaining an ostensible capitalist element it promotes the idea of distributing the means of production rather than seizing them.

Both of these are major threats to consumerism -- as opposed to capitalism, which is not necessarily the same thing. Neither of these movements runs counter to capitalism per se. OWS is concerned with money and its distribution, which puts it firmly in line with pseudo-Marxist statist movements of the past and present (such as those that led to the Soviet Union and modern China, those stalwart encouraging symbols of state-centered capitalism shrouded in a cloak of pseudo-Communist babble that told the western world the Marxists weren't so different at all). At its most extreme, it is a socialist movement that seeks to juggle the green paper around so that there isn't such an extreme difference between the powers (social, political, and material) in the hands of the exceedingly wealthy (who -- remember -- are now more capable of venturing into outer space than the ESA, and maybe even NASA) and the exceedingly poor (who cannot necessarily feed themselves with the money provided by the government, on account of pricing policies designed to fleece those who cannot afford to travel very far to buy their bread and milk). An evening out of these powers would abolish the middle class as an entity by clumping everyone together in a single middle class, ending consumerism but not conspicuous consumption (let alone capitalism, which only the obsolescence of money can practically abolish).

On the other hand, the Maker movement is in a sense far more radical. Living within the bubble of consumerism, this movement is focused largely on tools to make tools -- in some cases automatically. The holy grail, even now, is affordable automatic manufacturing of a variety of objects. Currently, the best contenders in this category are minimally affordable, minimally automatic, and minimally various (a 3d printer costs a bit over a thousand dollars; it can print enough of its own parts to shave five hundred dollars off the cost of a replica for fifty dollars' worth of materials, provided one is willing to go through quite a bit of assembly and tweaking; and with extensive modification it can produce volumetric solids from several varieties of plastics, chocolate, and wax -- some attempts at using other materials have met with limited success, but electronics require a totally separate system, and any material that requires subtractive rather than additive forming requires a CNC mill instead). However, progress is certainly being made: a few decades ago, J. Random Luser could not expect to be able to afford a CNC mill, but now he can keep a CNC mill and a 3d printer in his garage and operate them on weekends. The pair will cost him about as much as a brand new computer might have cost in 1990, and he can use that computer he bought in 1990 to control the devices. The threat to consumerism is clear: J. Random Luser does not need to spend money in order to participate in conspicuous consumption; he certainly doesn't need to shop the commodity market for the external trappings of an identity, now that he can download a car off The Pirate Bay and run it off the printer while he's at work, putting it together on weekends as a project with the kids. It's no post-scarcity society, but it's a society where the artificial scarcity of commodities cannot be maintained (just as widespread internet access made the artificial scarcity of media unmaintainable -- though don't tell the RIAA that).
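
Running the arithmetic from that parenthetical (using the ballpark figures quoted above, which are illustrative rather than real prices):

    # Rough RepRap-style replication economics, using the ballpark figures above.
    FULL_PRINTER_COST = 1000.0  # dollars, buying a printer outright
    PRINTABLE_SAVINGS = 500.0   # value of the parts a printer can print itself
    MATERIALS_COST    = 50.0    # plastic consumed printing those parts

    def cost_of_printer(n):
        """Printer 0 is bought outright; each child printer costs the
        non-printable half plus materials (labor and tweaking not counted)."""
        if n == 0:
            return FULL_PRINTER_COST
        return (FULL_PRINTER_COST - PRINTABLE_SAVINGS) + MATERIALS_COST

    total = sum(cost_of_printer(i) for i in range(4))
    print(f"Four printers: ${total:.0f}, versus ${4 * FULL_PRINTER_COST:.0f} bought outright")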

Nevertheless, these are both the result of the white-collar alienation described in J. G. Ballard's Millennium People, not the blue-collar alienation of the working poor covered by Marx. The working poor cannot afford a 3d printer. The working poor cannot afford to take time off to camp out in tents in public parks (though the non-working poor -- homeless people who are not suitable to be hired and who cannot get government assistance on account of having no permanent address; the people who might have broken shop windows in the 30s to get into jail and join one of the two classes in America with guaranteed food, shelter, and health care -- can indeed afford to join the protesters in their Hoovervilles). What synthesis do we see in the working poor? Certainly not socialist solidarity!

The attempts by the free market to disabuse the working poor of the notion that all those things socialism brought to the European working poor (minimum wage, unionization, health care, government assistance) are evil have been more or less successful in the United States, where (as is commonly attributed to John Steinbeck) "the working poor see themselves as temporarily embarrassed millionaires". Of course, this tension between True Belief and Stark Reality will not find its synthesis in some kind of sensible policy decision or effective practical application. As with other forms of cognitive dissonance, the appropriate response is a retreat into apocalyptic fantasy: the religious right reins in the economically downtrodden middle-America just as the religious right reins in war-torn Afghanistan (both in the Taliban -- the charismatic religious conservatives of the Muslim world -- and in those sent in to fight them... too bad only Neal Stephenson characters and BBC documentary producers appear to realize the similarities). Those with enough opium sink into a haze and ignore the bombs outside.

What's a little shocking is that it seems like nobody (aside from Cory Doctorow, who could be credited with inventing makers instead) predicted either of these.

Sunday, February 12, 2012

Neal Stephenson on jump-starting the science myth

TL;DR version: between the lack of technical advances that are immediately obvious to laymen and the new tendency in science fiction toward pessimistic and introspective works, the popular myth of scientific progress is losing ground to myths of regression and millennialism, and we should counter this by writing optimistic scifi and creating big symbolic works.



Edit: and a counterbalance!