SECRET * BASILISK
August 7, 2005 12:35 AM
Whoa, I read that in Interzone back in the Eighties... made a big impression on me at the time. Very neat story: a bit like Roald Dahl.
posted by TheophileEscargot at 1:25 AM on August 7, 2005
Whoa, I remember reading that too, and since then I've looked for it on a number of occasions. I was interested in the 20th century concept of a picture/song/sound/book/joke that can kill and as a consequence wanted to reread it. Thanks for that.
posted by seanyboy at 1:54 AM on August 7, 2005
Love that story, thanks. Langford has also written the related COMP.BASILISK FAQ (read the story first or you'll be spoiled).
posted by gds at 2:52 AM on August 7, 2005
They told us
All they wanted
Was a sound that could kill someone
From a distance.
So we go ahead,
And the meters are over in the red.
It's a mistake in the making.
From the painful cry of mothers,
To the terrifying scream,
We recorded it and put it into our machine.
posted by seanyboy at 3:30 AM on August 7, 2005
Mentioned previously here. I also really liked the story A Colder War from the same site (Cold War meets Lovecraft theme).
posted by Edame at 4:33 AM on August 7, 2005
I'm mildly irked that the comp.basilisk FAQ mentions the Monty Python skit but overlooks Lord Dunsany's The Three Infernal Jokes (The Book of Wonder, 1915). But it's just that the notion is very old.
posted by wobh at 8:37 AM on August 7, 2005
Langford basilisks also pop up in MacLeod's The Cassini Division, sort of.
Then there are other kinds of basilisks in Stross's Atrocity Archives.
Does anyone remember whether the Blight used visual hacks in A Fire Upon the Deep?
posted by ROU_Xenophobe at 9:26 AM on August 7, 2005
Does anyone else remember this story told from a schoolchild's point of view? Good stuff.
posted by Space Kitty at 10:00 AM on August 7, 2005
It's the same plot device as the Monty Python "funniest joke in the world" sketch, tarted up as science fiction.
posted by w0mbat at 11:14 AM on August 7, 2005
To all of you naysayers crying unoriginality, do I really need to point out that the idea of something that will kill you if you look at/hear it predates the Bible? Langford did an interesting job taking such an ancient theme and making it scientifically plausible.
posted by Faint of Butt at 11:27 AM on August 7, 2005
> Does anyone else remember this story told from a schoolchild's point of view? Good stuff.
That'd be Different Kinds of Darkness.
Langford's compilation of the same name is recommended.
posted by BobInce at 11:31 AM on August 7, 2005 [1 favorite]
No, it really isn't. The Python sketch is about taking an idiom literally. That's it. Die laughing, huh? What if that really happened? What if "knockin' 'em dead" meant that? It's a silly language riff, up there with having "costs an arm and a leg" taken literally, or having people literally glued to their seats in a theater.
BLIT and related stories don't come from a silly turn of phrase. They're based on the idea that human mentalities are really Turing machines, just very big and complex ones. If human minds are Turing machines, then that means that they're subject to the usual constraints: receiving certain inputs can crash a Turing machine, putting it into a halting state. The stories all use visual input because that has very high bandwidth, and at least arguably the retina is a part of the brain directly exposed to the input. It might be more interesting at this point to have an olfactory basilisk.
So funny as the sketch is, it ain't the same. One is a language goof, the other is an extended exploration of what it might mean if human consciousness functioned in a particular way.
posted by ROU_Xenophobe at 11:31 AM on August 7, 2005 [4 favorites]
No, it really isn't the same plot device as in the Python sketch, that is.
posted by ROU_Xenophobe at 11:33 AM on August 7, 2005
oh, *thank you* BobInce, that would have driven me crazy. I have to pick up Langford's compilation - I remember Different Kinds of Darkness from a Year's Best anthology.
posted by Space Kitty at 12:20 PM on August 7, 2005
No mention of Snow Crash in the comp.basilisk FAQ? OK, it didn't explicitly put people into a halting state, but the concept is very, very similar...
posted by Jon Mitchell at 12:25 PM on August 7, 2005
If human minds are Turing machines, then that means that they're subject to the usual constraints: receiving certain inputs can crash a Turing machine, putting it into a halting state.
Huh? I can draw up a Turing machine that never, ever "crashes" no matter what the input is. Here it is:
State 0: Read any symbol off the tape. Write the same symbol back to the tape. Go to State 0.
Generally, determining whether a Turing machine will halt on some (or any) input is the subject of a fundamental result in computer science.
Just because it's a Turing machine doesn't mean that there's an input that will halt it.
posted by tss at 1:22 PM on August 7, 2005
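The one-state machine tss describes can be written out as a short sketch (hypothetical illustrative code, not from the thread; a step cap stands in for "runs forever"):

```python
def copy_machine(tape, max_steps=1000):
    """tss's State 0 machine: read a symbol, write the same symbol
    back, move right, and return to State 0.

    It has no halting state at all, so no input tape can ever stop it.
    max_steps is only a simulation cap, not part of the machine.
    """
    tape = list(tape)
    pos = 0
    for _ in range(max_steps):
        if pos == len(tape):
            tape.append("_")    # extend the tape with a blank cell
        symbol = tape[pos]      # read
        tape[pos] = symbol      # write the same symbol back
        pos += 1                # move right, stay in State 0
    return "still running"     # hit the cap; the machine itself never halts

print(copy_machine("BASILISK"))  # still running
```

Whatever you put on the tape, the result is the same: there is simply no transition into a halt state for any input to trigger.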
Good story, and I liked that "A Colder War" thing even more. I'm a sucker for new takes on Lovecraftian cosmic horror.
As for the Basilisk stuff, Wikipedia has a good article that summarizes the motif of harmful sensation, including a history.
posted by Joakim Ziegler at 1:45 PM on August 7, 2005
tss, I meant a universal Turing machine, not a specific machine implementation of a specific algorithm.
posted by ROU_Xenophobe at 2:59 PM on August 7, 2005
I meant a universal Turing machine, not a specific machine implementation of a specific algorithm.
Earlier: They're based on the idea that human mentalities are really Turing machines, just very big and complex ones. If human minds are Turing machines
OK, so they're universal Turing machines, i.e. TMs capable of being programmed to act like any other TM.
In that case "big and complex" doesn't mean anything, since the physical manifestation of the TM is irrelevant to its analysis. Famously, people have made TMs (or finite approximations) out of Lincoln Logs. So we're supposing that our UTM is running on neural hardware. That's fine.
Now the question is: what is the program our UTM runs? Because it surely is running some program---it's not just sitting inert. We're not at the stage where the TM is waiting to be programmed by symbols on the tape. The go button was pressed and some of that programming has already happened.
Well, since it's running a program, that's just the same as asking this: what kind of non-universal Turing machine is our UTM brain acting like? In which case, yes, we're right back at square 1. Our mind program could be a program that can't ever halt. It may be that it can---it may be that you can even break into our UTM's particular "programming mode," and that's the basis of this story and Snow Crash. However, it's simply not the case that if something's a TM, even a UTM, that it will halt. If you can somehow demonstrate otherwise, not only will you get a Turing award, they'll rename it the ROU_Xenophobe award for future recipients.
posted by tss at 7:35 PM on August 7, 2005
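tss's point can be made concrete with a toy interpreter standing in for the UTM (names and programs here are purely illustrative, not from the stories): universality is a property of the machine, while halting is a property of whichever program it happens to be running.

```python
def interpret(program, max_steps=100):
    """Toy stand-in for a universal machine: it runs any program
    supplied as a dict mapping each state to the next state, with
    the special state "HALT" stopping execution.
    """
    state = "start"
    for _ in range(max_steps):
        if state == "HALT":
            return "halted"
        state = program[state]
    return "still running"  # simulation cap reached; it would loop forever

halting_program = {"start": "work", "work": "HALT"}
looping_program = {"start": "work", "work": "start"}  # no path to HALT

print(interpret(halting_program))  # halted
print(interpret(looping_program))  # still running
```

The same interpreter produces both behaviors, which is exactly the square-one problem: knowing the brain is (or hosts) a universal machine tells you nothing about whether some input can halt the particular program it runs.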
Wow, that whole infinityplus site is full of good stuff. And I can read it on my Blackberry ... goodbye productivity for the next couple of weeks.
posted by forforf at 7:36 PM on August 7, 2005
I'm happy to bear the correction, tss.
But my point was just that BLIT etc aren't from the same place as Python's killer joke. BLIT is science fiction in the purest sense.
posted by ROU_Xenophobe at 9:14 PM on August 7, 2005
As ROU_Xenophobe says this is pure SF, but it's a kind of rare compassionate SF.
It takes a simple idea (images that kill) and looks at what would happen if that idea worked.
Anyway, I thought it was worth linking for the way that it dealt with the idea of urban terror.
posted by thatwhichfalls at 10:21 PM on August 7, 2005
http://en.wikipedia.org/wiki/Motif_of_harmful_sensation
posted by Eideteker at 5:03 AM on August 8, 2005
Joakim Ziegler: Thanks very much for that Wikipedia article, an excellent example of Wikipedia doing very well something that other encyclopedias wouldn't think to do at all.
And thatwhichfalls, thanks for posting the story -- I enjoyed it.
posted by languagehat at 6:32 AM on August 8, 2005
After I read this I ended up on Charlie Stross's site, and started reading Concrete Jungle, which just won a Hugo. There's an eerie bit of crossover, for example:
ABSTRACT: Recent research in neuroanatomy has characterised the nature of the stellate ganglial networks responsible for gorgonism in patients with advanced astrocytoma affecting the cingulate gyrus. Tests combining the map of medusa layout with appropriate video preprocessing inputs have demonstrated the feasibility of mechanical induction of the medusa effect.
Progress in the emulation of dynamically reconfigurable hidden-layer neural networks using FPGA (fully programmable gate array) technology, combined with real-time digital video signal processing from binocular high-resolution video cameras, is likely within the next five years to allow us to download a medusa mode into suitably prepared surveillance CCTV cameras, allowing real-time digital video monitoring networks to achieve a true line-of-sight look-to-kill capability. Extensive safety protocols are discussed which must be implemented before this technology can be deployed nationally, in order to minimize the risk of misactivation.
Projected deployment of CCTV monitoring in public places is estimated to result in over one million cameras in situ in British mainland cities by 1999. Coverage will be complete by 2004-06. Anticipated developments in internetworking and improvements in online computing bandwidth suggest for the first time the capacity of achieving a total coverage defense-in-depth against any conceivable insurgency. The implications of this project are discussed, along with its possible efficacy in mitigating the consequences of CASE NIGHTMARE GREEN in September 2007.
posted by 31d1 at 6:45 AM on August 8, 2005