Thursday, March 27, 2008

Your Brain is Leaking

This punch-happy little dude has been all over the net for the past week or so: easily the world's coolest crustacean even before then. After all, how many lifeforms of any stripe can bash their furious little claws through the water so fast (accelerating at over 10,000 g!) that the resulting cavitation bubbles heat up to several thousand degrees Kelvin? If their ferocious little chelipeds don't take you out, the shockwave alone will shatter you (well, if you're a piece of mantis-shrimp prey, at least).

The reason for their recent fame, though, is this paper in Current Biology, reporting that — alone of all the known species on the planet — these guys can see circularly polarised light. And that's just the latest trick of many. These guys see ultraviolet. They see infrared. They can distinguish ten times as many visible-light colors as we can (still only 100,000 — which you'd think would at least shut up those Saganesque idiots from Future Shop who keep blathering about the millions and millions of colors their monitors can supposedly reproduce). Each individual eye has independent trinocular vision. Mantis shrimp eyes are way more sophisticated than any arthropod eye has any right to be.

But what really caught my attention was a line in this Wired article (thanks to Enoch Cheng for the pointer):
"One idea is that the more complicated your sensory structure is, the simpler your brain can be... If you can deal with analysis at the receptor level, you don't have to deal with that in the brain itself."
Which is almost as cool as it is wrong. Cool because it evokes the image of alien creatures with simple or nonexistent brains which nonetheless act intelligently (yes, I'm thinking scramblers), and because these little crustaceans aren't even unique in that regard. Octopi are no slouches in the smarts department either — they're problem solvers and notorious grudge-holders — and yet half of their nervous systems are given over to manual dexterity. Octopi have individual control over each sucker of each arm. They can pass a pebble, sucker-to-sucker, from arm-tip to arm-tip. Yet their brains, while large by invertebrate standards, are still pretty small. How much octopus intelligence is embedded in the arms?

So yes, a cool thought. But wrong, I think: because what is all that processing circuitry in the mantis shrimp's eyes if not part of the brain itself? Our own retinas are nothing more than bits of brain that leaked across the back of the eyeball — and if the pattern-matching that takes place in our visual cortices happens further upstream in another species, well, it's still all part of the same computer, right? The only difference is that the modules are bundled differently.

But then this artsy friend points out the obvious analogy with motherboards and buses, and how integrating two components improves efficiency because you've reduced the signal transit time. Which makes me think about the "functional clusters" supposedly so intrinsic to our own conscious experience, and the possibility that the isolation of various brain modules might be in some way responsible for the hyperperformance of savants1.

So pull the modules apart, the cables between stretching like taffy — how much distance before you're not dealing with one brain any more, but two? Those old split-brain experiments, the alien-hand stuff — that was the extreme, that was total disconnection. But are we talking about a gradient or a step function here? How much latency does it take to turn me into we, and is there anything mushy in between?
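(For anyone who'd rather fiddle than philosophize, here's a toy sketch of the question: two phase oscillators standing in for brain modules, each seeing only the other's delayed past. Every parameter is invented, the "modules" are cartoons, and phase coherence is a crude stand-in for acting like a single mind; but it gives the gradient-versus-step question some actual dials to turn.

# Toy sketch: two delay-coupled phase oscillators as stand-in "brain modules".
# All numbers are invented for illustration; "coherence" is just the
# two-oscillator Kuramoto order parameter, not a theory of consciousness.
import math
from collections import deque

def coherence(delay_steps, k=0.8, dt=0.01, steps=40000):
    w1, w2 = 1.0, 1.1                   # slightly mismatched natural frequencies
    h1 = deque([0.0] * (delay_steps + 1), maxlen=delay_steps + 1)
    h2 = deque([0.5] * (delay_steps + 1), maxlen=delay_steps + 1)
    total, n = 0.0, 0
    for t in range(steps):
        p1, p2 = h1[-1], h2[-1]         # current phases
        d1, d2 = h1[0], h2[0]           # each module sees only the other's past
        h1.append(p1 + dt * (w1 + k * math.sin(d2 - p1)))
        h2.append(p2 + dt * (w2 + k * math.sin(d1 - p2)))
        if t >= steps // 2:             # skip the start-up transient
            total += abs(math.cos((h1[-1] - h2[-1]) / 2))
            n += 1
    return total / n

for d in (0, 200, 1000, 4000, 16000):   # latency, in integration steps
    print(f"delay {d:5d}: phase coherence {coherence(d):.3f}")

Whether the coherence slides or snaps as you crank the delay depends on which knobs you twiddle, which is sort of the point.)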

Are stomatopod eyes conscious, in some sense? Is my stomach?


1 I would have put a link to the relevant article here, but the incompetent code over at The Economist's website keeps refusing to open up its online back-issue pdfs until I sign in, even though I already have. Three times now. Anyway, the reference is: Anonymous, 2004. Autism: making the connection. The Economist, 372(8387): 66.


6 Comments:

Anonymous Anonymous said...

It's just the natural prejudice of the highly encephalized, I guess?

March 27, 2008 at 1:49 PM  
Anonymous Anonymous said...

Consider:

In some parallel universe, a cephalopod sci-fi writer is blogging about something he had been feeling in a science journal, a newly-discovered species of primate, and he is speculating that even though mammalia have the deficit of having no real brain in their limbs, that is no reason to think they aren't capable of all sorts of intellectual pursuits.

Chimps, for instance, are well-documented grudge-holders, he writes, typing it out on a feel pad of enormous complexity, his sucker manipulating the keys along three axes at once.

Don't you imagine?

March 27, 2008 at 1:57 PM  
Blogger John Henning said...

But then this artsy friend points out the obvious analogy with motherboards and buses, and how integrating two components improves efficiency because you've reduced the signal transit time. Which makes me think about the "functional clusters" supposedly so intrinsic to our own conscious experience, and the possibility that the isolation of various brain modules might be in some way responsible for the hyperperformance of savants1.

So pull the modules apart, the cables between stretching like taffy — how much distance before you're not dealing with one brain any more, but two? Those old split-brain experiments, the alien-hand stuff — that was the extreme, that was total disconnection. But are we talking about a gradient or a step function here? How much latency does it take to turn me into we, and is there anything mushy in between?


That's some extremely interesting and grandiose thinking. I would say that there is something to it, as far as separating complex function to the point where you're dealing with two brains. Then the question becomes: would separate personalities emerge?

Another interesting perspective would be to take Professor Moravec's idea for transferring human intelligence from the brain bioware to electronic hardware. The way to do this, he proposed (at least in Ed Regis' great little book Great Mambo Chicken and the Transhuman Condition) would be to create an electronic brain and then hook it into the human brain component by component. First, you'd simply plug one area of the brain (a lobe or simply a few neurons) into the corresponding portion of the hardware replacement. You'd fiddle with it until the hardware has perfectly replaced the function of the organic brain based on the responses of the subject/patient who's being transferred. Then you move on to the next portion or lobe or part of the brain and replace it. By the end of the procedure, the person's organic brain will have been removed and they'd be "thinking" entirely with the electronic brain. Then, through this gradual process it would seem that you've transferred your consciousness from the organic to an electronic process.

Of course, there are other options in here as well. First, obviously, at some point in this procedure, you'd be thinking with half of your original brain in tandem with half of the computer replacement. The computer parts don't have to be inserted into your head physically replacing the bypassed fleshy parts. You could be thinking with the brain in your head and a system halfway across the room (or anywhere with a good connection). Also, it seems like electronic enhancements, a "back-up" brain, could be grafted onto you using this process as 'addition' instead of replacement.
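You could sketch the whole procedure as a loop, by the way. Everything below is a toy stand-in: the "modules" are just arithmetic functions, and the "prosthetic" is a perfect functional copy, which is the part Moravec gets to assume for free.

import random

# Toy version of Moravec's incremental transfer: the "brain" is a pipeline of
# module functions; we swap them out one at a time, verifying behavior after
# each replacement. Modules here are arithmetic stand-ins for lobes.
random.seed(1)
organic = [(lambda x, a=a, b=b: (a * x + b) % 97)
           for a, b in [(3, 1), (5, 2), (7, 3), (11, 4)]]

def run(brain, x):
    for module in brain:
        x = module(x)
    return x

hybrid = list(organic)                    # start out fully organic
probes = [random.randrange(97) for _ in range(50)]

for i, lobe in enumerate(organic):
    prosthetic = (lambda f: (lambda x: f(x)))(lobe)   # "hardware" behaving identically
    hybrid[i] = prosthetic                # swap it in
    assert all(run(hybrid, p) == run(organic, p) for p in probes)
    print(f"module {i} replaced; behavior preserved on all {len(probes)} probes")

print("transfer complete: same outputs, different substrate")

At no point is there a seam where "you" stops and the copy starts, which is Moravec's whole point.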

March 27, 2008 at 3:37 PM  
Blogger SpeakerToManagers said...

At first blush I'd guess that it's less a matter of distance or latency than of the amount of compression and/or translation of the signal in the channel between the two modules. If the modules have to transcode their internal symbols into some sort of communication code required by the channel between them they'll act less like a single system and more like two loosely-coupled ones.

And the looseness of the coupling would be to some extent dependent on how lossy that transcoding process was.
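To put a crude number on it: let one module track another through a channel that quantizes everything down to a fixed bit depth. The figures below are arbitrary, but they show the two modules drifting apart as the transcoding gets lossier.

import random

# Toy lossy coupling: module B tracks module A's state through a channel
# that quantizes to a fixed number of bits. Arbitrary numbers; illustration only.
random.seed(42)

def mean_disagreement(bits, steps=20000):
    levels = 2 ** bits
    a, total = 0.0, 0.0
    for _ in range(steps):
        a = max(-1.0, min(1.0, a + random.gauss(0, 0.1)))  # module A wanders
        code = round((a + 1) / 2 * (levels - 1))           # transcode for the channel
        b = code / (levels - 1) * 2 - 1                    # B's reconstruction of A
        total += abs(a - b)
    return total / steps

for bits in (1, 2, 4, 8, 16):
    print(f"{bits:2d}-bit channel: mean disagreement {mean_disagreement(bits):.5f}")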

March 27, 2008 at 3:37 PM  
Blogger Peter Watts said...

John Henning said...

Another interesting perspective would be to take Professor Moravec's idea for transferring human intelligence from the brain bioware to electronic hardware. The way to do this, he proposed (at least in Ed Regis' great little book Great Mambo Chicken and the Transhuman Condition) would be to create an electronic brain and then hook it into the human brain component by component. First, you'd simply plug one area of the brain (a lobe or simply a few neurons) into the corresponding portion of the hardware replacement. You'd fiddle with it until the hardware has perfectly replaced the function of the organic brain based on the responses of the subject/patient who's being transferred. Then you move on to the next portion or lobe or part of the brain and replace it. By the end of the procedure, the person's organic brain will have been removed and they'd be "thinking" entirely with the electronic brain. Then, through this gradual process it would seem that you've transferred your consciousness from the organic to an electronic process.

The variant on this is that the artificial brain is already tweaked and configured to run identically to yours without any kind of piecemeal incremental replacement. So you wake up one morning and find this android looming over you with a gun, and it says, "I'm going to kill you now, but don't worry, because I have all your thoughts and memories right down to the last synapse, so you won't die at all, because you're already living on in me!"

First question: do you let him shoot you, or do you try to survive?

Second question: If you don't let him shoot you, then why not? How is this situation any different, logically, from an incremental replacement which you would be presumably copacetic with? Why is being replaced all-at-once worse than being taken over a little at a time?


SpeakerToManagers said...

At first blush I'd guess that it's less a matter of distance or latency than of the amount of compression and/or translation of the signal in the channel between the two modules. If the modules have to transcode their internal symbols into some sort of communication code required by the channel between them they'll act less like a single system and more like two loosely-coupled ones.

Interesting. Here's another thought: the little beggar has two eyes. They're not directly connected, but they're mirror-symmetric anatomically and dealing with almost exactly the same input. Are they "cognitively entangled"?

March 31, 2008 at 12:21 AM  
Blogger John Henning said...

Second question: If you don't let him shoot you, then why not? How is this situation any different, logically, from an incremental replacement which you would be presumably copacetic with? Why is being replaced all-at-once worse than being taken over a little at a time?

The question does touch upon whether there is anything inherently "you" in the organic matter of your brain. Moravec's point was that if you created an electronic copy of a person's brain, then you would just have a copy, not the person. However, if you prostheticized the brain bit by bit, discarding each piece as it was successfully replaced and in use by the person, then, in the end, you've simply upgraded the hardware on which the "personality" was running. You haven't actually copied it.

March 31, 2008 at 6:50 PM  