Friday, May 19, 2006

In megatexels, report cards, in spoke wheels...

I put Ben's mind at ease the other day, assuring him that I don't think everyone is precisely an untrustworthy bastard; that is, to say so was an error of tone, not of content. But then I casually mentioned or implied that free will was a silly concept, and that didn't go over so well either. Oh yes: I said I am a determinist, but with footnotes.*

It's understandable that people, even some philosophers -- especially philosophers -- want to believe in free will. I don't take a dogmatic position on the metaphysics of the thing, but I think it is plainly a mere word game, a symbol pointing to a concept that is ineffable and therefore useless. "God" is another example of this. However, unlike "God," which means nothing because it means pretty much everything (spirit, nature, Zeus, God the Father who called forth Abraham, Dionysus, chi), "free will" does mean something. It is, I believe, a synonym for a chaotic process: a process which appears random, unordered, indescribable, and unknowable by science. Which is to say, a mystical process. But it isn't; it's just complex.
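
If "chaotic process" sounds hand-wavy, here's a little Python sketch of one (my own toy illustration, cooked up for this post): the logistic map. Every value is fixed exactly by the rule and the starting point, yet nudge the start in the tenth decimal place and the two runs soon have nothing to do with each other.

    # The logistic map: x_next = r * x * (1 - x). Completely
    # deterministic -- same start, same sequence, every time --
    # but at r = 4 it is violently sensitive to initial conditions.

    def trajectory(x, r=4.0, steps=40):
        out = [x]
        for _ in range(steps):
            x = r * x * (1 - x)
            out.append(x)
        return out

    a = trajectory(0.2)
    b = trajectory(0.2 + 1e-10)  # nudged in the tenth decimal place

    for n in (0, 10, 20, 30, 40):
        print(n, round(a[n], 6), round(b[n], 6))
    # By step 30 or so the two runs bear no resemblance to each
    # other, and not one random number was drawn along the way.

Appears random, unordered, unpredictable. Is none of those things underneath.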

This is a sticky problem for nontheistic, nondeterministic philosophers. If you don't believe the human nervous system is a complex software system running on wetware -- a super-duper version of your desktop computer, in other words -- then you believe there is something beyond the physical. This is not unreasonable, and not irrational, but it is mystical. It is properly called faith, not science, because the chaotic explanation is simpler, is sufficient, is complete -- it is the rational explanation, even though it may not ultimately prove correct. When we found out Newton's equations were wrong, it was a bit of a blow to the perceived perfection of Old Ike's Frame of the System of the World, but you were still better off believing in Newton all along than believing exclusively in some primitive shamanistic conception of causal mechanics.
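
For the curious, here's roughly how "wrong" Newton actually is, in Python (my choice of examples; the formulas are the standard Newtonian and relativistic kinetic energies, and the expm1/log1p business is only there to keep the tiny corrections from drowning in floating-point roundoff):

    import math

    C = 299_792_458.0  # speed of light, m/s

    def newton_error(v):
        # Fractional error of Newtonian kinetic energy (1/2 m v^2)
        # relative to the relativistic value ((gamma - 1) m c^2).
        gm1 = math.expm1(-0.5 * math.log1p(-(v / C) ** 2))  # gamma - 1
        return 1.0 - 0.5 * (v / C) ** 2 / gm1

    for label, v in [("highway car, 30 m/s", 30.0),
                     ("escape velocity, 11 km/s", 1.12e4),
                     ("half the speed of light", 0.5 * C)]:
        print(f"{label}: Newton is off by {newton_error(v):.1e}")
    # The car: off by about one part in 10^14. Escape velocity:
    # about one part in a billion. Half light speed: off by
    # roughly 20%, and now you genuinely need Einstein.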

I'm rambling. I'm going to try to scurry back to the reservation and take this wherever it was going. And I will do it by way of an apt analogy.

Even someone trained in computer science and engineering has trouble conceiving of the possible states of a computer. Most people never try; perhaps you never have. Consider it now. Your computer has exactly one state† that represents "playing World of Warcraft, hooked up to server X, swinging my sword at player Y (whose handle is EaterOfZebras) and player Z (who is an elf with so many hit points, and widgets A and B in his inventory, and whose handle is NilesCrane)...." There is one state -- this and that bit set in memory, this and that bit flipped on your hard drive, this and that register holding such and such a value in your CPU -- that corresponds to that one freeze-frame moment in that one game.
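
A toy version of the idea, in Python (the fields and numbers are invented, nothing like how WoW really lays out its data): one moment of game state is literally one pattern of bits, and flipping any single bit gives you a different moment of a different game.

    import pickle
    from dataclasses import dataclass

    # A stand-in for one sliver of game state. The real thing has
    # billions of such fields, but the principle is the same.

    @dataclass
    class Player:
        handle: str
        hit_points: int
        inventory: tuple

    z = Player(handle="NilesCrane", hit_points=112,
               inventory=("widget A", "widget B"))

    # Serialize the state and inspect the actual bits.
    bits = "".join(f"{byte:08b}" for byte in pickle.dumps(z))
    print(len(bits), "bits, starting:", bits[:48])
    # Change the hit points, rename the elf, or flip any one of
    # these bits, and you have a different state entirely.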

To the non-technical user, it is simply magic. To the technical or reflective user, it is simply complex: hard to get one's head around, but uncontroversially deterministic.

Now, we built the computers that run computer games and web browsers and stuff. What would it be like to deal with a computer way more complex than a human being has ever built, or will be able to build for at least several decades?

It'd look a lot like the human central nervous system. There is no reason to suppose that consciousness isn't simply an emergent property of a sufficiently complex network with enough nodes and enough interconnections; indeed this seems fairly likely. The process by which such a software system selects a "choice" among alternatives would appear random and unpredictable to us -- or in any case entirely opaque. That doesn't mean there's a mystical property called "free will" saving it from determinism. It just means the system is nonlinear, and the computer is really complex.
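
To see how a deterministic "choice" can still look opaque, here's a toy chooser in Python (the weights are numbers I pulled out of thin air; a real nervous system has trillions of connections, tuned by a lifetime of experience):

    OPTIONS = ["fight", "flee", "negotiate"]

    # Arbitrary fixed weights: one row of scores per option.
    WEIGHTS = [
        [0.91, -1.30, 0.44, 2.10],   # fight
        [-0.60, 0.72, 1.95, -0.30],  # flee
        [1.10, 0.05, -0.80, 0.66],   # negotiate
    ]

    def choose(inputs):
        # Score each option as a weighted sum of the inputs and
        # pick the max. No randomness anywhere in here.
        scores = [sum(w * x for w, x in zip(row, inputs))
                  for row in WEIGHTS]
        return OPTIONS[scores.index(max(scores))]

    situation = [0.2, -0.7, 1.4, 0.9]
    print(choose(situation))  # the same answer, every single run
    # Fully determined by inputs and weights, yet staring at the
    # weights tells you almost nothing about *why* this situation
    # yields this choice. Scale the weight count up by twelve
    # orders of magnitude and "opaque" is an understatement.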

As Ben pointed out (and thank goddess this is plain as day to him, since most philosophy majors somehow can't get it through their Chomsky-addled skulls because they're too busy suggesting gravity can be defeated by jumping off the Sears Tower; not that I am bitter), this needn't (and shouldn't) have any practical effect on one's treatment of people as sentient, responsible individuals with moral agency.

Man, I need a cold beer.


* Saying things like this is a great way to make philosophers apoplectic. The torture may be supplemented with irony via a strategically placed footnote; see?

† Actually a family of closely related states, but that doesn't affect the example, so I'm keeping it simple.
