The Turing Test: am I chatting online with another human or with a piece of software programmed to simulate human answers?

The software "passes the Turing Test" if it's taken to be human about as often as a real human is.

To play with a sample program of this kind (ELIZA, the computer therapist), go here.

Goodman says passing the Turing Test means nothing, and ELIZA proves it: because we have a natural tendency to anthropomorphize non-human things, even a very simple program can be mistaken for a person.

Godwin replies that ELIZA is limited in the topics she can discuss, but AL isn’t. AL thus meets one of Warren’s criteria of personhood: "the capacity to communicate, by whatever means, messages of an indefinite variety of types". Obviously AL also meets Warren’s criterion of reasoning: "the developed capacity to solve new and relatively complex problems".

Goodman: if a computer passes the Turing Test, it passes ONLY because it's following programmed instructions. A computer simply embodies a formal system for exchanging one set of signs for another.
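Goodman's "formal system of exchanging sets of signs for others" can be made concrete with a minimal ELIZA-style sketch. This is not Weizenbaum's actual ELIZA script; the patterns and responses below are invented for illustration. Each rule mechanically swaps one set of signs for another, with no understanding anywhere in the process:

```python
import re

# Hypothetical pattern/response rules, for illustration only.
# Each rule: (pattern to recognize, template for the reply).
RULES = [
    (re.compile(r"i feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r".*"), "Please tell me more."),  # catch-all fallback
]

def respond(sentence: str) -> str:
    """Exchange input signs for output signs by rule -- nothing more."""
    for pattern, template in RULES:
        match = pattern.match(sentence.strip())
        if match:
            return template.format(*match.groups())
    return "Please tell me more."

print(respond("I feel lonely"))  # -> Why do you feel lonely?
print(respond("hello"))          # -> Please tell me more.
```

The program "answers" without representing anything: it never parses meaning, only reuses the signs the human typed — which is exactly why Goodman thinks a pass proves nothing.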

Godwin: that's NOT all. A computer can also be programmed to learn to add rules to, or subtract rules from, its rule set, or even to learn to add and subtract the rules for adding and subtracting rules. Computers work hierarchically, just like brains. Computers are even capable of creativity; they can, for example, generate new proofs of mathematical theorems. Their output isn't entirely predictable.

Goodman: it might LOOK LIKE the computer is learning or being creative, but there's nobody home inside. If the computer passes the Turing Test, "the pass is a fake because the computer can't really mean or think or intend the sense of the symbols it prints out." (34)

Now we (temporarily) leave the discussion of AL. Commissioner Hershell says she understands Goodman's point about AL: AL's "thoughts" and "intentions" aren't *real*, because they are all "programmed in". Hershell thus can't see how AL could have real "inner mental experience" (35). AL can pass the Turing Test, but he seems not to be person-like because he lacks genuine intentionality: his intentionality is all "derived" or "second-order", since what things mean to AL is always mediated by his programming. Washoe Delta, on the other hand, can't pass the Turing Test (because she can't speak like a human), but she seems to have genuine mental experiences. The chimp seems to have real or "original" intentionality, which Goodman said *was* the significant characteristic of personhood.

Godwin agrees about Washoe Delta's intentionality: when Washoe Delta asks for a banana, she *means* that she wants a banana, and she's expressing her actual *desire* for a banana (37). And she could pass the *right kind* of Turing Test: one designed to challenge our ability to distinguish the responses of very young children from those of chimps. And if she passed that Turing Test, shouldn't she have just the same degree of personhood as a very young child?

This afternoon concludes with the commissioners wondering if excluding Washoe Delta is thus merely arbitrary speciesism.