Tuesday, August 12, 2008

On artificial intelligence

I have been, for whatever reason, thinking a bit about artificial intelligence (AI). More precisely, I have been thinking about what is generally called "strong AI," i.e. artificial intelligence that matches or supersedes the human intellect, the kind you see in science fiction books and movies. My thinking has led me to believe that such a thing will never be.

First, the intellect is, for lack of a better term, a substantial or essential power. Its origin is in the substantial form, the essence, the nature of the being that possesses it. No machine possesses a substantial form. Rather, any machine qua machine possesses only an accidental form that is brought about through the organization of parts. These parts may be composed of a substance or substances, but the machine itself exists only insofar as said substance or substances are given certain shapes and arranged within certain relations. Since a machine possesses no substantial form, it cannot possess, qua machine, any power that has its origin in substantial form, and thus it cannot possess intellect.

Second, the intellect is a purely immaterial, spiritual power. It does not depend on matter for its operation, either the matter of the knower or the matter of the thing known. While in some intellectual beings, viz. man, matter may be required to provide the intellect with the forms it uses in its operation through the external and internal senses, this is accidental to the operation of the intellect qua intellect. Now, man does not have the power to create immaterial being. As such, man does not have the power to create intellect.

Third, is anyone familiar with the "Chinese room" argument of analytic philosopher John Searle? I have only a slight familiarity with it, but I believe it goes something like this: Take a man who understands no Chinese and put him in a room filled with data on the rules of the language, such as grammar, structure, likely replies to certain inquiries &c. Have a Chinese speaker try to communicate with the man through writing. Given enough time and enough data on the language, the man will be able to respond to the Chinese speaker in a way that is both grammatically correct and makes sense to the Chinese speaker. The Chinese speaker will believe he is having a meaningful conversation with the man in the room, but the man in the room will have no idea as to what the conversation is about. I find this argument interesting because it demonstrates the difference between manipulating symbols and understanding them.
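The rule-following at the heart of the thought experiment can be sketched in a few lines of code. The snippet below is my own hypothetical illustration, not anything from Searle: a program that matches incoming Chinese strings against a rule book and copies back the prescribed reply, without any grasp of what either string means.

```python
# A hypothetical sketch of the "Chinese room": a responder that maps
# input symbols to output symbols purely by rule, with no understanding.
RULES = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "你叫什么名字？": "我没有名字。",    # "What is your name?" -> "I have no name."
}

def room_reply(message: str) -> str:
    """Return the rule-book reply for a message. The 'room' never
    understands either string; it only matches and copies symbols."""
    return RULES.get(message, "请再说一遍。")  # default: "Please say that again."

print(room_reply("你好吗？"))  # prints 我很好，谢谢。
```

From the outside, the exchange looks like conversation; inside, there is nothing but pattern-matching on uninterpreted marks, which is precisely the distinction the argument trades on.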

St. Thomas, if I am not mistaken, held that words carry with them the form of things. The origin of words, whether written or spoken, is in the internal word, the knowledge of a thing possessed by the soul. The use of words is not just the manipulation of symbols, but is instead the transmission of intelligibility and form. The word directs one beyond itself to the thing in itself as it can be known by the soul.

There is thus a difference between instinctive, stimulus-response, or rule-based communication and true intellectual communication. The former has its origin in some set of built-in rules that determine the response to certain stimuli. The latter goes beyond the perception of the stimuli and the interaction between the one who produced the stimuli and the one who responds to them, referencing a third being whose intelligibility and form the words carry and to whom the writer or speaker of the words directs the intellect of the one who receives them. The former happens only on the level of the sensible, while the latter transcends the sensible, using it to direct the light of reason to investigate some communicated piece of reality.

Anyway, those are just some random thoughts I've had in the last few days. Any critiques, discussion or interesting references would be much appreciated.


Anonymous said...

I believe your idea of 'intellect' is encompassing far too great an expanse (delving even into a spiritual essence) that seems altogether incomparable to the concept of "intellect" as envisioned by certain scientists in the matter.

As regards the latter, I believe the current yardstick as to whether such an 'intellect' even exists in an AI is whether it is capable of creativity on its part (a creativity that is exhibited in human activities such as writing, etc.).

However, as to the former, whether this creativity is indicative of a 'soul'?

This is, of course, debatable.

- e.

brendon said...

I believe your idea of 'intellect' is encompassing far too great an expanse (delving even into a spiritual essence) that seems altogether incomparable to the concept of "intellect" as envisioned by certain scientists in the matter.

Of course it encompasses more than what modern scientists envision. Modern man, since Descartes at least, has had far too narrow a view of what intellect actually is. It has been stripped down to formal logic and mathematics, manipulation rather than comprehension. That is the heart of the problem.

Intellect is the acquisition of the universal natures of particular things. It is the possession of the form of something within the soul. It both requires a soul and is immaterial, i.e. spiritual, by definition.

What I am saying is that our science fiction fantasies of robotic people and living software are just that, fantasies. Machines, by the very fact that they are machines, cannot, by definition, be people. Things that exist in a purely material way, such as being encoded on magnetic platters, can never possess the spiritual faculties needed for true intellection.

Anonymous said...

Curious, are you operating from this bit of the Summa Contra Gentiles?

5. The act of understanding cannot be the act of anything corporeal. But it is an act of the soul. Therefore the intellectual soul at least is not a body.

- e.

brendon said...

That is one of the places St. Thomas discusses the immaterial nature of the intellect. I was more directly thinking of Summa theologiae, I, q. 75, a. 2c.

Will Duquette said...

No machine possesses a substantial form. Rather, any machine qua machine possesses only an accidental form that is brought about through the organization of parts.

OK, now if this is true, substantial form doesn't mean what I thought it did.

Consider an axe, made of an axe head and an axe handle. Put 'em together, you get an axe. I understood that said axe was a substance, with the form of an axe, and hence that its substantial form was that of an axe. It doesn't have a soul, whether vegetative, sensitive, or rational, but that doesn't mean that it isn't a substance.

Now, if the axe is a substance, in fact, then why isn't my laptop? It has more pieces, but I don't see why that would make a difference.

Similarly, a dog has the substantial form of a dog (doesn't it?); yet it has no intellect either.

I'm very new at this, so it's likely that I'm confused.

brendon said...

An ax would be an accidental form. The substantial forms involved would be some type of metal for the ax head and some type of wood for the handle.

St. Thomas discusses this in his De principiis naturae. His example is, I believe, a bronze statue or bust. If I take bronze and shape it into the likeness of Caesar, the bronze is still bronze. It is not Caesar substantially, but only has the shape of Caesar. Since shape is a sub-category of the accidental category quality, one could only say that the bronze has the accidental form of Caesar.

This holds true for more complex artifacts made from more than one material. Various metals, woods, rubbers, plastics &c. are shaped in certain ways and arranged into configurations that enable them to be used to perform certain tasks.

I would say that my computer, for example, exists under an accidental form that falls under the category of relation. If I took my computer apart and piled the pieces up I would no longer have a computer, only the parts of a computer. The computer only exists when the parts are configured to relate to each other in a certain way. Similarly, if I took each component and broke it down into its constituent components--metal, plastic &c.--I would not even have computer parts anymore. Rather, I would have substances that could be shaped in certain ways and put into relationships with each other so as to make computer components.

Finally, it is true that dogs have substantial forms but not intellects. It is not that the existence of a substantial form is sufficient for the existence of an intellect, but rather that the existence of an intellect necessarily requires a certain kind of substantial form.

Will Duquette said...

Aha. OK, that makes sense; I'd been beginning to wonder how one dealt with composite objects (like computers) and that answers that question. And a substantial form is necessary but not sufficient; that makes sense too.

And that implies the answer to the old question about replacing first the axe handle and then the axe head--is it the same axe? It wasn't a substance to begin with, it was never a single object, it was always a composite, and as soon as any part was replaced it was really a different axe.