The Mind is Not a Computer

There’s an essay by Ted Chiang (unfortunately not freely available on the internet) where he talks about folk biology: ideas about biology that are believed because they’re ‘just common sense’, whether they’re true or not. One example is the idea that you are what you eat, and that different kinds of meat will encourage different characteristics, with lion meat making you stronger and more aggressive, cow meat making you slower and more bovine, and so on. Obviously these beliefs are wrong – there’s no evidence for them, and besides their compatibility with lazy narrative thinking, no good reason to hold them. But folk biology that we know is wrong isn’t very interesting. What’s an example of folk biology that Chiang thinks we still hold onto? The idea, he writes, that the human brain is fundamentally like a computer.

Once you’re looking for it, this idea is everywhere. People talk about ‘smarter’ and ‘stupider’ computers, use words like ‘forget’ and ‘remember’ to describe computer memory, and so on. Even worse, we imagine our own brains as having finite memory resources (a belief that predated computers but was certainly encouraged by them), fixed processing power, and a decision-making faculty very much like a computer algorithm.

If you want the ur-example of this error, the great bulbous thing itself, take a look at this article from LessWrong (of course by Eliezer Yudkowsky). After a discussion of object-classification and neural nets, it ends up with this:

Before you can question your intuitions, you have to realize that what your mind’s eye is looking at is an intuition—some cognitive algorithm, as seen from the inside—rather than a direct perception of the Way Things Really Are.

People cling to their intuitions, I think, not so much because they believe their cognitive algorithms are perfectly reliable, but because they can’t see their intuitions as the way their cognitive algorithms happen to look from the inside.

And so everything you try to say about how the native cognitive algorithm goes astray, ends up being contrasted to their direct perception of the Way Things Really Are—and discarded as obviously wrong.

This is the least jargon-y section of the article. Strip away the talk of “native cognitive algorithms” and what you end up with is “people cling to their intuitions because their intuitions seem to them to be direct perception of the truth.” But if ‘cognitive algorithm’ just means intuition, that amounts to “people believe their cognitive algorithms are perfectly reliable” – the very claim the passage begins by denying. It is bad writing, and it is bad writing because it clings to the computer-esque framework of LessWrong idioms.

What if the mind does not, in fact, run algorithms? You wouldn’t say that a stone rolling down a hill “runs algorithms” to determine how it bounces, or that a sunset runs algorithms to determine the arrangement of colour in the sky. For all its self-proclaimed scientific rigour, the LessWrong picture of the mind is medieval: the world comes in through the eyes and ears, the little homunculus inside makes a decision, and the body moves.
