Your Computer Can’t Remember a Darned Thing

Some years ago my family and I were vacationing in San Antonio. We bought my son a toy musket in the Alamo gift shop. At the airport for the return home, my wife suggested that we inform the TSA that we were carrying a musket in our checked baggage. I felt it was unnecessary: it wasn’t a gun, it was a toy, and there are probably hundreds of toy guns brought home from Texas vacations every day in checked baggage. My wife insisted, and I quipped, glancing at my daughter who was carrying her teddy bear, that we should also inform the TSA that we were bringing a bear on board the plane (my little joke didn’t resolve the issue, as you might imagine).

I have a similar feeling when I contemplate the distressingly common assertion that computers store and process memory. We often refer to “computer memory.” But is “computer memory” real memory? What could “computer memory” possibly mean?

There are three things we might mean by “computer memory.” One is that the computer actually remembers something. Another is that the computer stores memories. And another is that the computer stores representations of memories.

Surely the computer itself doesn’t actually remember anything. The whole Strong Artificial Intelligence project is a comical delusion, or it would be comical if it weren’t taken so seriously by people who ought to know better. Memory is a psychological ability. Memory is retained knowledge. Knowledge is a set of true propositions that one holds. Propositions are the contents of beliefs, the sorts of things that can be either true or false. Memory and knowledge and propositions and beliefs are psychological things. Computers are electromechanical devices, and electromechanical devices can’t do psychological things, any more than people can boot up or freeze or turn their own screen saver on when they sleep. It’s a category error. We attribute memory to a computer merely as a metaphor; the attribution of memory, in a psychological sense, to a computer is nonsense, a bizarre confusion of metaphor with reality. A computer no more has memory, in the sense of remembering things, than you can catch a train at your computer terminal.

So a computer itself doesn’t have memories, in the sense of remembering anything. But can a computer store memories? Of course not. Memories are not the kind of thing to which the verb “store” sensibly applies. Nothing, neither we nor a computer, can store a psychological thing. “I can’t store any more memories in my psychology, because I’m already full of propositions” doesn’t even make sense. I can have memories, I can like or dislike memories, I can tell other people about my memories, but I can’t store memories. And of course, neither can my computer store memories. My computer can store data, physical patterns of electric charge or magnetization on a chip or a disk. But memories can’t be stored on computers, because memories can’t be stored at all. The assertion is nonsense.

Now, what is true is that representations of memories can be stored on a computer. I can enter data into my computer that is a representation of a memory that I have — the time and place of an appointment, or a catchy phrase I remember, or a photo of my uncle Fred whom I remember fondly. I have these memories, and my computer has representations of these memories. My computer doesn’t have any memories itself, nor can it store memories, which can’t be stored anyway.
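To make this concrete, here is a minimal sketch in Python (a hypothetical illustration with a made-up file name, not anything from the argument itself): all the computer ends up holding is an encoding of a note about the appointment, a string of bytes, and nothing more.

    # Hypothetical illustration: storing a representation of something I remember.
    # The computer ends up holding bytes; the remembering is done by me, not by it.

    # A note about an appointment I remember making.
    note = "Dentist appointment, Tuesday at 3:00 pm"

    # Encode the note as bytes and write it to a file (a made-up file name).
    with open("appointment.txt", "wb") as f:
        f.write(note.encode("utf-8"))

    # Reading it back retrieves the same bytes. Nothing here remembers anything;
    # a reader who never made the appointment sees only a string of characters.
    with open("appointment.txt", "rb") as f:
        print(f.read().decode("utf-8"))

The file is to the appointment what a photo is to Uncle Fred: a representation that can rekindle a memory in someone who already has one, but not a memory itself.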

It’s the same when I put printed photos of Uncle Fred in my photo album. If I say “I store my memories of Uncle Fred in my photo album,” I am merely speaking metaphorically. The only things my photo album contains are paper photos, which are representations of things I remember.

Now you may say “Wait a minute. The photo of Uncle Fred is equivalent to a memory, because if you’re not thinking of Uncle Fred, and you open your photo album and see his photo, you remember him, which is basically the same thing as having a memory. It’s just an external memory, like an external hard drive.” But you would be mistaken. If you show the photo album to someone who has never met Uncle Fred and has no idea who he is, that person will not have a memory of Uncle Fred just by seeing the photo. Therefore, the memory isn’t “in” the photo. The memory is in you, and is merely rekindled by the photo. Only people have memories. Computers don’t have memories, except as a metaphor.

We have memories, and computers store and move around electrons. Sometimes we represent our memories using computers. We have chosen to call that ability of a computer to store electronic representations “computer memory,” by analogy to real memory, which is something that we, not computers, have. It’s just a metaphor, for goodness’ sake.

The interesting question is this: Why would such an obvious point escape us, as it escapes many very thoughtful and well-informed people? Why would it be news of any sort that computer “memory” is really just a representation of memory, just as a photo of a relative isn’t the relative himself but merely a representation of him?

I’ve wondered about this a bit, and I think it has to do with the philosophical illiteracy that is endemic among materialists, who populate much of the intelligentsia and whose misunderstandings infest our intellectual culture.

Materialism, the belief that reality without remainder consists of dense stuff extended in space, is gibberish. It’s self-refuting, to the extent that it’s sufficiently coherent to refute anything. If materialism is true, then immaterial things like propositions, such as the proposition that materialism is true, can’t exist, let alone be true. Materialists, saddled with a (literally) meaningless metaphysical anchor, grasp at the nearest thing to philosophy that they can find. Metaphors are nearby. Materialist metaphysics is mostly metaphors. Metaphors seem like metaphysics, if you don’t think too hard, and both start with “m,” which seems to be enough.

It really is that absurd. For materialists, the metaphor of memory for computer storage of electronic representations has become metaphysics, which (sadly) is a bit of an improvement on actual materialist metaphysics, which is utterly incoherent. The proper response to this pitiful error is a bit of common sense, explained slowly.

Image credit: Frank Thomson Photos/Flickr.

Michael Egnor

Professor of Neurosurgery and Pediatrics, State University of New York, Stony Brook
Michael R. Egnor, MD, is a Professor of Neurosurgery and Pediatrics at State University of New York, Stony Brook, has served as the Director of Pediatric Neurosurgery, and is an award-winning brain surgeon. He was named one of New York’s best doctors by New York Magazine in 2005. He received his medical education at Columbia University College of Physicians and Surgeons and completed his residency at Jackson Memorial Hospital. His research on hydrocephalus has been published in journals including Journal of Neurosurgery, Pediatrics, and Cerebrospinal Fluid Research. He is on the Scientific Advisory Board of the Hydrocephalus Association in the United States and has lectured extensively throughout the United States and Europe.

Tags

Computational Sciences, Continuing Series, Mind and Technology, technology