The Internet, email and Google now do much of our remembering for us. Jane Wheatley of The Times looks at how this affects our powers of recall
When the East Timorese resistance leader Xanana Gusmão was imprisoned during the 1990s he wanted to write an account — part poetry, part memoir, part political tract — of his country’s struggle for independence. Years later, when he was freed and elected Prime Minister, the material would form his autobiography, To Resist Is To Win!
But in jail inmates were denied pen or paper: instead Gusmão used fellow prisoners as a human archive. In canteen queues and under cover of shuffling feet in the exercise yard, he spoke passages of his work into willing ears. Each man memorised what he heard each day; much of it survived and was later transferred to manuscript.
Before the invention of the written word, this is how all information was stored, retrieved and passed on — in stories, verbal and visual, and collective memory. Gusmão’s collaborators demonstrated the impressive capacity of the human brain to remember things it wants to remember.
“If a man write little, he had need have a great memory,” said Francis Bacon. But things have moved on: first libraries, then technology — photography, the Internet, Palm Pilots, email, mobile phones — have provided us with a storehouse of knowledge, much of it these days accessed at the tickle of a few keys. But if we no longer need to remember phone numbers or dates, if we can google the words of A Hard Day’s Night or the layout of the periodic table, have we made some aspects of memory redundant? Are we in danger of forgetting how to remember — a case of use it or lose it?
“You’re asking the wrong question,” says Professor Martin Conway, of the University of Leeds, a psychologist and expert on memory. “The reverse is entirely the case: technology reduces the burden on memory and increases our ability to make use of our minds. It is enabling rather than disabling.”
Sherlock Holmes once astonished Dr Watson by admitting that he didn’t know that the Earth revolved around the Sun. When Watson told him, Holmes said he would now do his best to forget it again; he believed that the brain had limited capacity and wanted to reserve his memory for more relevant things, such as knowing the difference between 140 kinds of tobacco ash — vital stuff for the great detective.
Holmes was wrong; we use only a fraction of our capacity to remember. And because the act of memorising stimulates extra production of the neurotransmitters that lay down the neural pathways in which memories are stored, it is likely that the more you memorise, the more you can memorise.
You can improve your memory by making a conscious effort to repeat and integrate information (see below). And motivation is an important aid to memory: Holmes could distinguish 140 kinds of ash because he needed to; a soccer-mad schoolboy who recalls nothing of his maths lesson can effortlessly recite the statistics for every Arsenal match of the season.
But though our capacity to retain information is almost unlimited, our attention is a finite resource, and time spent learning the names of presidents is time not spent on other, more creative, undertakings. And there are so many of them: a blizzard of demands and possibilities from our Internet-enabled lives.
We are constantly using our working memories to copy information from “storage” in our own long-term memories as well as from our computers, working on prospective tasks: organising, annotating, linking, responding to emails, making arrangements. We combine technology with our memories to enhance the performance of each: “We are learning to incorporate the digitalisation of things-to-be-remembered in our intellectual economy, and finding ways of being better because of it,” says the philosopher Anthony Grayling.
There are, broadly, three kinds of memory: autobiographical or episodic (the people and events in our personal lives); conceptual or semantic (our own personal database of knowledge about the world); and procedural (how to ride a bike or make pastry). Semantic memory includes the meaning of words and the rules and concepts that form a mental representation of the world. It is a reference source, continuously accumulated and, after a while, becoming independent of the context in which it was acquired. We “know” Delhi is the capital of India and that people there are mainly Hindu or Muslim, but it is unlikely that we will recall where we first learnt it.
People with amnesia do not usually lose this kind of consolidated, long-term memory or the memory of how to do things — they still know the capital of India and can still make pastry or operate a computer. What they do lose is autobiographical memory — which is frightening, because memory is what makes us human. It binds us to our culture, social groups and families; it gives meaning to our lives. We need memory to be happy or sad or moved, and the more emotionally charged the occasion, the more vivid the recollection.
As Voltaire observed: “That which touches the heart is engraved in the memory.” A new study by the Open University found that looking at photographs of themselves with family and friends made subjects happier and more relaxed than other therapeutic devices such as eating chocolate, listening to music or drinking wine.
But where once memory cues dwelt only in physical places such as photo albums, record collections, boxes of letters and embarrassingly maudlin attempts at poetry, now email folders, personal blogs and websites such as MySpace.com provide a virtual archive of autobiography, giving a new meaning to the notion of the “Me” generation.
From this month, subscribers to Orange can sign up to My Album, enabling them to upload photos taken on their mobile phones to a secure site that they can then visit, edit and share with friends. Soon, says Nick Bostrom, of the Future of Humanity Institute at Oxford University, wearable computers will be available to document a lot more than your best friend gurning into your mobile.
Because autobiographical memory is invested with meaning and emotion, technology cannot ever replace it, though it can facilitate recollection in increasingly sophisticated ways. Five years ago Margaret Wilson (not her real name) contracted limbic encephalitis, which destroyed the part of her brain that recalls past events. Margaret could spend a happy day out with her husband or attend a family wedding and then remember nothing about it the next morning.
Now when she goes out she wears a device, a SenseCam, that takes a photograph every time there is a sensory change such as movement, light or someone else entering the room. By replaying her “film” later, Margaret can review and re-experience her day. Doing this two or three times over the ensuing fortnight seems to “stabilise” her own memory so that three months later she can retrieve scenes from the day by herself.
It seems that the device mimics the hippocampal structures in our brain that construct memory as a series of episodes: replaying the resulting rapid-fire sequences provides a series of visual cues to the wearer’s day. Ken Wood, of the Microsoft Research Laboratory in Cambridge, which developed it, says SenseCam performs much better in trials than either keeping a diary or deliberately taking Polaroid shots of things that subjects thought they would want to remember.
The device is currently aimed at helping patients like Margaret and others with conditions such as Alzheimer’s, but Martin Conway thinks that it could have a prophylactic role in our ordinary lives: “Using it to exercise that part of your brain regularly could stave off the normal progress of memory loss.”