Your brain is an index.
Is Google making us stupid? Will digitized books and e-readers devalue the written word? Does reading on the web harm human thinking? Do technologies like RSS and Twitter simply provide too much too fast — are we all about to drown in an informational tidal wave? In his widely discussed Atlantic essay on how the web changes the way we read, Nicholas Carr described the shift he’s noticed in his own reading habits in the years since the rise of the web:
Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case any more. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do.
Reading on the web is almost certainly affecting the way we process information, but it’s not making us stupid. Instead, it’s changing the way we’re smart. Rather than storehouses of in-depth information, the web is turning our brains into indexes. These days, it’s not what you know — it’s what you know you can access, and cross-reference.
In other words, books taught our brains to work the way they do — as vessels for storing extensive knowledge. Now the web is teaching our brains to work the way it does — as a tool for recall and connection. We won’t be so good at memorizing everything there is to know about a particular small-bore topic, but we’ll be a lot better at knowing what there is to be known about the broader category the topic fits into, and what other information might provide insight and context.
Here’s a personal example: As a kid film buff in the pre-web era (the early-to-mid ’90s), I studied movie reference books: guides to cult films, to directors, to particular eras and critics. And I didn’t just study them, I soaked up their information. By my mid-teens, I could recite actor, director, and writer filmographies, summon obscure facts about little-known cinematographers, and generally dominate most cinema-related trivia competitions. That was the mark of an (amateur) expert. These days, it seems like I can barely remember who worked on the movie I saw last week. Why? Because I don’t have to. IMDb.com is available from any iPhone or wi-fi hotspot to instantly fulfill my desire for movie-related trivia.
In addition, I can call up film reviews by dozens of critics, look for references in other movies or literature, and search story keywords and recommendation engines to find related films, books, and TV shows. When I’m through, I don’t need to remember all the details of what I found; I just need to remember, say, that there were two essays in two publications that made convincing arguments for particular interpretations. I’ll remember the authors, perhaps, and the names of the publications, as well as the gist of the argument. What about their supporting evidence, the details of their cases? How they came to the conclusions they did? Don’t remember — and don’t need to. I’ll look it up later.
At the end of my favorite novel, Fahrenheit 451, Ray Bradbury posits a rebel faction: a group of people who have formed a society around preserving old books, all of which have been banned. Their method? To memorize the books, to store them in their brains in their entirety — to, for all practical purposes, become the books. This is a tremendously appealing thought to those of us who find meaning and purpose in books, to those who seek refuge in stories (which, let’s face it, is what all of us story addicts are really doing with our lives). The idea is that you can give yourself over to a story not temporarily, but forever. These days, that idea is fading fast, as it’s no longer terribly efficient to use our brains to store information. Why memorize the content of a single book when you could be using your brain to hold a quick guide to an entire library? Rather than memorize information, we now store it digitally and just remember what we stored — resulting in what David Brooks called “the outsourced brain.” We won’t become books, we’ll become their indexes and reference guides, permanently holding on to rather little deep knowledge, preferring instead to know what’s known, by ourselves and others, and where that knowledge is stored.
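If you want the metaphor in software terms, here’s a loose sketch (the names and entries are invented for illustration, not a real catalog): the old model stores the contents themselves; the new one stores only pointers to where the contents live.

```python
# A loose sketch of the shift. Everything here is invented for
# illustration; it is not a real catalog or API.

# The old, "become the book" model: the knowledge itself lives in your head.
memorized = {
    "Fahrenheit 451": "the full text, word for word, cover to cover",
}

# The new, "become the index" model: your head holds only pointers,
# recording what is known and where to find it.
index = {
    "Fahrenheit 451 criticism": ["essay in publication A", "essay in publication B"],
    "movie trivia": "IMDb.com",
    "unit conversions": "a quick Google query",
}

def recall(topic):
    """Remembering now means looking up where to look."""
    return index.get(topic, "search the web and follow the links")

print(recall("movie trivia"))  # -> IMDb.com
```

The index costs almost nothing to keep in your head; the price is a lookup whenever you actually need the contents.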
I could barely get to the end of this article before tweeting and sending some emails. True story, and a great read.
— Mac G · May 11, 03:39 AM · #
I liked the way I used to be smart.
Incidentally, this is all tied to a much older argument about what it means to be intelligent, the old “Why do we have to memorize dates?” argument. You can put me down firmly on the side that holds that content, as well as process, is necessary to be an intelligent person. I don’t think you can be smart about poetry if you know everything about poems but couldn’t recite a single one.
— Freddie · May 11, 12:34 PM · #
I agree with Freddie!
Would you want a doctor who said, “I can’t really remember all those interminable details about body parts and diseases and stuff like that, but that doesn’t matter, because I’ll have Google on hand when I perform your surgery”?
— Stuart Buck · May 11, 02:38 PM · #
This reminds me of a Richard Feynman anecdote. I don’t recall all the details (though I could look them up!) but basically he was auditing a graduate-level biology class and did a presentation on some topic of anatomy. He got a lukewarm response because it turns out he was just presenting a bunch of info that the “real” bio students had already had to memorize. His reaction was basically that it seemed like a waste of brain cells to memorize stuff that’s just as easily looked up.
I’m not endorsing that opinion, just presenting it.
— kenB · May 11, 03:39 PM · #
Einstein said something similar to Feynman’s point: “Never memorize what you can look up in books.”
As a lawyer and former philosophy dude, I can say definitively that the main things are content familiarity and issue spotting. You may not remember exactly what the answer is, but if you can recognize the existence of an issue, why it exists, and how it interacts with the whole, you’re going to do just fine.
Or as Henry Jones, Sr., said, “I wrote it in my diary so I wouldn’t have to remember.”
— Sargent · May 11, 09:00 PM · #
Freddie, how do you see this as a change for the worse while still championing books? Aren’t books just doing a poor job of what Google is doing now, but with the same intent?
I really don’t know how anyone considers this a bad thing – this is a pure upgrade from books, where our required storage space and lookup times are vastly, vastly reduced.
To keep with Freddie’s idea: Isn’t memorization a sign of love, not a sign of understanding? Isn’t this really about, “I wouldn’t trust the guy who hasn’t memorized a poem to know everything about poems, because I don’t believe someone could learn everything about poetry without the passion for poetry that would also incite him to memorize at least a few”?
— bcg · May 11, 09:20 PM · #
I read books to learn the information within. Now, do I think you need to memorize everything? No. But the opposite is being offered as a reasonable facsimile of intelligence, when it most certainly isn’t. A cipher is not intelligent; it’s merely a conduit for regurgitated information. Content matters.
— Freddie · May 11, 11:40 PM · #
When you get right down to it, I don’t think anybody reads books to learn the information within, but rather to learn how to think about the subject the book covers, how to manipulate new information. It’s not about the facts in the book; it’s about teaching us how to understand new facts that will emerge after we’ve read it. There’s no practical reason to care about the fall of the Roman Empire outside of either a) dramatic fancy or b) understanding what we should avoid, or what we should replicate.
If that’s true (and I’m saying it is), then this is actually a wonderful new advantage – all facts become reference, freeing us up to do what we like to do, which is think.
— bcg · May 12, 12:15 AM · #
Understanding complex subjects, as well as creative thinking about them, is impossible without a lot of memorization. Steve Dutch puts it best:
— Stuart Buck · May 12, 12:54 AM · #
Stuart, in an age when 3G is everywhere and Google is no further away than one’s belt, it’s pretty easy to have all the references available at one’s fingertips.
I’m not arguing that memorization is bad – but I don’t need to remember 80 million conversion factors, for example, when I can just type “3,320,000 kilometers to astronomical units” into Google and get the answer.
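For the record, the lookup in that example is a single division (using the standard value 1 AU ≈ 1.496 × 10⁸ km):

$$\frac{3{,}320{,}000\ \text{km}}{1.496\times10^{8}\ \text{km/AU}}\approx 0.0222\ \text{AU}$$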
The amount of information a given person can learn is limited. There is no sense in not taking advantage of technology to focus one’s learning on the elements of synthesis and reasoning that all the Google queries in the world can’t produce.
— Travis Mason-Bushman · May 12, 09:27 AM · #
Not really: Estimates are that we have a trillion neurons with a thousand connections apiece... meaning that there are more possible configurations of neural patterns in our head than there are atoms in the universe. Is there a limit on how much a person can learn? I guess so, but I doubt that any of us is in any danger of even remotely approaching that point.
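To spell out the arithmetic behind that claim, taking those estimates at face value and crudely treating each connection as simply present or absent (a major simplification of real neural coding):

$$10^{12}\ \text{neurons}\times 10^{3}\ \text{connections} = 10^{15}\ \text{synapses},\qquad 2^{10^{15}}\approx 10^{3\times 10^{14}}\gg 10^{80}\approx \text{atoms in the observable universe}$$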
More importantly, you can’t engage in any sort of real synthesis and reasoning apart from a deep knowledge of the facts. Read Daniel Willingham’s recent book for a summary of the cognitive science on that point. That doesn’t mean you have to have memorized every equation that a calculator could figure out, nor does it mean that you can never use Google. But again, to return to my doctor example, you don’t find creative and useful medical research coming from people who never learned the names of body parts.
— Stuart Buck · May 12, 04:51 PM · #
I don’t mean it’s limited by our mental capacity – I mean it’s limited by time. None of us have time to learn everything.
— Travis Mason-Bushman · May 12, 07:13 PM · #
To survive, humans have developed the ability to imagine a model of the world that they must engage. Throughout cultures and regions, and over time, our mental abstraction of the ‘world-that-matters’ has taken many forms. But in all cases, we keep in our head a template of all the important places, people, facts and references that allow us to survive — the nodes that build our conception of the world. Abstraction is the timeless, unchanging aspect of the human species.
Consider that the vast majority of human history has been a preliterate or illiterate experience. Knowledge was empirical and acquired in the field, through different social associations and culture. Our forebears may not have been bookish, but neither were they imagining less. They kept a different model of the world in their heads, as complex as any we imagine today. The abstract world allows for testing connections — trying out possibilities before actually attempting them. It allows for comparative analysis — discussing ideas with peers. It could be about the optimal method of tracking gazelle, killing barbarians, or forging metal. In all cases, the amount of mental abstraction involved is roughly the same.
If we are moving to a mental model where survival depends on information access through digital nodes rather than recording the actual information in our wetware, we are still keeping a rich, complex abstraction of the world in our heads, as we always have done. Our world model is simply morphing yet again to accommodate changes in the nodes we depend on for survival.
Much is lost by abandoning the old mental models; but much is gained by creating new ones. It makes little difference whether we celebrate the change or lament it; clinging to our paper books makes no more sense than clinging to our spears and arrows. If the data points of our mental model have become less clustered around the bush, the farm, the village, or the city library, and have instead dispersed across the planet, then so be it. We will survive by the same timeless means as always: by imagining the rich, complex world that we live in, in its current form.
— Dan Lynch · May 13, 01:17 PM · #
There is still the problem of search time. A surgeon (as mentioned above) does not have the luxury of looking up the solution to an unexpected heart problem, whereas lawyers have functioned with more of an index-type memory since well before the internet was invented.
Also, the brain-as-index model would appear to adversely affect serious conversation. There’s not much riposte in, “I remember reading an article by some guy that refuted your point but I will need to look it up on my iPhone to recall the details and continue this debate.”
— David · May 13, 05:15 PM · #
Worthwhile questions in the post, and good comment, Dan Lynch. I think the importance of information as it relates to synthesis and understanding has not changed, but the actual ‘value’ of information has decreased from what it once was – why buy a book that simply contains information readily available online? The value of memorization has dropped along with its usefulness.
Grasping the meaning behind the facts is still the important part – reaching understanding through consideration and analysis of the (apparent) facts. Once understanding begins to take hold, attempting to synthesize it with the rest of one’s knowledge and instinct is where high creativity lives and where new concepts can be formed. (Kinda what bcg is saying, above.)
— Brett J · May 13, 07:25 PM · #
True — but learning lots of facts has to come first. Has to. There are no exceptions, not in any serious subject.
— Stuart Buck · May 13, 07:37 PM · #
I’m a guy who used to think he knew his trivia better than a lot of folks – at least the folks I live with ;). I also find myself almost getting ‘lazy’ because I can look many facts up now ‘on demand’ rather than having to absorb as much from books in my leisure time. Reference books used to be a great love of mine; now it’s web sites when I need them (or when I’m just killing time).
To a certain extent, I’m concerned with the correctness and completeness of information found on the web, especially where that information is taken as authoritative. The process for getting a book published, at least traditionally, has involved much more review and fact-checking before publication.
I’m hoping that the result is that we become more discerning and critical of the web sources we quote from, but I’m not quite sure how realistic that hope is…
— Tom Culler · May 18, 07:02 PM · #
This was happening long before the internet came along. In high school I already knew that it was much more efficient to remember where I hid the test answers in my calculator cover than it was to actually remember the answers**.
(** stay in school, kids, and remember to cheat only on the useless subjects)
— Tim B · May 19, 04:13 PM · #
I would argue that intelligence isn’t about storing information, whether as an index or in longer stretches — it’s about using what we know creatively. Reading without interruption allows the brain to play with ideas, not just sort and store.
— Jenny · May 24, 01:42 AM · #