The Book is Not Dead… But the Reader is on Life Support (+ Twitter Literacy)

There goes Chicken Little again, claiming that the sky is falling. There has been much renewed alarmism in the past five years over whether books are going the way of the butter-churner, the hansom cab, and the 8-track player. E-books continue their explosive growth, netting record profits, and yet the surplus of newly published print books has continued apace.

Recently, I undertook to catalogue my entire library for insurance purposes (and as a handy reference for when I need bibliographical details for citing purposes). Yes, I know books occupy an obscene amount of space, are cumbersome to move about, and seem archaic in an era where we can fit the entirety of our libraries on our computer hard drive with so much more room to spare.

I, for one, enjoy the look and feel of books. Perhaps I fetishize them, but there is much to be said about the beauty of the book (and I admire how their spines occupy a fully book-cladded wall). I enjoy their binding, their smell, and their unique histories - something not present in the sterile and standardized form of digital text. They do not require recharging, and are still quite durable and portable (I can drop a book and not wince the way I might if I dropped a Nook or Kindle).

My concern is not that books have somehow become antiques, but that the patience and attention required for reading has been the real victim of this shift in the medium. New media optimists - the kind who make their names by lauding everything academia stands against - tell us that this is a new age for text. Those like Clay Shirky remind us that many of our canonical works may not be worth reading. What we want is information NOW. We don't want to have to work for it, wait for it. Give us the memes! Give us the keywords via the search engine algorithm! And with that comes a fundamental change in our textual great expectations.

In Nicholas Carr's book, The Shallows: What the Internet is Doing to Our Brains, he cites a variety of recent neuroscience studies that have been mapping the effects of digital media on our cerebral hardware - and the findings seem to confirm the suspicions of those like myself who are cynical devotees of technological determinism. Yes, hypertext is distracting. Yes, our being plugged in so often is a source of constant interruption. Yes, the depth of research and thought is compromised by severe cognitive switching costs as we snap (and must adjust) from one screen or popup or notification to another. The ultimate danger of such radical changes in our neural operations? I can think of a few:

1. Lack of cognitive depth (relying too heavily on working memory) might make us more efficient at finding information, but information still needs to be processed into knowledge. We are constantly gathering the ingredients for a cake that we will never bake.
2. Decrease in historical context and sustained reasoning. This has disastrous consequences for a democracy.
3. Attention deficits compromise deeper thought which, perhaps, makes us not so bright.
4. This lack of context and deeper reasoning increases ignorance, especially when it comes to understanding the world in which we live. This ignorance - which seems to be a baffling point of pride among those who malign intellectualism - will increase intolerance, violence, and repetition. I say that because I can hop into the comment pit at my local news site and see almost verbatim the same invective I have read in books from the 1920s. In addition, an ignorant, context-less society is so much easier to control and manipulate with less effort. It also means that politicians can simply appeal to emotions rather than reason.
5. It will privilege a lesser quality of thought. If you cannot make your point in two lines or less, the audience is becoming increasingly incapable of understanding what you are trying to convey. Not all knowledge can be communicated in two lines or less, and much of it takes time and development. When we do not afford that time, we lose out.
6. Without proper consideration of issues, without the careful posing of problems, all we will be capable of is short-term solutions.

I mentioned earlier the apparent pride in being anti-intellectual. There is much clamour against the perception that the "elites" are running the show. This, of course, does not square with the facts (those who seek to confirm their beliefs are effectively immune to facts, it seems). In reality, much of our political landscape is dominated by those who do not belong to an intellectual elite class. Instead, it is governed by those who seek popular support by maligning intellectuals as irrelevant know-it-alls whose main purpose is to raise taxes and live sinecure lives. Meanwhile, the politicians riding the wave of corporate support are the ones living that sinecure life, using the intellectual as their convenient scapegoat. Drumming up the false perception that an intelligentsia is behind all the woes of a nation works to appease the "average" voter. Now that "average" continues to experience downward pressure with regard to intellectual expectations, what we find lurking behind anti-intellectualism is really, in its naked form, an attempt to justify one's laziness.

But back to books and readers. When I say readers, I mean the serious kind, not the sort that points to the formulaic potboilers and sensationalist sweet nothings and calls that reading. Technically, yes, it is reading, but so is reading an ingredient label, a road sign, or comments on an online news story.

Recently, we read that Margaret Atwood was all-atwitter about Twitter, praising it for its benefits in increasing literacy. I would ask Atwood to please provide us with an operational definition of literacy, specifying what sort of literacy she means (since this is a complex concept that cannot be so easily reduced and simplified). What I think problematizes her claims is the following:

1. Relying on communication media comparisons ("Twitter is like a smoke signal, a carving in a tree, a telegram," etc.) is not useful. Moreover, it commits a fallacy of false analogy. Communication technologies of any kind have specific aspects to them that do not necessarily make them neatly comparable.

2. Atwood does not supply us with any real quantitative support to substantiate her claim that literacy is improving. First of all, what KIND of literacy? Second, prove the causal link, or else you have committed post hoc, ergo propter hoc. Thirdly, anecdotal data (the tweets she happens to follow) does not constitute a robust empirical study. Verdict: her opinion, which is unqualified in this respect.

3. How does she respond to the amount of redundancy and self-promotion on Twitter? How does that increase literacy?

4. I do not recall Atwood saying this whatsoever, but it is in the public discourse; namely, the idea that short-forms, misspellings, and other "alterations" on the English language represent a creative and dynamic reinvention of the language. This I do not buy. The poets and prose-writers who re-imagined the language did so with a purpose in mind. They did it for very real reasons, and knew the rules of the language. We should not coddle those who do not bother with the rules of language by praising their innovation when it is more likely it is ignorance or indolence. I fully agree that language is fluid and in constant re-negotiation over its lexical terms, its structures, and semantic flexibility, but not knowing how to properly use an apostrophe, for example, is not the way to push language forward.

And now I sound like a language Nazi. However, what is wrong with some of the linguistic rules we currently have on offer? Do we really need to spell words any which way as proof of our creativity? I can fully understand altering certain conventions of language such as gendered writing, or stock phrases such as "fair game" which are insensitive holdovers from a colonial past.

If Twitter is having such a tremendous positive impact on literacy, I would like to see the studies that show this to be the case.
Published on December 21, 2011 05:19

message 1: by Dave (last edited Dec 21, 2011 09:09PM) (new)

"what we find lurking behind anti-intellectualism is really, in its naked form, an attempt to justify one's laziness. "
Ding, ding, ding.

The thing is, humans are naturally lazy, and that's OK. Twitter and Facebook merely accelerate natural human tendencies to engage in self-promotion, in the hopes of yielding fame and fortune. I'm doing it right now with this short comment.

At the same time, I'm not sure the aggregate level of laziness has increased due to Twitter/Facebook. We just see comments from the entire bell curve out in the open on the internet now, whereas in the past, it was too difficult for them to climb onto their soapboxes (books, radio, and TV have a high cost of entry).

Yes, popups (context switches) are not conducive to learning. However, I see no issue with ebooks. They increase the ease with which we can all read more books, and do it at less cost to both publisher and consumer. I for one would love hypertext links built into books. I'm manually googling interesting terms/subjects anyway. Ditto for faster/easier dictionary lookup. It only increases my background knowledge and comprehension of the text.


message 2: by Kane (new)

For sure, one of the outcomes of the information age is that it may verify once and for all what previously could not be verified because of lack of access to certain media forms (dominated by gatekeeping functions such as cash or editors): that most of us are rather banal, boring, filled with ego-anxieties, etc. Twitter may be an example of this. "RT I had a muffin this morning." One of my colleagues has expressed a similar claim that we've only found a way of broadcasting our dullness for all to see, but that it was always there. I partially agree with this, but I would add that it has also increased the expectation to maintain an online presence through constant updating. And so, with nothing of substance to say, just say anything... Look around... What's on my desk? I'll tweet that!

I pity the digital archeologists of the future. I know the Library of Congress is going to archive all tweets. Imagine having to sift through all that redundant data! All information that has ever been produced before the Internet accounts for about 3 percent of the total we have now. Crazy.

When it comes to eBooks, I am sure there are good readers who have the media literacy skills to stay focused. Espen J. Aarseth speaks of the beneficial qualities of ergodic writing and hypertext. I will allow my bias to show on this score, however, since I just adore physical books (but I suppose that doesn't come as much of a surprise).

The other issue about look-ups is search term query failure where, in the current web 2.0 architecture, only explicit mentions of terms come up in the results. The algorithm privileges this over implicit mention (until we can develop a truly "semantic web"). And not everything is digitized, and not every source is credible (which entails some media literacy to sort out the good from the bogus or simply wrong). I continue to bump up against the weaknesses of search engines. There is stuff that I can find in my books that I cannot find on Google. There is information that simply does not exist (at least insofar as Google can locate it) that I have on paper. My fear is that if it is not on Google, for example, then people might assume that a) the information does not exist [false] or b) the information has no value [judging relevance strictly on popularity, which historically has been a piss-poor method of evaluating info and knowledge - just look at what happened during that arid period of scholasticism!].

