Nicholas Carr's Blog, page 41
November 2, 2011
Chitchat
From an interview with William Gibson in The Paris Review: For someone who so often writes about the future of technology, you seem to have a real romance for artifacts of earlier eras. It's harder to imagine the past that went away than it is to imagine the future. What we were prior to our latest batch of technology is, in a way, unknowable. It would be harder to accurately imagine what New York City was like the day before the advent of broadcast television than to imagine what it will be like after life-size broadcast holography comes online. But actually the New York without the television is more mysterious, because we've already been there and nobody paid any attention. That world is gone. My great-grandfather was born into a world where there was no recorded music. It's very, very difficult to conceive of a world in which there is no possibility of audio recording at all. Some people were extremely upset by the first Edison recordings. It nauseated them, terrified them. It sounded like the devil, they said, this evil unnatural technology that offered the potential of hearing the dead speak. We don't think about that when we're driving...
Published on November 02, 2011 22:06
October 29, 2011
Utopia is creepy
Works of science fiction, particularly good ones, are almost always dystopian. It's easy to understand why: There's a lot of drama in Hell, but Heaven is, by definition, conflict-free. Happiness is nice to experience, but seen from the outside it's pretty dull. But there's another reason why portrayals of utopia don't work. We've all experienced the "uncanny valley" that makes it difficult to watch robotic or avatarial replicas of human beings without feeling creeped out. The uncanny valley also exists, I think, when it comes to viewing artistic renderings of a future paradise. Utopia is creepy - or at least it looks creepy. That's probably because utopia requires its residents to behave like robots, never displaying or even feeling fear or anger or jealousy or bitterness or any of those other messy emotions that plague our fallen world. I've noticed the arrival recently of a new genre of futuristic YouTube videos. They're created by tech companies for marketing or brand-burnishing purposes. With the flawless production values that only a cash-engorged balance sheet can buy you, they portray a not-too-distant future populated by exceedingly well-groomed people who spend their hyperproductive days going from one screen to the next. (As seems always...

Published on October 29, 2011 09:37
October 25, 2011
Minds askew
Iain McGilchrist, the psychiatrist and former English professor whose 2009 book on the human brain, The Master and His Emissary, is endlessly fascinating, discusses his ideas on the meaning of the brain's hemispherical divide in this wonderful animation: That helps explain, among many other things, why we're so drawn to the metaphor that portrays the brain as a computer....

Published on October 25, 2011 08:37
October 17, 2011
Retransmission of a language-based practice
Penn prof Kenneth Goldsmith has seen the future of culture, and it's a content farm: For the past several years, I've taught a class at the University of Pennsylvania called "Uncreative Writing." In it, students are penalized for showing any shred of originality and creativity. Instead they are rewarded for plagiarism, identity theft, repurposing papers, patchwriting, sampling, plundering, and stealing. Not surprisingly, they thrive. Suddenly what they've surreptitiously become expert at is brought out into the open and explored in a safe environment, reframed in terms of responsibility instead of recklessness. We retype documents and transcribe audio clips. We make small changes to Wikipedia pages (changing an "a" to "an" or inserting an extra space between words). We hold classes in chat rooms, and entire semesters are spent exclusively in Second Life. Each semester, for their final paper, I have them purchase a term paper from an online paper mill and sign their name to it, surely the most forbidden action in all of academia. Students then must get up and present the paper to the class as if they wrote it themselves, defending it from attacks by the other students. What paper did they choose? Is it possible to...

Published on October 17, 2011 08:58
October 14, 2011
Bondi Blue
I kid you not, in the late 1990s I actually paid good cash money for a Macintosh computer that looked like this: It was as ugly as Steve Ballmer's ass. It weighed a million pounds. It was referred to as "The Molar." That was not a term of endearment. Apple Computer was dead. It was not "nearly dead," as you'll hear some say today. It was doornail dead. It was laid out on a slab in a Silicon Valley morgue, a tag hanging from its toe. Scott McNealy could have purchased the remains for an amount more or less equal to what he was spending on greens fees every month, but at the last moment he came to his senses and put his checkbook back into his pocket. The few pathetic fanboys left on the planet - myself among them - knew when we bought a new Mac that what we were really buying was a memento mori. In 1996, the Apple board exercised the only option it had left, short of outright dissolution: it paid Steve Jobs to come back and take possession of the corpse. But it wasn't until two years later that Apple released the first product...
Published on October 14, 2011 15:22
October 13, 2011
Sentence of the day
It's by Evgeny Morozov, and it appears in his breaking-a-butterfly-on-a-wheel review of Jeff Jarvis's Public Parts: "This is a book that should have stayed a tweet." UPDATE: Our Resident Philistine responds: "One price of publicness is haters." I believe the word you were looking for, Jeff, is "critics."...

Published on October 13, 2011 16:18
October 8, 2011
Overselling educational software
Tomorrow's New York Times carries the second installment in the paper's series "Grading the Digital School." Like the first installment, this one finds little solid evidence that popular, expensive computer-aided instruction programs actually benefit students. The focus of the new article, written by Trip Gabriel and Matt Richtel, is Cognitive Tutor, a widely esteemed and much coveted software program for teaching math in high schools. The software was developed by Carnegie Learning, a company founded by Carnegie Mellon professors and now owned by Apollo Group, the same company that owns the University of Phoenix. Carnegie Learning promotes its software as producing "revolutionary results." It is widely used, and has been applauded by respected thinkers like the Harvard Business School's Clayton Christensen, who in an article published by the Atlantic two weeks ago used Carnegie Learning as the poster child for the power of software-based education: Carnegie Learning is the creation of computer and cognitive scientists from Carnegie Mellon University. Their math tutorials draw from cutting-edge research about the way students learn and what motivates them to succeed academically. These scientists have created adaptive computer tutorials that meet students at their individual level of understanding and help them advance via the...
Published on October 08, 2011 17:56
Conspicuous production: a hypothesis
When conspicuous consumption shifts from the realm of material goods to the realm of information, it takes the form of conspicuous production. Discuss....

Published on October 08, 2011 08:29
October 7, 2011
The age of deep automation
Thanks to interconnected computers that are able to compute and communicate at incredibly low costs, we have entered a time of what I'll call deep automation. The story of modern economies has always been a story of automation, of course, but what's going on today goes far beyond anything that's happened before. We don't know what the consequences will be, but the persistent, high levels of unemployment in developed economies may well be a symptom of deep automation. In a provocative article in the new issue of the McKinsey Quarterly, W. Brian Arthur argues that computer automation has in effect created a "second economy" that is, slowly, silently, and largely invisibly, beginning to supplant the primary, physical economy: I want to argue that something deep is going on with information technology, something that goes well beyond the use of computers, social media, and commerce on the Internet. Business processes that once took place among human beings are now being executed electronically. They are taking place in an unseen domain that is strictly digital. On the surface, this shift doesn't seem particularly consequential—it's almost something we take for granted. But I believe it is causing a revolution no less important...

Published on October 07, 2011 09:25
October 6, 2011
Whose book is it, anyway?
Even after I wrote a couple of posts about Amazon's Kindle announcements last week, something still nagged me - I sensed there was an angle I was missing - and two nights ago it finally hit me. I woke from a fretful sleep and discovered a question pinballing through my synapses: What the heck does Kazuo Ishiguro think about this? Or, more generally: Whose book is it, anyway? You might have thought that question was put to rest a few hundred years ago. For quite a while after Gutenberg invented the printing press, the issue of who controlled a book's contents remained a fraught one. As is often the case, it took many years for laws, contractual arrangements, business practices, and social norms to catch up with the revolutionary new technology. But in due course the dust settled, and control over a book's contents came to rest firmly in the hands of a book's author (at least through the term of copyright). Which seems like the proper outcome. You probably wouldn't, for instance, want book retailers to be able to fiddle with the text of a new book at their whim - that would be annoying, confusing, and wrong. And...

Published on October 06, 2011 12:10