Douglas Rushkoff's Blog, page 32
March 1, 2016
Throwing Rocks at the Google Bus Now Available
February 21, 2016
The Atlantic: Twitter Is Not a Failure
http://www.theatlantic.com/business/archive/2016/02/twitter-failure/463248/
The post The Atlantic: Twitter Is Not a Failure appeared first on Rushkoff.
The People’s Net
15 years ago, in the wake of the dotcom crash and reports of the Internet’s demise, I wrote this piece for Yahoo Internet Life. Joichi Ito just referred to it on his blog, so I thought I’d dig it out of the archives.
The Internet is Back, Alive and Well
Yahoo Internet Life, July 2001
To those of us who really love it, the Internet is looking and feeling more social, more alive, more participatory, and more, well, more Internet-y than ever before. This might sound surprising, given the headlines proclaiming the official bursting of the technology bubble. Likewise, analysts on the financial cable channels and the venture capitalists of Silicon Alley now shun any company whose name ends in .com and have moved on to more promising new buzzwords, such as wireless.
But the statistics fly in the face of conventional wisdom. In terms of real hours spent online and the number of people getting new accounts every day, Internet use is up. We spent an average of 20.2 hours looking at Internet sites in March 2001, up from 15.9 hours last year and 12.8 hours the year before, according to the latest data from Jupiter Media Metrix. More surprisingly, while countless dot-coms have gone under for failure to meet investor demands, e-commerce is actually up: it rose more than 30 percent compared with last year. More than 100 million Americans now buy goods and services online.
The Internet is more capable now than it ever was of supporting the vast range of individual, community, and commercial interests that hope to exploit the massive power of networking. Still, countless investors, analysts, and pundits have fallen off the Internet bandwagon.
Good riddance, I say. The experts jumping ship today can’t see the Internet as anything other than an investment opportunity that has dried up. Sure, the Internet made a lot of people money, but its real promise has always been much greater than a few upward stock ticks. If we can look past the size of our 401(k) plans to the underlying strength of our fledgling networked society, all signs are actually quite good. The Internet has never been in better health.
Maybe this kind of optimism requires us to look at the Internet less as an investment opportunity and more as a new life form. That’s the way we used to see it in ancient times, anyway. Back in the 2,400-baud, ASCII text era of 10 long years ago, the Internet had nothing to do with the Nasdaq Index. Until 1991, you had to sign an agreement promising not to conduct any business online just to get access to the Internet! Imagine that. It was a business-free zone.
How could such rules ever have been put in place? Because the Internet began as a public project. It was created to allow scientists at universities and government facilities to share research and computing resources. Everyone from the Pentagon to Al Gore saw the value of a universally accessible information-sharing network and invested federal funds to build a backbone capable of connecting computers around the world.
What they didn’t realize was that they were doing a whole lot more than connecting computers to one another. They were connecting people, too. Before long, all those scientists who were supposed to be exchanging research or comparing data were exchanging stories about their families and comparing notes on movies. People around the world were playing games, socializing, and crossing cultural boundaries never crossed before. Since no one was using the network to discuss military technology anymore, the government turned it over to the public as best it could.
The Internet’s unexpected social side effect turned out to be its incontrovertible main feature. Its other functions fell by the wayside. The Internet’s ability to network human beings is its very lifeblood. It fosters communication, collaboration, sharing, helpfulness, and community. When word got out, the nerdiest among us found out first. Then came those of us whose friends were nerds. Then their friends, and so on. Someone would insist he had found something you needed to know about, the way a childhood friend lets you in on a secret door leading to the basement under the junior high school.
How many of you can remember that first time you watched that friend log on? How he turned the keyboard over to you and asked what you wanted to know, where you wanted to visit, or whom you wanted to meet? That was the moment when you got it: Internet fever. There was a whole new world out there, unlimited by the constraints of time and space, appearance and prejudice, gender and power.
It’s no wonder so many people compared the 1990s Internet to the psychedelic 1960s. It seemed all we needed to do was get a person online, and he or she would be changed forever. And people were. A 60-year-old Midwestern businessman I know found himself logging on every night to engage in a conversation about Jungian archetypes. It lasted for four weeks before he realized the person with whom he was conversing was a 16-year-old boy from Tokyo.
It felt as though we were wiring up a global brain. Techno visionaries of the period, such as Ted Nelson, who coined the word hypertext, told us how the Internet could be used as a library for everything ever written. A musician named Jaron Lanier invented a bizarre interactive space he called “virtual reality” in which people would be able to, in his words, “really see what the other means.”
The Internet was no longer a government research project. It was alive. Out of control and delightfully chaotic. What’s more, it promoted an agenda all its own. It was as if using a computer mouse and keyboard to access other human beings on the other side of the monitor changed our relationship to the media and the power the media held. The tube was no longer a place that only a corporate conglomerate could access. It was Rupert Murdoch, Dan Rather, and Heather Locklear’s turf no more. The Internet was our space.
The Internet fostered a do-it-yourself mentality. We called it “cyberpunk.” Why watch packaged programming on TV when you can make your own online? Who needs corporate content when you can be the content? This was a world we could design ourselves, on our own terms. That’s why it fostered such a deep sense of community. New users were gently escorted around the Internet by veterans. An experienced user delighted in setting up a newbie’s connection. It was considered an honor to rush out to fix a fellow user’s technical problem. To be an Internet user was to be an Internet advocate.
It’s also why almost everything to do with the Internet was free. Software was designed by people who wanted to make the Internet a better place. Hackers stayed up late coding new programs and then distributed them free of charge. In fact, most of the programs we use today are based on this shareware and freeware. Internet Explorer and Netscape are fat versions of a program created at the University of Illinois. Streaming media is a dolled-up version of CUSeeMe, a program developed at Cornell. The Internet was built for love, not profit.
And that was the problem-for business, anyway. Studies showed a correlation between time spent on the Internet and time not spent consuming TV programs and commercials. Something had to be done.
Thus began the long march to turn the Internet into a profitable enterprise. It started with content. Dozens, then hundreds, of online magazines sprang up. But no one wanted to pay a subscription charge for content. It just wasn’t something one did online. So most of these magazines went out of business.
The others … well, they invented the next great Internet catastrophe: the banner ad. Web publishers figured they could sell a little strip atop each page to an advertiser, who’d use it as a billboard for commercials. But everyone hated them. They got in the way. And the better we got at ignoring banner ads, the more distractingly busy they grew, and the more time-consuming they were to download. They only taught us to resent whichever advertiser was inhibiting our movement.
So advertising gave way to e-commerce. The Internet would be turned into a direct-marketing platform. An interactive mail-order catalog! This scheme seemed to hold more promise for Wall Street investors. Not many of these e-commerce businesses actually made money, but they looked as if they could someday. Besides, Wall Street cares less about actual revenue and more about the ability to create the perception that there might be revenue at some point in the future. That’s why it’s called speculation. Others might call it a pyramid scheme.
Here’s how it works: Someone writes a business plan for a new kind of e-commerce company. That person finds “angel investors”: very in-the-know people who give him money to write a bigger business plan and hire a CEO. Then come the first and second rounds, where other, slightly less in-the-know people invest a few million more. Then come the institutional investors, who underwrite the now-infamous IPO. After that, at the bottom of the pyramid, come retail investors. That’s you and me. We’re supposed to log on to an e-trading site and invest our money, right when the investors at the top are executing their “exit strategy.” That’s another way of saying carpetbagging.
What’s all that got to do with the Internet, you ask? Exactly. The Internet was merely the sexy word, the come-hither, the bright idea at the top of the pyramid. Sure, there were and still are lots of entrepreneurs creating vibrant online businesses. But the Internet was not born to support the kind of global economic boom that venture capitalists envisioned. And by turning its principal use from socializing to monetizing, business went against the Internet’s very functionality.
People doing what comes naturally online-such as sending messages to one another-doesn’t generate revenue. The object of the game, for Internet business, was to get people’s hands off the keyboard and onto the mouse. Less collaboration, more consumption. Sites were designed to be “sticky” so people wouldn’t leave. And “information architecture” turned into the science of getting people to click on the Buy button.
Anyone logging on to the Internet for the first time in the year 2000 encountered a place very different from the interactive playground of 10 years earlier. Browsers and search engines alike were designed to keep users either buying products or consuming commercial content. Most of those helpful hackers were now vested employees of dot-com companies. And most visions of the electronic future had dollar signs before them.
But the real Internet was hiding underneath this investment charade the whole time. It was a little harder to find, perhaps, and few in the mainstream press were writing about it anymore. Nevertheless, plenty of people were still sharing stories, e-mailing relatives, finding new communities, and educating themselves.
This is why so many of the business schemes were doomed to fail. The behavior control being implemented by more nefarious online merchants, the mercenary tactics of former hackers, and the commercial priorities of the Internet’s investors were a blatant contradiction of the Internet’s true nature. Sure, the Internet could support some business guests, the way a tree can support some mushrooms at its base and a few squirrels in its branches. But businesses attacked the Internet like men with chain saws. They needed to be rejected.
The inevitable collapse of the dot-com pyramid was not part of some regular business cycle. And it most certainly was not the collapse of anything having to do with the Internet. No, what we witnessed was the Internet fending off an attack. It’s no different from when the government abandoned the Internet in the ’80s, after scientists online began talking about science fiction instead of defense contracts. The Internet never does what it’s supposed to do. It has a mind, and life, of its own. That’s because we’re alive, too.
Now that the Internet’s role in business has faded into the background, the many great applications developed to make our lives better are taking center stage. They are compelling, and surpass some of our wildest dreams of what the Internet might someday achieve. This past spring, for example, as one dot-com after another was folding, M.I.T. announced a new Web curriculum. This leading university promised that, over the next 10 years, it will carry online the lecture notes, course outlines, and assignments for almost all of its 2,000 courses in the sciences, humanities, and arts. Instituting a policy that would make an Internet investor shudder, M.I.T. plans to release all of this material, to anyone in the world, for free.
Or have a look at Blogger. It’s not just a Web site; it’s also a set of publishing tools that allows even a novice to create a Weblog, automatically add content to a Web site, or organize links, commentary, and open discussions. In the short time Blogger has been available, it has fostered an interconnected community of tens of thousands of users. These people don’t simply surf the Web; they are now empowered to create it.
Taking their cue from old-school Internet discussion groups like Usenet, Web sites such as MetaFilter let people begin discussions about almost anything they’ve found online. Each conversation begins with a link, then grows as far as its participants can take it. This is the real beauty of hypertext, and it’s finally catching on. Although hackers have used bulletin board interfaces on sites such as Slashdot since the Web’s inception, more commercially minded endeavors (e.g., Plastic) are adopting the same model to generate dialogues about culture and media.
On Yahoo! the biggest growth area is conversation. Yahoo! Groups, a set of bulletin board discussions and mailing lists, contains thousands of the best discussions happening online-and almost all of them have been started by real people. Based on an old but still widely used style of e-mail conversation called Listserv, it allows group members to read postings and add to the conversation without ever opening their browsers. Some of these special-interest groups are narrowcast to a degree possible only on a global network where people interested in anything from absinthe drinking to zither tuning can find one another across great distances.
And now that international trade and open markets are no longer the Internet’s chief global agenda, more humanitarian efforts are taking shape. Back in 1999, my friend Paul Meyer helped launch Internet Project Kosovo (IPKO) just days after NATO stopped shelling the Serbs. A single satellite dish let Albanian refugees find lost family members, and enabled aid agencies to allocate their resources. Today, Meyer and others are helping people in this and other war-torn and developing regions to network, and even open businesses.
For those whose refugee status ended long ago, Ellis Island has teamed with the Mormon Church to create a database containing arrival records for the 22 million immigrants who came through the New York port between 1892 and 1924. Linked databases, accessible to anyone via the Internet. Is this starting to sound familiar?
Or remember how the Internet was supposed to provide us with alternative sources of news and information? Although it was almost lost under the avalanche of content during the dot-com gold rush, AlterNet has emerged as a vibrant source of news and opinions you won’t see in your evening paper anytime soon. It’s the ultimate alternative newsweekly, available on the Web or by e-mail, using the Internet to collect and syndicate content from sources that just couldn’t get published any other way. And it’s free.
It’s not that the original Internet community went into some sort of remission. No, not at all. While e-commerce customers were waiting for return authorization numbers for misordered merchandise from Pets.com, the participants in AOL’s chatrooms were exchanging tips on caring for their Chihuahuas. While DoubleClick was reckoning with plummeting click-through rates on its banner ads, the personal ads in the Nerve singles classifieds were exploding. While the value of many E*Trade portfolios was falling into the red, people who’d never sold anything before were making money peddling items through the auctions on eBay.
Likewise, as headlines panicked investors about the failure of broadband, the massive communities built on IRC chat channels and other early live networking platforms were finding new, more advanced avenues for social and intellectual exchange. For-profit streaming media companies like Icebox may have failed, but the streaming technologies they used have survived and flourished as social tools such as iVisit and NetMeeting. And while the client lists of business-to-business service companies have shrunk, peer-to-peer networks, from Napster to Hotline, still grow in popularity and resist all efforts to quell the massive exchange of data, illegal or not.
In fact, the average American home now has more information and broadcast resources than a major television network newsroom did in the ’70s. A single Apple laptop is a video production studio, allowing even for the complex editing of independent films. Add a fast Internet connection, and a home producer can broadcast around the globe. My own Aunt Sophie, armed with a scanner and e-mail account, has inundated the family with photos of all our relatives’ new babies.
Independent radio stations run through DSL and cable modems out of studio apartments around the world find loyal audiences through Shoutcast and other amateur media networks. And, as the word amateur suggests, these stations are born out of love for a particular genre of music. They allow aficionados from anywhere to enjoy their favorite styles-from raga to reggae-round the clock.
The early Internet was often compared to the Wild West, an anarchic realm where a lone hacker could topple any empire, and that spirit of independence still dominates the culture of the interactive space. Any group or individual, however disenfranchised, can serve as the flash point for an extraordinarily widespread phenomenon.
Online sensations, such as the spoof of the Japanese video game at All Your Base Are Belong to Us! and the parody of Budweiser’s “Wassup?” commercial at Budwizer.com: Wassup Page, are launched by teenagers and distributed by e-mail to millions of office cubicles, eventually finding their way to the evening news. Think about it: The mind-melding of some 14-year-old kid and his computer (such as Neil Cicierega, who created the brilliant parody of consumer culture called Hyakugojyuuichi!!) becomes a conversation piece around the watercooler in thousands of offices all over the world. Powerful stuff.
It gets better. Thousands of hackers worldwide still represent a threat to major software companies, the DVD industry, and any corporation whose interests rely on closed-source computer code or encrypted files. No sooner is a new closed standard released than it is decoded and published by a lone hacker-or by a team of hackers working in tandem from remote and untraceable locations. Activists of all stripes have also seized upon the Internet to cultivate relationships across vast distances and promote new alliances between formerly unaffiliated groups. The Internet-organized demonstrations against World Trade Organization meetings in Seattle and Quebec are only the most notable examples of such networking.
In spite of the many efforts to direct its chaotic, organismic energy toward the monolithic agenda of Wall Street, the Internet can’t help but empower the real people whose spirit it embodies. I’ve mentioned only a few of the thousands of equally vital new buds blooming on the Internet today. They thrive because they promote the life of the Internet itself. They are not parasites but fruit, capable of spreading their own seeds and carrying the Internet’s tendrils even further. They are the Internet.
They share the very qualities that make the Internet so compelling and valuable: transparency, participation, openness, and collaboration. Theirs are the ideals and communities that allowed the Internet to fend off efforts to harness its power for a single, selfish objective. They are also what will keep the Internet resilient enough to withstand the next attack.
So do not mourn. Rejoice. While you may never be able to sell that great dot-com name or make a bundle on that tech stock you bought last year, you’re getting to participate in something that no civilization in the history of the planet has ever had the privilege of experiencing until now: the Internet.
February 18, 2016
The Atlantic: Twitter Is Not a Failure
To listen to Wall Street tell the story, Twitter is an abject failure. The stock is down more than 50 percent since co-founder Jack Dorsey took over as CEO last year. User growth and revenue prospects have stagnated, and investors see little chance of a major turnaround.
Yet only in the twisted logic of the startup economy could a company with around $500 million of revenue per quarter—and more, most recently—be called a failure. That’s half a billion dollars for a tiny application that simply lets people send out 140 characters to each other. The economic activity it has generated is nothing short of miraculous.
But that’s not enough for investors who expect to recoup 100 or even 1,000 times their original investment in the company. To do that, Twitter must grow. Somehow, it must turn itself from a simple, popular, and profitable way for more than 300 million people to broadcast messages into something still bigger—even if it has to risk killing what people love about Twitter in order to do so.
This is why I couldn’t help but grimace that morning I saw Twitter’s founders smiling on the floor of the New York Stock Exchange as the company celebrated its IPO and each of them became billionaires. Among them, these guys had upended journalism with Blogger, and credit with PayPal and Square. Here they were throwing in with the biggest industry of them all. When you get to ring the opening bell on the exchange and bask in the applause of the traders on the floor, it’s not because you have “disrupted” something. It’s because you have confirmed that—at least for a few—the game is still working. As the dealer is sure to cry out at the casino for all to hear, “We have a winner!”
But becoming such a winner—even playing the startup game to begin with—condemns the founders of a company to chase growth above all else. That’s the core command of the highly accelerated digital economy.
This is why a company like Uber can’t simply be satisfied helping people get rides. It must instead establish a monopoly in the taxi business so it can “pivot” to another vertical such as delivery services, logistics, or robotic transportation. Airbnb can’t just help people find places to stay, but must colonize city after city and deregulate its entire sector. A social media platform like Facebook must pivot to become a data miner; a messaging app like Snapchat must try to become a news service; even a giant like Google must accept that its once-inspiring stream of innovations pales in comparison to what it can earn as a new holding company, Alphabet.
For Twitter, this command means finding a way to grow a business that may already be full-grown. What if half a billion dollars a quarter really is all the world wants to spend on tweets? But that is not an option. Instead, the company must pivot toward new potential growth areas, at the expense of the market it already has.
And so Twitter users are confronted with a news reader through which they’re supposed to glean the headlines. Or a new, annoying feature called “Twitter moments”—an algorithmically derived stream of greatest hits, which is little more than a thinly veiled opportunity to fold in “Sponsored Moments,” meaning commercial messages masquerading as organic content. Now the company is working on live-streaming video ads, again valuing growth over user experience.
Maybe it’s this very drive toward growth that is pushing users away. For the first time Twitter’s user base has begun to decline, from 307 million users down to 305 million. It’s just a tick, of course, but in the wrong direction.
If Twitter were to value the sustainability of its enterprise over the growth prospects of its shares, it wouldn’t have to invest so much of its revenue in new, outlandish features, and would have a lot more to show in profit. Heck, it might even be able to offer a dividend.
Last week, Dorsey told investors on his conference call that he wants Twitter to become “the planet’s largest daily connected audience.” That’s supposed to give them hope for the future. But when the hope of a company is based on it becoming the biggest thing in the whole world, chances are the opportunity for genuine prosperity has already been lost.
February 16, 2016
The Malfunctioning Tech Economy
http://www.theguardian.com/technology/2016/feb/12/digital-capitalism-douglas-rushkoff
February 12, 2016
The Guardian: The Malfunctioning Tech Economy

Douglas Rushkoff emerged as a media commentator in 1994 with his first book, Cyberia. His debut examined “the early psychedelic, rave roots of digital technology. I was trying to infer what a digital society might be like given the beliefs of these people,” he tells me during a phone interview from his home in Hastings-on-Hudson, New York.
He has published 10 books detailing an increasingly fierce critique of digital society. Along the way Rushkoff has coined terms that have slipped into the lexicon such as “digital natives”, “social currency” and “viral media”. He has also made several documentaries and written novels both graphic and regular; consulted for organisations from the UN to the US government and composed music with Genesis P-Orridge. In 2013 MIT named him the sixth most influential thinker in the world, sandwiched between Steven Pinker and Niall Ferguson.
His latest book, Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity, is published by Portfolio Penguin on 3 March.
Was the topic of this book something you’d been pondering for a while, or was it inspired by the Google bus protests in San Francisco?
Actually the germ of the idea was when in 2000 AOL announced they were buying Time Warner, which was a huge deal. It was the moment where I realised that digital businesses were not disrupting the underlying operating system of traditional corporate capitalism. The question I had been asking myself before that point was: will digital media ‘networkise’ capitalism or will capitalism commodify and destroy the internet? Initially, with people like Howard Rheingold and Stewart Brand the internet promised a retrieval of a 60s hippy communal approach to the world.
What do you find most objectionable about the kind of economy that technology appears to create?
What’s most pernicious about it is that we are developing companies that are designed to do little more than take money out of the system – they are all extractive. There’s this universal assumption that we have to turn working currency into share price.
You call this the “growth trap”?
The growth trap is the assumption of business that growth and health are the same thing – and I understand how they got that way. When you have a debt-based monetary system, it has to pay back to the central banks more than was borrowed, and that requires growth. So if you have a currency that requires growth in order to have value, you’re going to have all these businesses biased towards growth rather than anything else.
For example?
Uber has nothing to do with helping people get rides in towns. Uber is a business plan. It’s a platform monopoly getting ready to leverage that monopoly into another vertical whether it be delivery, drones or logistics. The prosperity of all the people who used to be in the cabbie industry ends up sacrificed to the growth of this company. Corporations are like these obese people, they suck money out of our economy and store it in the fat of share price. That’s not business, that’s value extraction. They take all the chips off the board.
You’re an advocate of local currencies and bartering. Do you see “sharing economy” platforms such as Airbnb as their internet manifestation?
Yes and no. Initially they seemed to be leaning in the right direction; they appeared to be encouraging peer-to-peer exchange, which is what we need the ability to do: I want to buy from you, you want to sell to me, but without some big corporation being involved. The real problem is they end up taking too much venture capital, and then the money people say you’ve got to extract more from that transaction – you can’t just take 5% for your little app, you should be taking half. So the young developer is forced to pivot from whatever the original idea was to become a monopoly that allows the company to reach a sellable event – an IPO or an acquisition – in order for the original investors to get 100 times their initial investment. Anything less than that is a loss for them; they need a home run.

Douglas Rushkoff speaks at Occupy Wall Street in Zuccotti Park. Photograph: Alamy
You left Facebook in 2013. How is that working out for you?
Professionally, I’m thinking it may be good for one’s career and business to be off social media altogether. Chris Anderson was wrong. “Free” doesn’t lead to anything but more free. Working for free isn’t leverage to do a talk for loads of money; now they even want you to talk for free. What am I supposed to do? Join YouTube and get three cents for every 100,000 views of my video? That is crap; that is insane!
So business-wise I’m thinking that every time I post an article summarising what my book is about I’m hurting the sales and I end up delivering my ideas in a piecemeal, context-less fashion which ends up communicating less. And it makes my ideas much more easily applied for evil by corporations. That’s the lesson I should have learned in 1994 when I published Media Virus and my concept got turned into “viral marketing”, which took a sliver of an idea and used it for pernicious applications.
I hope you don’t regard this interview as part of that process.
Not at all, but if I write a piece for someone, they ask: “Are you gonna tweet it? Facebook it? Are you going to put it on your blog? Are you RSSing that blog? Do you have a newsletter?” Oh my God, I became an author to sit alone and write ideas. It used to be when you finished a book it would be a celebration. Now it’s when the work starts. It’s torture.
You’re an established writer, but social media can be useful to someone just starting out.
Maybe I’m unfair. I’m sure there is a way of using Facebook as a ladder to get to somewhere else. But also knowing what Facebook does behind the scenes, I thought it was bad digital hygiene to encourage people to “like” me and make them more vulnerable to nasty things.
What kind of nasty things?
They’ll get marketed to. Facebook will market you your future before you’ve even gotten there, they’ll use predictive algorithms to figure out what’s your likely future and then try to make that even more likely. They’ll get better at programming you – they’ll reduce your spontaneity. And they can use your face and name to advertise through you, that’s what you’ve agreed to. I didn’t want Facebook to advertise something through me as an influencer where my every act becomes grist to marketing.
Do you ever feel like you’re shouting into the abyss? Most people are relaxed about the levels of surveillance and tracking that happen on the internet. They enjoy and use the services too much to care …
I’m less frustrated by people’s blindness to the problem than by their blindness to the solutions – by how easy it is to develop local currencies, to use alternative websites, to make simple investments in their communities rather than in far-flung mining companies. People don’t realise how much power they have. And that’s partly because the real world has been dwarfed by this digital simulacrum, which seems much more important than our reality but it’s not – it needs to be in service of our reality.
Is it true that in the early 90s your publishers cancelled your first book Cyberia because they thought the internet wouldn’t last?
I finished it in 1992, but the publisher believed the net would be over by 1993, so they cancelled it. I sold it instead to HarperCollins – a Rupert Murdoch imprint – and took a whole load of grief from my leftie friends.
You’ve been credited with coining the term “digital natives” – saying they are better equipped to navigate the current landscape. Is it not harder for them since they don’t have an experience of anything pre-Google, pre-smartphone etc?
Originally I thought they could navigate it better and that my generation were the immigrants. I think they have more facility with these networks and platforms as they are designed, but less insight into the fact that they are designed environments. They don’t see how those environments are tilted towards extracting value from them. They could benefit from engaging with those of us who saw how those networks were put together. That’s why I wrote the book Program or Be Programmed – if you don’t know what a piece of software is for, the chances are you are being used by it.
Do you still advocate taking a digital sabbath?
I came up with this thing which I now don’t like: the digital sabbath. It feels a little forced and arbitrary, and it frames digital detox as a deprivation. I would much rather help people learn to value looking into other people’s eyes. To sit in a room talking to people – I want people to value that, not because they aren’t being interrupted by digital media but because it’s valuable in its own right.
The post The Guardian: The Malfunctioning Tech Economy appeared first on Rushkoff.
January 8, 2016
How the Digital Media Environment Enforces Boundaries
In the 1980s, the ultimate television president, Ronald Reagan, went to Berlin and implored Mr. Gorbachev to “tear down this wall.” Thanks to the global spectacle of the electronic age, as well as the unifying image of the earth from space, we were on our way to becoming one world. For better and for worse, both the spirit of kumbaya and the new power of the global market were in full force. This was utterly consistent with the media landscape of that society.
Today, the ultimate Internet candidate, Donald Trump, offers not to tear down a wall but to build one between the United States and Mexico. Thanks to the discrete bits and binary logic of the digital age, as well as the frightfully alienating spectacle of beheadings on social media, we are becoming obsessed with divisions and identification. For better and for worse, both the spirit of decentralization and the latent power of nationalism are in full force. This is utterly consistent with the media landscape of our society.
Consider the current argument over Ted Cruz’s status as a “natural born citizen.” No matter how disingenuously the question was raised, it proved wiggly enough to bring Harvard constitutional scholar Laurence Tribe to explain on CNN that “the Supreme Court has never fully addressed the issue one way or the other.” Even though Tribe believes Cruz is eligible to run, he nevertheless wants this grey area to be rendered in black and white.
This is a digital-style problem. I don’t mean it’s caused by digital media so much as reflective of the qualities, the biases, of the digital media environment in which we live.
For just one example, as we transitioned from emulsion film to digital photography and projection, we replaced smooth, random specks of silver with discrete pixels of numerically rendered tints. Each pixel required the computer to decide which color to assign it. Back when there were only 16 colors, that was a very crude estimate. Is it blue or purple? Whichever is closest.
Even with millions of colors and retina-display density, the decision must be made. Definition is forced, and once the decision is made, fidelity is assured forever more. Everything has been made discrete (not discreet, but distinct).
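The quantization decision described above amounts to a nearest-neighbor search: snap each pixel to whichever palette entry is closest. Here is a minimal sketch; the four-color palette and the sample pixel are illustrative values, not drawn from any actual display system.

```python
def nearest_color(pixel, palette):
    """Return the palette entry closest to the pixel (squared Euclidean distance in RGB)."""
    return min(
        palette,
        key=lambda c: sum((p - q) ** 2 for p, q in zip(pixel, c)),
    )

# An illustrative 4-color palette.
palette = [
    (0, 0, 255),    # blue
    (128, 0, 128),  # purple
    (255, 0, 0),    # red
    (0, 255, 0),    # green
]

# A bluish-purple pixel: the computer must decide, blue or purple?
# Whichever is closest wins, and the ambiguity is gone.
print(nearest_color((90, 0, 200), palette))  # → (128, 0, 128)
```

However fine the palette becomes, the same forced choice is made for every pixel; the grey area is resolved one way or the other.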
That’s why we’re either Americans or Mexicans, Canadians or natural born citizens. Red states or blue states. Where pixels are getting mixed up, well, that’s where we have to build better walls. Get Supreme Court decisions that something is one way or the other. All the wiggle room, the undefined nooks and crannies that may have created ambiguity but also helped soften the edges of our societies, is taken away.
I was thinking our goal should be to re-establish the ambiguity — find new tolerance for ill-defined and undefined places on the spectrum. But even in those places, like the increasingly nuanced definition of gender, most are gravitating toward ever more specific names for their sense of self.
So now I’ve started to wonder if it’s better to push through. Maybe forcing definitions, as our digital environment seems to be doing, will lead to more granular definitions and categories. But each time we repeat this process, we will also be forced to come to terms with the arbitrary nature of all these categories and distinctions. Each one is a compromise, no matter how many decimal places we use.
The post How the Digital Media Environment Enforces Boundaries appeared first on Rushkoff.
January 7, 2016
Fork the Economy
I’ve given up on fixing the economy. The economy is not broken. It’s simply unjust. There’s a difference.
We have to stop looking at our economy as a broken system and see it instead as one that is working absolutely true to its original design. It’s time to be progressive — and this means initiating systemic changes.
For example, Bernie Sanders’ well-meaning calls to rein in the banking industry by restoring the Federal Reserve’s function as a “regulatory agency” reveal the Left’s inability to grasp the true causes of today’s financial woes. We are not witnessing capitalism gone wrong — an otherwise egalitarian currency system has not been corrupted by greedy bankers — but, rather, capitalism doing exactly what it was programmed to do from the beginning. To fix it, we would have to dig down to its most fundamental code, and rewrite it to serve people instead of power.
First off, the role of the Federal Reserve was never to serve as an “agency.” It’s not like the Environmental Protection Agency, which is charged with regulating corporate destruction of the natural world — however woefully it may be carrying out that purpose. Rather, the Fed is a private corporation — a banker’s bank owned by the banks — created to guarantee the value of currency. It was built to serve the dollar and maintain its value by fighting inflation. When the Fed is feeling magnanimous, it can also lend extra money into existence, in the hope that it will be invested in enterprises that employ people.
The actions of the Fed, however, are limited by the way our money, central currency, was designed to work. It was developed back before the Industrial Age, as a waning European aristocracy sought to stem the rise of the merchant middle class. Small merchants were getting rich for the first time since feudalism began, thanks to the spread of the peer-to-peer marketplace and its ingenious new currency system of grain receipts and market money.
Photo credit: epicharmus via Foter.com / CC BY.
At the beginning of the market day, a baker could put receipts for bread into circulation by purchasing his weekly supplies. Those receipts could be spent on other items until a receipt holder actually needed bread, and cashed it in. Other moneys were based on stored grain or hay. They were created not for savings or accumulation, but to promote transactions.
One by one, European monarchs outlawed these local currencies and implemented central currencies that could only be lent into existence, at interest. If a business wanted to use money, it would have to borrow it from the central bank, at interest. This new system helped the rich maintain their exclusivity over wealth. They could get richer simply by being rich.
The monetary system was designed not to help people create and exchange value, but rather to extract value from anyone hoping to transact. It was not designed to promote circulation, but to serve as a drag on circulation.
Making matters worse, central currency requires an economy to grow — and to do so faster and faster. If, for every $100,000 lent into circulation, $200,000 has to eventually be paid back, then where does the other $100,000 come from? Someone has to borrow or earn it.
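A rough sketch of that arithmetic shows the treadmill. Under the purely illustrative assumption that all money is lent into existence at 7% annual interest, the shortfall between debt and money in circulation can only be closed by new borrowing, which itself bears interest:

```python
# Illustrative sketch: if every dollar in circulation is lent into
# existence at interest, aggregate debt always exceeds the money supply.
# The 7% rate and 10-year horizon are arbitrary assumptions.

principal = 100_000          # money initially lent into circulation
rate = 0.07                  # annual interest (illustrative)

money_supply = principal
debt = principal

for year in range(1, 11):
    debt *= 1 + rate                  # interest accrues on all outstanding debt
    shortfall = debt - money_supply   # money owed that does not yet exist...
    money_supply += shortfall         # ...so it must be borrowed into existence,
    debt += shortfall                 # adding new principal to the debt pile

print(f"after 10 years: supply ${money_supply:,.0f}, owed ${debt:,.0f}")
```

The gap never closes: each round of new borrowing restores the shortfall, so the economy must keep growing just to service the last round.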
Now this scheme works fine as long as the economy is growing — as the economies of the colonial powers did through their conquest of the world, and as America’s managed to do through corporate expansion in the decades following WWII. But our ability to grow has reached its limits. There are no more regions to conquer or developing nations to exploit. Efforts to escape into outer space notwithstanding, our planet has been stretched beyond its carrying capacity for additional extraction and growth.
We are moving toward an economic plateau; but, while a steady state economy of slow or no growth is good for people and planet, it is utterly incompatible with the money system on which our economy is still based.
Making matters worse, in the digital age, we have accelerated our stock markets with high-frequency trading and our business landscape with steroidal startups and ruthless platform monopolies from Amazon to Uber. These companies are valued less for their ability to turn a profit than for their prospects of getting acquired or reaching IPO — and paying back the institutions that lent them their original capital.
No, charging the Fed with fixing the problems of capitalism is like asking an oil company to help get us off fossil fuels. It’s simply the wrong tool for the job.
As I’ve argued in my upcoming book, Throwing Rocks at the Google Bus, we are running a 21st-century digital economy on a 13th-century, printing-press-era operating system. The opportunity of a digital age, and the sensibilities it brings, is to reprogram money to favor transaction over accumulation — flow over growth.
Photo credit: 401(K) 2013 via Foter.com / CC BY-SA.
This means experimenting with new, frictionless forms of exchange — from local currencies that increase circulation 10-fold over bank-issued money to Bitcoin, which verifies transactions without the need for an expensive central authority. Already, we see successful implementations of alternative monetary systems not only in progressive coastal cities, but also former industrial cities of the steel belt. Online “favor banks” energize the exchange of goods and services in communities from austerity-paralyzed Greece to recession-devastated Lansing, Michigan. New, investor-proof co-ops — from window manufacturers in Chicago to software developers in New Zealand — consciously optimize for the flow of value through a network, rather than the extraction of value from it.
Platform cooperatives — such as the driver-owned, ride-sharing platform Lazooz — utilize the blockchain to assign ownership based on the number of miles driven. Even if the company follows Uber toward driverless vehicles, at least its workers will share in the future earnings their labor has created.
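The underlying idea can be sketched without any blockchain machinery: track each driver's miles, and treat ownership as each driver's share of the total, so later earnings split pro rata. This is a generic illustration, not Lazooz's actual token mechanism; all names and numbers below are hypothetical.

```python
from collections import defaultdict

class MileageCooperative:
    """Toy co-op ledger: ownership accrues in proportion to miles driven."""

    def __init__(self):
        self.miles = defaultdict(float)   # driver -> lifetime miles logged

    def log_trip(self, driver, miles):
        self.miles[driver] += miles

    def ownership(self, driver):
        """Driver's fractional share of the co-op."""
        total = sum(self.miles.values())
        return self.miles[driver] / total if total else 0.0

    def distribute(self, earnings):
        """Split earnings pro rata among driver-owners."""
        return {d: earnings * self.ownership(d) for d in self.miles}

coop = MileageCooperative()
coop.log_trip("alice", 300)
coop.log_trip("bob", 100)
print(coop.distribute(1000))  # → {'alice': 750.0, 'bob': 250.0}
```

Because the shares persist after the labor is done, drivers would keep collecting their fraction of earnings even if the fleet later drove itself.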
What distinguishes these experiments from traditional Leftism is that they are not attempting to compensate for the inequities of our economic system after the fact. They are not redistributing the spoils of corporate capitalism, as top-down enacted policies would do. Rather, they mean to distribute the means of production and the tools for exchange more widely. From Benefit Corporations to local crowdfunding, the best efforts at forging more equitable financial instruments are characterized by a willingness to reprogram business, currency, and exchange from the inside out.
That’s why, as we embark on another election year, we must stop looking toward candidates to tweak one knob or the other on our existing economy or monetary system. Replacing the members of the Fed won’t change the basic nature of the Fed any more than an incrementally more progressive tax code will change the extractive nature of central currency.
What those who hope to rein in the banking industry must do instead is break its monopoly over value creation and exchange by fostering competitive currencies, alternative corporate structures, worker-ownership, and restored respect for land and labor instead of just capital. If we can’t join ’em, then let’s beat ’em at their own game. We can make our own economy and money, too.
After all, it is a free market.
##
For an explanation of how we can reprogram the economy from the inside out, check out Rushkoff’s upcoming book, Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity, coming March 1 and available for pre-order today. He will also be speaking at SXSW, the 92nd St Y, and the SF Commonwealth Club.
via Shareable.net
The post Fork the Economy appeared first on Rushkoff.
January 4, 2016
Rebooting Work
Digital and robotic technologies offer us both a bounty of productivity as well as welcome relief from myriad repeatable tasks. Unfortunately, as our economy is currently configured, both of these seeming miracles are also big problems. How do we maintain market prices in a world with surplus productivity? And, even more to the point, how do we employ people when robots are taking all the jobs?
Back in the 1940s, when computers were completing their very first cycles, the father of “cybernetics,” Norbert Wiener, began to worry about what these thinking technologies might mean for the human employees who would someday have to compete with them. His concern for “the dignity and rights of the worker” in a technologized marketplace was decried as communist sympathizing, and he was shunned by most science and policy circles.
Although it may still sound like heresy today, Wiener realized that if we didn’t change the underlying operating system of our economy – the very nature and structure of employment and compensation – our technologies may not serve our economic prosperity as positively as we might hope.
As we wrestle with the bounty of productivity as well as the displacement of employees by digital technologies, we may consider the greater operating system on which they’re all running. If we do, we may come to see that the values of the industrial economy are not failing under the pressures of digital technology. Rather, digital technology is expressing and amplifying the embedded values of industrialism.
It’s time we have the conversation toward which Wiener was pushing us, and challenge some of the underlying assumptions of human employment. The current anxiety over the future of work may be inspired by the increasing processing power of computers and networks, or even the platform monopolies of Amazon and Uber. But it has its roots in mechanisms much older than these technologies – mechanisms set in motion at the onset of industrialism, in the 13th century.
Looked at in terms of human value creation, the industrial economy appears to have been programmed to remove human beings from the value chain. Before the Industrial Age, the former peasants of feudalism were enjoying a terrific economic expansion. Yes, in spite of the way they’ve been chronicled by Renaissance court historians, the very late Middle Ages were actually a boom time. The Crusaders had just returned from their global treks, having established trade routes through which the goods of many lands could travel. They also returned with new technologies for agriculture and trade, including the bazaar – a marketplace for the exchange of crafts, crops, grain and meat, which used new financial instruments such as grain receipts and market money.
But as the peasants got wealthy exchanging goods and services, the aristocracy got relatively poorer. So the aristocracy re-established control over the economy by outlawing market moneys and chartering monopolies with dominion over particular industries. Now, instead of making shoes himself, the local cobbler had to get a job at the officially chartered monopoly company. Thus what we think of as “employment” was born – less an opportunity than a restriction on creating value.
Instead of selling his shoes, the cobbler sold his hours – a form of indenture previously known only to slaves. Worse, his skills were not valued. The owners of proto-factories saw in industrial processes a way to hire cheaper workers, with less leverage against them. Why hire a skilled craftsman when you can break down the shoe-making into tiny steps, each capable of being taught to a day laborer in 15 minutes?
Viewed in this light, the Industrial Age may have had less to do with making products better or more efficiently than with simply removing human beings from the value equation and monopolizing wealth at the top. Automation reduced the economy’s dependence on the laboring classes. Those few tasks that still required humans could go to the lowest bidder – ideally in countries too far away for the human toll to be noticed by potential customers.
The only business priority these companies understood was growth. That’s largely because their own solvency depended on paying interest to the nobles chartering them and, later, to the banks financing them. But today, growth has become an end in itself—the engine of the economy—and humans have come to be understood as impediments to its functioning. If only people and our idiosyncratic demands could be eliminated, business would be free to reduce costs, increase consumption, extract more value, and grow.
This is one of the primary legacies of the Industrial Age, when the miraculous efficiency of machines appeared to offer us a path to infinite growth—at least to the extent that human interference could be minimized. Applying this ethos in a digital age means replacing the receptionist with a computer, the factory worker with a robot, and the manager with an algorithm. When digital companies disrupt an existing industry, they tend to offer just one new job for every 10 they render obsolete.
If we want a digital economy that gets people back to work, we have to program it for something very different. The word digital itself refers to the digits – the 10 fingers – that we humans use to build, to count, and to program computers in the first place. That we should now witness a renaissance in makers, crafts and artisanal production is no coincidence. The digital landscape encourages production from the periphery, lateral trade, and the distribution of wealth. Instead of depending on centralized institutions for sustenance, we begin to depend on one another.
Where the corporations of the past depended on government regulation to maintain their monopolies, today’s digital companies do it through the monopoly of the platforms themselves. Today’s digital behemoths are not factories but networks whose embedded programming controls the landscape on which interactions take place. In a sense, Uber is software designed to extract labor and capital (in the form of automobiles) from drivers and convert it into share price for its investors. It is not an opportunity to exchange value so much as to do the R&D for a future network of robotic cars, without even offering a share in the ownership.
Thankfully, the remedies are varied. Unlike the one-size-fits-all solutions of the Industrial Age, distributed prosperity in a digital age won’t scale infinitely. Rather, the solutions gain their traction and power by reconnecting people and rewriting business plans from the perspective of serving human stakeholders rather than abstracted share values.
Yes, on the surface most of them sound idealistic or even socialist, but they are being tried by companies and communities around the world, and with documented success. Among the many I explore in my upcoming book on the subject are letting employees share in increased productivity by reducing their workweek — at the same rate of pay. Or contending with overproduction by implementing a guaranteed minimum income. Or retrieving the Papal concepts of “distributism” and “subsidiarity,” through which workers are required to own the means of production, and companies grow only as large as they need to in order to fulfill their purpose. Growth for growth’s sake is discouraged.
Many companies today – from ridesharing app Lazooz to Walmart competitor WinCo – are implementing worker-owned “platform cooperatives” to replace platform monopolies, allowing those contributing land or labor to an enterprise to earn an ownership share equal to those contributing just capital.
Finally, distributing the spoils of distributed technologies means accepting the good news: there may simply be fewer employment opportunities for people. We must remember that employment may really just be an artifact of an old system – the reactionary move of a bunch of nobles who were afraid for people to create value for themselves.
Once we’re no longer conflating the idea of “work” with that of “employment,” we are free to create value in ways unrecognized by the current growth-based market economy. We can teach, farm, feed, care for and even entertain one another. The work challenge is not a problem of scarcity but a spoil of riches. It’s time we learn to deal with it that way.
Douglas Rushkoff is Professor of Media Theory and Digital Economics at Queens/CUNY, and the author of “Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity” (Portfolio, March 2016).
(published on Pacific Standard)
The post Rebooting Work appeared first on Rushkoff.


