Kindle Notes & Highlights
Read between March 2 - March 10, 2019
An important component of this entire process was the ability to “pay per click”—as opposed to paying based on the number of people who (theoretically) viewed your ad, as every other online advertiser did in the dot-com era. This was the second key innovation: with the GoTo model, an advertiser only “paid for performance.” If no one clicked on your ad, you paid nothing. This was a radical but extreme...
Two thousand two would become Google’s first profitable year, with $440 million in sales and $100 million in profits.16 By 2003, profits were more than $185 million and the AdWords program could boast more than 100,000 advertisers, all without a commensurate rise in Google’s head count, because the AdWords sales system was automated.17
Google can be thought of as a company born from two miracle inventions, one of which it came up with itself, and the other of which was cribbed from Overture. Definitively solving the problem of web search is obviously the miracle that has made the largest impact on our society.
by improving on Overture’s pioneering work with paid links, Google was able to achieve something just as amazing: it made the Internet profitable at scale and for the first time. Paid search would prove to be the greatest advertising engine yet devised by man. Furthermore, algorithmically served ads
This digital economy didn’t just flower on the marketing side of the equation, because Google had developed a way to monetize content as well. This was AdSense, which Google launched soon after AdWords. Google engineers dreamed up ways to syndicate text ads not just to major search sites and portals, but to the entire web itself. “The idea of putting ads on nonsearch pages had been floating around here for a long time,” Google executive Susan Wojcicki said later. Google already had basically the entire web in its index, so if it could find a way to match relevant ads to the content on other
...more
Launched in 2002, Gawker was a straight-up tabloid, covering the foibles of the New York media industry. “Nick had the brilliant insight that if you want to get people to read something, the easiest way is to write about them,” remembered Lockhart Steele, another early blogger whom Denton would eventually hire into the Gawker stable of writers.
Some credit can be given to Napster for opening these floodgates. All those tens of millions of users who traded MP3 files were proactively and spontaneously self-organizing and using their own libraries to create content for others. Napster was the first time mainstream web users saw the utility in producing, not just consuming, content.
From the days of the Netscape browser, users had used bookmarks and “favorites” to keep track of their favorite web pages. But what if you wanted to see what other people had bookmarked? Del.icio.us (launched in September of 2003) let you do just that, allowing users to discover cool new things on the web by sharing their bookmarks with each other, just as Napster had allowed them to exchange songs.
The new postbubble web was about the users and the content in equal measure. It was about spontaneous impulses like “sharing” and self-organizing schemes like “tagging” and taxonomies. It was about how the content created by and for the hoi polloi often ended up being more engaging and exciting than the content that was prepackaged or professionally produced.
The idea of collaborative effort and collective organization had long been a common practice in hacker and software development circles. Just as each of the hackers on w00w00 had pitched in to help Shawn Fanning refine Napster, groups of programmers often came together and formed communities around the development of “open source” projects like the Linux operating system. Far from being a case of “too many cooks in the kitchen” creating a muddled fiasco, open-source development proved that complete strangers could independently, and without much centralized coordination, come together to
...more
Cunningham is famous for coining “Cunningham’s Law,” which finds that “the best way to get the right answer on the Internet is not to ask a question, it’s to post the wrong answer.”
Wikis tapped into a powerful impulse of collective action. A few years later, an obscure entrepreneur would make use of this impulse to save his own struggling creation.
“Couldn’t total idiots put up blatantly false or biased descriptions of things, to advance their ideological agendas?” asked one of the leads of the original Nupedia project on internal Wikipedia message forums. “Yes,” replied a Wikipedia partisan, “and other idiots could delete those changes or edit them into something better.”
Wikipedia was a modern miracle and soon became one of the most trafficked websites in the world. Wales had originally intended the project to be a commercial one, supported by advertising. But when the contributors and editors revolted at the very suggestion of putting ads up on Wikipedia, Wales instead made the site into a nonprofit enterprise.
If Web 1.0 was about browsing stuff created by others, Web 2.0 was about creating stuff yourself. If Web 1.0 was about connecting all the computers in the world together, then Web 2.0 was about connecting all the people in the world together, via those interlaced computers. If the clarion call of Web 1.0 was the Netscape IPO, then the coming of age of Web 2.0 was Google’s IPO.
Both investors and entrepreneurs had been chastened by the bubble’s aftermath. Get Big Fast was no longer the strategic mantra; multimillion-dollar advertising campaigns and gaudy launch parties were out. Instead, Web 2.0 companies aimed at refining their products and services, carefully cultivating a user base through feature innovation and word-of-mouth discovery, all while focusing like a laser on issues such as reliability and scalability.
VC investment didn’t roar back in huge numbers because it didn’t have to. In the Web 2.0 era, you could create a service used by millions in a matter of months, and you could do so for pennies on the dollar—at least, compared to the dot-com era. The hangover from the bubble fallout meant that talented programmers could be hired on the cheap; the infrastructure glut left over from the global fiber buildout meant that bandwidth, storage and data costs were lower; and the tools developed during the bubble meant that you didn’t have to build a company from scratch anymore—
By some estimates, the cost of starting a web company had fallen by 90% in the few short years of the nuclear winter.15
The new Web 2.0 companies didn’t need to raise as much money and, unlike just a few years previously, none of them were in any hurry to go public. In the wake of the bubble bursting, a wave of scandals involving companies such as Enron and WorldCom had ushered in a new era of financial regulations. The Sarbanes-Oxley legislation especially meant that there were fewer advantages to going public and more incentives to stay private for as long as possible. Without the venture capitalists breathing down their necks for a financial “exit,” the Web 2.0 companies were more in control of their own
...more
Aside from push-button-easy uploading, the true brilliance of YouTube was the site’s second important focus: dead-simple sharing. After you posted a video to YouTube, you could simply share a link to your uploaded video, just like with Flickr.
YouTube achieved this success on a shockingly small amount of money. The company only ever raised $11.5 million, in two investment rounds. The fact that YouTube could serve video to the world from just a handful of servers (and some helpful content delivery networks in the background) was a powerful testament to the infrastructure the dot-com bubble had bequeathed to this new generation of startups.
YouTube was ground zero for things like that, for the birth of modern meme culture as well as the social media–celebrity ecosystem. The idea that random events or random people could “go viral” really entered the mainstream thanks to YouTube. “We are providing a stage where everyone can participate and everyone can be seen,” Hurley told the Associated Press in April of 2006.22 There was no greater Web 2.0 manifesto than that.
As would come out in subsequent litigation, the YouTube guys knew perfectly well that there was a ton of pirated material on their site. But they had learned the lessons of Napster. Napster had attempted to make the argument that it enjoyed legal immunity under the Digital Millennium Copyright Act as a neutral platform.
Service providers and platforms were protected as “safe harbors” under the law, provided they quickly and efficiently remove copyrighted material when notified. That was what had ultimately doomed Napster: it had never been able to take down 100% of the pirated files on its service. Five years on from Napster, might YouTube be able to find someone who could create a better system to remove illegally uploaded material—someone who had a mastery of algorithms, perhaps?
But Google’s decision to take on YouTube’s burden seemed downright crazy to a lot of people. Wasn’t Google paying a lot of money to basically assume a huge liability risk? It turned out that Google made one simple calculation when it purchased YouTube: in the broadband era, video was likely to become as ubiquitous on the web as text and pictures had always been. YouTube was already, in essence, the world’s largest search engine for video. In fact, it would eventually become the second-most-used search engine, period. With its stated mission to organize all the world’s information, Google
...more
Google was the savior Napster never had. It had the infrastructure to allow YouTube to scale up; it had the technical sophistication to keep YouTube on the right side of the law; it had the money to contest the legal battles; and—most important—it provided YouTube with the business model that would allow it to thrive. Those little text ads that Google had put all over the Internet? They could be used to monetize the videos on YouTube just as they could with any other type of content. As the years went by, the text ads could even morph into actual video ads—but algorithmically targeted and
...more
once Google came to the table with a willingness to share advertising revenue with rights holders, a lot of them (Viacom notwithstanding) were willing to play ball. At least Google/YouTube was offering Hollywood some kind of revenue stream. Digital revenue might not be as lucrative as the old analog revenue streams but, well, that was the Napster lesson, right? Better to take what you could get and embrace new distribution models rather than fight them.
entertainment industry was even now willing to buy into one of the key arguments Napster had tried to make only half a decade before: giving users a taste of your content online was actually great promotion! The phenomenon of Lazy Sunday had shown that. By 2008, when YouTube was streaming 4.3 billion videos per month (in the United States alone), many people—young people especially—were beginning to watch more video online than they were watching on traditional TV.23 For the first time, Hollywood stopped fighting disruption, and followed the changing tastes of their audience into a digital
...more
WEB 2.0 WAS ABOUT PEOPLE expressing themselves—actually being themselves, actually living—online. The last piece of the puzzle was simply to make the threads of all this social activity explicit.
Even as AOL the company began to crumble after the disastrous merger with Time Warner, AIM continued as a breakout success for one simple reason: it was a literal social graph, a tangible map of your online connections and relationships. Chatting on AIM became more popular than email, and your AIM screen name eventually gave you the ability to customize a rudimentary profile, turning it into a valuable online marker of identity.
The social graph was actually the great prize of Web 2.0. Others were only able to seize this prize because AOL dropped the ball. AIM eventually lost its relevance through benign neglect. “If AOL had 20/20 hindsight, maybe the story [of social networking] would have had a different ending,” says Barry Appelman, one of the AOL engineers who invented AIM.
The first modern social-networking site as we would recognize it today was SixDegrees.com. In 1996, a former lawyer and Wall Street analyst named Andrew Weinreich had an idea inspired by the popular notion that any single person on the planet can be connected to anyone else by around six steps of personal connections—“six degrees” of separation. If that was true, then the web was the perfect tool for mapping those connections.
It was the right idea, but as Weinreich would ruefully admit, “We were early. Timing is everything.”27 The site was expensive to operate in the dot-com days, and of course, there were no photos on the profiles. “We had board meetings where we would discuss how to get people to send in their pictures and scan them in,” Weinreich says.
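An aside on the idea itself: in graph terms, a person’s “degree of separation” from someone else is just the length of the shortest chain of friendships between them, which is exactly what a breadth-first search computes. Here is a minimal sketch of that notion (the names and the toy friendship graph are invented for illustration; this is not anything SixDegrees.com actually ran):

```python
from collections import deque

# Toy friendship graph: each person maps to the set of people they know.
# Purely illustrative data.
FRIENDS = {
    "ann":  {"bob", "cara"},
    "bob":  {"ann", "dave"},
    "cara": {"ann", "dave"},
    "dave": {"bob", "cara", "eve"},
    "eve":  {"dave"},
}

def degrees_of_separation(start, target):
    """Return the number of friendship hops from start to target,
    or None if no chain of acquaintances connects them."""
    if start == target:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        person, hops = queue.popleft()
        for friend in FRIENDS.get(person, ()):
            if friend == target:
                return hops + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, hops + 1))
    return None

print(degrees_of_separation("ann", "eve"))  # 3: ann -> bob -> dave -> eve
```

The sketch only shows that “six degrees” is a shortest-path claim; the hard part, for Weinreich as for everyone after him, was getting people to contribute the friendship data in the first place.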
It’s something of a universal phenomenon that we can probably all recognize from our own lives. When you’re between the ages of sixteen and twenty-four, you’re plugged into the zeitgeist. During that intellectually fecund period, you tend just to “get” things: the latest fashions, the coolest new music and films, the trends and jokes and ideas that are au courant. It’s almost like young people see the future before everyone else.
At Parker’s urging, Zuckerberg decided that Thefacebook shouldn’t just plan for the immediate future; it should plan for an exponential future. To prepare for the coming autumn and the anticipated influx of users, Thefacebook desperately needed new servers. Zuckerberg decreed that, rather than struggle to keep up, the site’s infrastructure should, from that point forward, be architected to anticipate ten times the number of users it was getting at any one moment. That would cost more money than Facebook was already generating. Zuckerberg and his family were forced to sink $85,000 into the
...more
But the main thing that affected Zuckerberg’s thinking was data. From the very first days, Zuckerberg was obsessed with watching how users actually used his site. While monitoring the behavior of his users, Zuckerberg was fascinated by the very real info his network could tease out, and how little tweaks he made to Facebook’s systems could affect user activity.
By the fall of 2005, fully 85% of American college students were members of Thefacebook and 60% returned to the site daily.38 Ninety percent logged in at least once a week.39 What product or service in any industry got used so obsessively? Parsing the server logs, Zuckerberg and the others could see user behavior that they termed “the trance.” Users would log on and then click and click and click and click, browsing people’s profiles for hours at a time. “Wanting to look people up is kind of a core human desire,” Zuckerberg said around this time. “People just want to know stuff about other
...more
Not being a natural entrepreneur—and then stumbling onto a great entrepreneurial insight—and then having the fortitude, and discipline, and strength of will to become the sort of person who can bring that insight to reality? To me,
humans are nothing more or less than highly social primates. Finding out what is happening with your friends and family is a core human desire, right smack in the middle of Maslow’s hierarchy of needs. Zuckerberg had once mused that someday somebody was going to make a community site that would satisfy the need to know what’s up with your friends—but for the entire planet. And when they did so, they’d be building an amazing company.
if you uploaded a photo with a friend in it, you could “tag” them and they would receive a notification that you had posted a photo of them online. Facebook Photos took off right away. Within three weeks, Facebook hosted more photos than Flickr.50 After a month, 85% of the service’s users had been tagged in at least one photo.51 Zuckerberg and the rest of the team were amazed that an arguably inferior product could so quickly unseat the incumbents. The secret sauce had to be the network effects.
“Watching the growth of tagging was the first ‘aha’ for us about how the social graph could be used as a distribution system,” Cohler says. “The mechanism of distribution was the relationships between people.”52
We’re monkeys that like to talk to each other—that like to see and be seen. When someone tagged you in a photo, how could you help but look? Again, the primary way Zuckerberg measured the success of Facebook was by monitoring how often users returned, and how much they clicked on when they did so. After photos, he saw that Facebook’s return traffic ramped up in a major way.
The theory was something like this: human society is all about that small group of people you know and care about. Facebook had succeeded in capturing that, harnessing that, replicating that (at least, for college students). If Facebook really had tapped into one of the most powerful human impulses among college kids, why couldn’t it appeal to everyone? A product like Microsoft Windows was used by almost everyone who owned a computer. Billions of users. But a product like Coca-Cola was known to almost every human being alive, was used by almost every person alive. Could Facebook and the social
...more
From the beginning of the web, all the way through the launch of Google AdWords, the Internet had been monetized on the premise of taking the guesswork out of advertising. Well, on Facebook people were using their real names. They were volunteering their likes and dislikes. You could actually get people to tell you if they were interested in your product or not. It was advertising’s holy grail.
He told people he was building Facebook for the long term. He still was nursing the crazy idea that Facebook could become a brand as ubiquitous as Coca-Cola. A billion dollars wasn’t cool. What would be cool? A billion users. “I don’t want to sell,” he told one of the more persistent executives looking to buy his company.58 “And anyway, I don’t think I’m ever going to have an idea this good again.”59
Zuckerberg and the others saw that the reason people got so sucked in to the site was that they had to surf around to find out what had changed on every friend’s profile page. Users seemed to be most interested in learning what was new. Heck, every time a user simply changed their profile picture, Facebook’s engineers could see in the logs that that led to an average of twenty-five new pageviews.61 If Facebook’s key value proposition was the ability to find out what was up with your loved ones, then maybe they could design a better delivery system for this information. This would become the
...more
the News Feed built on ideas that were already out there. Every user’s profile page would function as a glorified RSS feed, and the News Feed would collect all the updates, photos and status changes that your friends made in one central place—just like a feed reader collected blog posts. You wouldn’t have to visit profile page after profile page individually; you could just log in and Facebook would tell you what was new. It would all spool out in one long, reverse-chronological stream, just like a blog.
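To make the feed-reader analogy concrete, here is a minimal sketch of that collect-and-sort idea (a simplification in Python with invented names and toy data, not Facebook’s actual code):

```python
from dataclasses import dataclass

@dataclass
class Update:
    author: str
    timestamp: float  # e.g. a Unix timestamp
    text: str

def build_news_feed(me, friends_of, updates_by_author, limit=20):
    """Pull every friend's updates into one reverse-chronological stream,
    so the reader never has to visit profile pages one by one."""
    items = []
    for friend in friends_of.get(me, ()):
        items.extend(updates_by_author.get(friend, []))
    # Newest first, like a blog or an RSS reader.
    items.sort(key=lambda u: u.timestamp, reverse=True)
    return items[:limit]

# Toy data, purely illustrative.
friends_of = {"me": {"alice", "bob"}}
updates_by_author = {
    "alice": [Update("alice", 1002, "changed her profile picture")],
    "bob":   [Update("bob", 1001, "posted a photo"),
              Update("bob", 1003, "updated his status")],
}
for u in build_news_feed("me", friends_of, updates_by_author):
    print(u.timestamp, u.author, u.text)
```

Everything else layers on top of that basic loop: gather your friends’ updates, sort them newest-first, show the top of the stream.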
few years after the News Feed brouhaha, Digg would redesign its site and change its voting algorithms in a way that so angered users that they fled, en masse, to a Digg competitor named Reddit. To this day, Reddit is known as the “front page of the Internet,” the birthplace of memes and viral culture, while Digg, though still around, is nowhere near as relevant or well trafficked.
“If [News Feed] didn’t work,” Chris Cox says, “it confounded [Zuckerberg’s] whole theory about why people were interested in Facebook. If News Feed wasn’t right, he felt we shouldn’t even be doing [Facebook itself].”68
Zuckerberg is the twenty-three-year-old who turned down a billion dollars because he thought he was sitting on an idea that was even bigger. The gamble has paid off (at the time of this writing) to the tune of nearly half a trillion dollars in market value. It helped that advertising against everyone’s personal lives also proved to be lucrative, and that the reverse-chronological scrolling mechanism of the News Feed proved to be perfectly suited for the coming age of mobile computing. But none of that would have been possible had Zuckerberg not matured into the sort of businessman who could
...more