Click Here to Kill Everybody: Security and Survival in a Hyper-connected World
Kindle Notes & Highlights
1%
Your car used to be a mechanical device with some computers in it. Now, it is a 20- to 40-computer distributed system with four wheels and an engine. When you step on the brake, it might feel as if you’re physically stopping the car, but in reality you’re just sending an electronic signal to the brakes; there’s no longer a mechanical connection between the pedal and the brake pads.
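A note on the passage above: “sending an electronic signal” in a modern car usually means broadcasting a message on an in-vehicle network such as the CAN bus, which one of the brake-related computers then acts on. Below is a minimal sketch of that idea using the python-can library; the arbitration ID, the one-byte pedal encoding, and the virtual bus are made-up illustration details, not how any real vehicle encodes braking.

    # Sketch of a brake request as a CAN frame (made-up ID and encoding).
    import can  # third-party package: python-can

    BRAKE_REQUEST_ID = 0x120  # hypothetical arbitration ID, for illustration only

    def send_brake_request(bus, pedal_position: float) -> None:
        """Broadcast the pedal position (0.0 to 1.0) for a brake ECU to act on."""
        level = int(max(0.0, min(1.0, pedal_position)) * 255)  # clamp and scale to one byte
        frame = can.Message(arbitration_id=BRAKE_REQUEST_ID,
                            data=bytes([level]),
                            is_extended_id=False)
        bus.send(frame)

    if __name__ == "__main__":
        # A virtual bus stands in for the car's real network in this sketch.
        with can.Bus(interface="virtual", channel="demo") as bus:
            send_brake_request(bus, pedal_position=0.6)  # 60% pedal travel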
1%
“smart” is the prefix we use for these newly computerized things that are on the Internet, meaning that they can collect, use, and communicate data to operate. A television is smart when it constantly collects data about your usage habits to optimize your experience.
2%
As the cost of computerizing devices goes down, the marginal benefit—in either features provided or surveillance data collected—necessary to justify the computerization also goes down.
3%
we’ve generally left computer and Internet security to the market. This approach has largely worked satisfactorily, because it mostly hasn’t mattered. Security was largely about privacy, and entirely about bits. If your computer got hacked, you lost some important data or had your identity stolen. That sucked, and might have been expensive, but it wasn’t catastrophic. Now that everything is a computer, the threats are about life and property. Hackers can crash your car, your pacemaker, or the city’s power grid. That’s catastrophic.
5%
Software is poorly written because, with only a few exceptions, the market doesn’t reward good-quality software. “Good, fast, cheap—pick any two”; inexpensive and quick to market is more important than quality. For most of us most of the time, poorly written software has been good enough.
5%
Companies don’t reward software quality in the same way they reward delivering products ahead of schedule and under budget. Universities focus more on code that barely works than on code that’s reliable. And most of us consumers are unwilling to pay what doing better would cost.
5%
Microsoft spent the decade after 2002 improving its software development process to minimize the number of security vulnerabilities in shipped software. Its products are by no means perfect—that’s beyond the capabilities of the technologies right now—but they’re a lot better than average. Apple is known for its quality software. So is Google. Some very small and critical pieces of software are high quality. Aircraft avionics software is written to a much more rigorous quality standard than just about everything else. And NASA had a famous quality control process for its space shuttle software. …
5%
The base Internet protocols were developed without security in mind, and many of them remain insecure to this day. There’s no security in the “From” line of an e-mail: anyone can pretend to be anyone. There’s no security in the Domain Name System that translates Internet addresses from human-readable names to computer-readable numeric addresses, or the Network Time Protocol that keeps everything in synch. There’s no security in the original HTTP protocol that underlies the World Wide Web, and the more secure “https” protocol still has lots of vulnerabilities.
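To make the “From” line point concrete: neither SMTP nor the mail message format verifies the sender, so a client can put any address it likes in that header, and catching the lie is left to later add-ons like SPF, DKIM, and DMARC on the receiving side. A minimal sketch with Python’s standard library follows; the relay host mail.example.com and both addresses are placeholders.

    # Sketch: the message format accepts any claimed "From" address; SMTP itself
    # does not authenticate it. Receivers may still reject it via SPF/DKIM/DMARC.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "ceo@example.com"    # an arbitrary, unverified claim
    msg["To"] = "someone@example.org"
    msg["Subject"] = "No authentication in this header"
    msg.set_content("The From line above is whatever the sender chose to type.")

    # Placeholder relay; with a real SMTP server this message would be accepted as-is.
    with smtplib.SMTP("mail.example.com") as server:
        server.send_message(msg)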
8%
in the 1990s, cell phones were designed to automatically trust cell towers without any authentication systems. This was because authentication was hard, and it was hard to deploy fake cell phone towers. Fast-forward a half decade, and stingray fake cell towers became an FBI secret surveillance tool. Fast-forward another half decade, and setting up a fake cell phone tower became so easy that hackers demonstrate it onstage at conferences.
8%
In the early 1990s, researchers would disclose vulnerabilities to the vendors only. Vendors would respond by basically not doing anything, maybe getting around to fixing the vulnerabilities years later. Researchers then started publicly announcing that they had found a vulnerability, in an effort to get vendors to do something about it—only to have the vendors belittle them, declare their attacks “theoretical” and not worth worrying about, threaten them with legal action, and continue to not fix anything. The only solution that spurred vendors into action was for researchers to publish details …
9%
Our computers and smartphones are as secure as they are because there are teams of security engineers dedicated to writing patches. The companies that make these devices can support such big teams because they make a huge amount of money, either directly or indirectly, from their software—and, in part, compete on its security. This isn’t true of embedded systems like digital video recorders or home routers. Those systems are sold at a much lower margin and in much smaller quantities, and are often designed by offshore third parties. Engineering teams assemble quickly to design the products, …
9%
Even when manufacturers have the incentive, there’s a different problem. If there’s a security vulnerability in Microsoft operating systems, the company has to write a patch for each version it supports. Maintaining lots of different operating systems gets expensive, which is why Microsoft and Apple—and everyone else—support only the few most recent versions.
10%
Before everything became a computer, dangerous devices like cars, airplanes, and medical devices had to go through various levels of safety certification before they could be sold. A product, once certified, couldn’t be changed without having to be recertified. For an airplane, it can cost upwards of a million dollars and take a year to change one line of code.
10%
coffeepot manufacturers and their ilk don’t have experience with security researchers, responsible disclosure, and patching, and it shows. This lack of security expertise is critical. Software companies write software as their core competency. Refrigerator manufacturers, or refrigerator divisions of larger companies, have a different core competency—presumably, keeping food cold—and writing software is always going to be a sideline.
10%
the company called the initial report of the security vulnerability—published without details of the attack—“false and misleading.” That might be okay for computer games or word processors, but it is dangerous for cars, medical devices, and airplanes—devices that can kill people if bugs are exploited. But should the researchers have published the details anyway? No one knows what responsible disclosure looks like in this new environment.
10%
Because of the DMCA, it’s against the law to reverse engineer, locate, and publish vulnerabilities in software systems that protect copyright. Since software can be copyrighted, manufacturers have repeatedly used this law to harass and muzzle security researchers who might embarrass them. One of the first examples of such harassment took place in 2001. The FBI arrested Dmitry Sklyarov at the DefCon hackers conference for giving a presentation describing how to bypass the encryption code in Adobe Acrobat that was designed to prevent people from copying electronic books. Also in 2001, HP used …
10%
Microsoft might use agile development processes internally, but its releases are definitely old-school.
10%
we need both the long-term stability of the waterfall paradigm and the reactive capability of the agile paradigm.
12%
people from my grandparents’ generation who never got used to house keys. They would always keep their doors unlocked, and they resented the inconvenience of having to lock their doors: they had to remember, they had to always carry a key with them, their friends couldn’t get in without a key, and on and on and on. For me, it’s an inconvenience that I have been used to all my life. Sure, I’ve locked myself out of my home and had to call my wife to help me out, or pay for the occasional locksmith. But for me, it’s a small inconvenience for the trade-off of a more burglar-resistant home. Seat …
12%
Updates need to be authenticated, to prevent attackers from tricking you into installing a malicious update. This was one of the techniques that the computer worm Stuxnet used. For years, though, hackers have been using valid signing authorities to create valid authentication signatures for bad updates. Many of the supply-chain vulnerabilities I’ll talk about in Chapter 5 are the result of faulty authentication.
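For context on what authenticating an update involves: the vendor signs the update image with a private key, the device ships with the matching public key, and the device refuses to install anything whose signature does not verify. The sketch below is a simplified illustration using the third-party cryptography package; real update systems also handle key rotation, revocation, and rollback protection.

    # Sketch of an authenticated update check with an Ed25519 signature.
    # Requires the third-party "cryptography" package.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Vendor side: sign the update image with the vendor's private key.
    vendor_key = Ed25519PrivateKey.generate()
    update_image = b"firmware v2.1 image bytes..."
    signature = vendor_key.sign(update_image)

    # Device side: the matching public key ships baked into the device.
    trusted_vendor_key = vendor_key.public_key()

    def install_update(image: bytes, sig: bytes) -> bool:
        """Install the image only if its signature verifies against the trusted key."""
        try:
            trusted_vendor_key.verify(sig, image)
        except InvalidSignature:
            return False  # reject: tampered with, or not signed by the vendor
        # ... write the verified image to flash here ...
        return True

    assert install_update(update_image, signature)             # genuine update installs
    assert not install_update(update_image + b"!", signature)  # tampered update is rejected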
12%
If I can impersonate you to your devices, I can take advantage of you. This is the identity theft of the future, and it’s scary. If I can feed your devices faulty information, I can manipulate those devices in a harmful way. If I can fool your devices into thinking I’m more trusted than I am, I can give commands in your name. We don’t fully understand the consequences of these attacks, because we don’t fully understand the scope of the systems.
13%
NSA has eliminated anonymity through massive surveillance. If you can watch everything, you’re better able to piece together disparate clues and figure out what’s going on and who’s who. You can probably even do that automatically. That’s what countries like China and Russia are trying to do with the wide-reaching surveillance of the Internet in their countries.
13%
Unless attribution is followed by an effective response, it makes a country look weak, and it often makes sense for a nation not to publicly attribute a cyberattack unless it can respond.
13%
“sources and methods” Joyce was referring to—which is also classified. This means that the US government often can’t explain why it attributes an attack to a particular country or group, which means there is no way to independently verify its attribution. This, if you’re someone who tends to distrust the government, is bad. And while it’s obvious that the NSA needs to keep its sources and methods secret, government officials will need to expose them if they expect the general public to believe their attribution claims and support any retaliatory actions they’re going to take.
16%
It isn’t that the NSA woke up one morning and said: “Let’s spy on everyone.” It said: “Corporate America is spying on everyone. Let’s get ourselves a copy.” And it does—through bribery, coercion, threats, legal compulsion, and outright theft—collecting cell phone location data, Internet cookies, e-mails and text messages, log-in credentials, and so on. Other countries operate in a similar fashion.