A few of the main ideas. To start with, code is law. Just as Marshall McLuhan postulated that the medium is the message and Harold Innis showed the bias of communication, we must understand that the instructions encoded in software regulate what we can do.

Second, a recent change is the movement away from searching the web toward a push-notification environment where "information is delivered to us" through apps.

Third, while in the beginning the internet seemed like a free place, hard to regulate, today many countries censor it and block Twitter, Facebook, YouTube, and the like. Internet censorship has shifted from being regulated, like most things, through law to being regulated through code and software, with responsibility placed directly on the service providers. China, for example, has a particular way of doing this: it sends the user an error message, as if the content itself didn't exist (Google found a way around this by suggesting alternate spellings to users). We must begin to understand and connect the dots, as users and as citizens: the internet is international, but its cables are everywhere, its central nodes are everywhere (though concentrated around the US), and the devices we use come from specific nations, bending to specific national laws. From a lawless place, it has become a place of many, many laws.

Fourth, the future is at least partly out of the West's hands. The growing populations of the rest of the world will gain access to the net while living amid increasing inequality driven by climate change and capitalism's mechanisms, so the question Deibert asks is: what kind of web will they craft? As the author shows, in some countries governments outsource to extra-legal intervention groups to deal with unruly citizens.
Coming back to corporations: Google has started issuing transparency reports showing the number of requests it receives from governments to censor or remove content, and highlighting those it complied with or turned down (most are "other requests", not issued through a court order). Most companies do not tell users when the government asks for their data. In 2002 and 2004, the Chinese government requested information on two dissidents from Yahoo!, which complied; when sued by the families in the US, the company testified that it had been following local law. Skype, likewise, uses content filtering for China and can be intercepted, although it promises end-to-end encryption. After 9/11, a key point in the cybersurveillance debate, governments felt entitled to ever more of citizens' information, creating the false tradeoff of privacy versus security. Human Rights Watch found that the UN passed several resolutions urging member states to pass laws that expand government powers to "investigate, arrest, detain, and prosecute individuals at the expense of due process". With enough data, a Minority Report future is no longer just dystopian fiction: politically inclined individuals can be monitored before they do anything. Researcher Chris Soghoian pointed out that some companies even charge fees for "lawful access", handled through automated processes.
Cybercrime is real, and like most crime its structure is knotted into complicated patterns and networks. Many "cyberweapons" (spying software, malware for breaking in, or simply hiring a black hat to hack someone) are cheap and easy to buy on the internet, and, as Deibert puts it, how can the West condemn the Syrian Electronic Army when it openly markets computer network attack products at trade shows? Moreover, when cyberweapons are perceived as clean, there may be "strong pressures to adopt military over diplomatic solutions". Technology is multi-purpose: the same tools are used to surveil dangerous targets and peace activists alike.
Hacking used to carry a more positive value, one "of experimentation and exploration of limits and possibilities". Technology can be seen not as a thing but as a craft, inherently political. In the context of our constant connectedness, the increasing restrictions on cyberspace "are alarming". The closing off of hardware and software, and the use of copyright and other laws to diminish access to them, are barriers not only to our freedoms but ultimately to our security as well. The Electronic Frontier Foundation has identified laws, some still under debate (such as Article 3 of DAAIS in Europe), that limit the publishing of research on security flaws. The denial of access to knowledge is increasing, together with restrictions on the tools to dismantle it. One solution could take the form of a distributed model: a mixture of multiple actors with governance roles, a division of control built on cooperation and consent, and restraint. Without humans, "cyberspace would not exist". Deibert argues for a position of joint custodianship: we either degrade cyberspace or we extend it. The responsibility is intergenerational.