Mindf*ck: Cambridge Analytica and the Plot to Break America
Kindle Notes & Highlights
When forming his new government, Johnson appointed Dom Cummings, his former colleague from Vote Leave, to become one of his new senior advisers in 10 Downing Street. It did not seem to matter that Cummings was the director of a campaign that cheated during the very referendum Johnson was now using as the “democratic” basis for leaving the European Union at almost any cost.
The Federal Trade Commission levied a record $5 billion civil penalty against Facebook, and the same day the Securities and Exchange Commission issued notice of an additional $100 million fine. The regulators found that not only did Facebook fail to protect users’ privacy, but the company also misled the public and journalists by issuing false statements that it had seen no evidence of wrongdoing when in fact it had.
The news actually increased Facebook’s share value by 3.6 percent, with the market tacitly recognizing that even the law cannot stop the growth of these technology giants.
It took the leadership of dedicated women, immigrants, and queers to ignite a public awakening about the discreet colonizing power of Silicon Valley and the digital technologies it has created to surround us.
In coming out, we realize the power of speaking our truth to those who may not want to hear it. We reject their comfort and make them listen.
The closet is a container whose boundaries are imposed by others who want to control how you behave and present yourself. The closet is invisible, and it is placed upon you by default, never by choice, for others to create a more palatable version of who you are—for their benefit, not yours. Growing up in a closet means incrementally learning how to pass in society—which movements, tones, expressions, perspectives, or uttered desires transgress the norms of those social boundaries imposed upon you.
The closet is a place of acquiescing to society in exchange for passing, but it is also a place where rage builds as those boundaries and definitions slowly suffocate you until you cannot bear to remain inside that prison.
In harvesting and processing your data self, algorithms make decisions on how to define you, how to classify you, what you should notice, and who should notice you. But there is a fine line between an algorithm defining you in order to represent who you really are and an algorithm defining you to create a self-fulfilling prophecy of who it thinks you should become.
Regulation works because we trust people who know better than we do to investigate industries and innovations as the guardians of public safety.
This consent-washing has continually allowed large tech platforms to defend their manipulative practices through the disingenuous language of “consumer choice.”
We do not let people “opt in” to buildings that have faulty wiring or lack fire exits. That would be unsafe—and no terms and conditions pasted on a door would let any architect get away with building dangerous spaces.
…consent should not be the sole basis of a platform’s ability to operate a feature that engages the fundamental rights of users.
…ban dark patterns: common design patterns that deliberately confuse, deceive, or manipulate users into agreeing to a feature or behaving in a certain way. Agency by design would also require proportionality of effects, wherein the effect of a technology on the user is proportional to its purpose and benefit to the user. In other words, there would be a prohibition on undue influence in platform design where the effects on users are enduring and disproportionate, as with addictive designs or consequential mental health harms.
For society to function, we must be able to trust that our doctors and lawyers will always act in our interests, and that the bridges and buildings we use every day have been constructed to code and with competence. In these regulated professions, unethical behavior can bring dire consequences for those who transgress boundaries set by the profession, ranging from fines and public shaming to temporary suspension or even permanent bans for the most egregious offenders.
…design later is found to run afoul of a regulation, the company can absorb liability and pay fines, and there are no professional consequences for the engineers who built the technology, as there would be in the case of a doctor or lawyer who commits a serious breach of professional ethics. This is a perverse incentive that does not exist in other professions. If an employer asked a lawyer or nurse to act unethically, they would be obligated to refuse or face losing their professional license. In other words, they have skin in the game to challenge their employer.
Out of all the possible types of regulation, a statutory code for software engineers is probably what would prevent the most harm, as it would force the builders themselves to consider their work before anything is released to the public and not shirk moral responsibility by simply following orders.
…as software engineers, we should all aspire to earn the public’s trust in our work as we build the new architectures of our societies.
The unrestricted power of these Internet utilities to affect our public discourse, social cohesion, and mental health, whether intentionally or through incompetence and neglect, must also be subject to public accountability. A new digital regulatory agency should be established to enforce this framework with statutory sanctioning powers. In particular, it should include technically competent ombudsmen empowered to conduct proactive technical audits of platforms on behalf of the public. We should also use market-based reinforcement mechanisms, such…
Platforms such as Facebook have vigorously argued that they are a “free” service, and that if consumers do not have to pay for the service, the platform cannot be complicit in anticompetitive practice. However, this argument requires one to accept that the exchange of personal data for use of a platform is not an exchange of value, when it plainly is. There are entire marketplaces that valuate, sell, and license personal data.