Privacy is often threatened not by a single egregious act but by the slow accretion of a series of relatively minor acts. In this respect, privacy problems resemble certain environmental harms, which occur over time through a series of small acts by different actors. Although society is more likely to respond to a major oil spill, gradual pollution by a multitude of actors often creates worse problems.
Earlier this month the Pentagon announced a new effort to build a system aimed at allowing it to scan billions of communications in order to detect "anomalies" in people's behavior that will predict who is about to snap and turn into a homicidal maniac — or, perhaps, leak damaging documents to a reporter.
Citing the case of Maj. Nidal Hasan, the Army psychiatrist charged with killing 13 people in Fort Hood, Texas, the Pentagon's Defense Advanced Research Projects Agency (DARPA) wants to try to identify, before they happen, "malevolent actions" by insiders within the military. (See coverage by Wired, CNN, or Government Security News.)
The new project is called ADAMS, for Anomaly Detection at Multiple Scales, and anyone who remembers the battles over the Bush Administration's "Total Information Awareness" (TIA) program may be experiencing a major flashback right about now. TIA, also a DARPA project, was based on a vision of pulling together as much information as possible about as many people as possible into an "ultra-large-scale" database, making that information available to government officials, and sorting through it to try to identify terrorists. Eventually shut down by Congress, it was probably the closest thing to a truly comprehensive monitor-everyone "Big Brother" program that has ever been seriously contemplated in the United States. And many of the problems with TIA are equally present with this ADAMS project.
For one thing, the idea is naïve and misguided, and it won't work. Statistical data mining has proved useful in certain limited contexts, such as detecting credit card fraud. But as experts have pointed out, it is not good at predicting highly unusual events, because there is no large body of examples from which it can identify patterns. For some things, there simply are no patterns. As my colleague Mike German, who used to work undercover on anti-terrorism cases for the FBI, often points out, empirical studies show that there is no such thing as a reliable profile that will predict violent behavior. Incidents in which people turn into homicidal maniacs and begin shooting up their offices are extremely rare, and each one has unique origins in the individual psychology, circumstances, and life history of the perpetrator.
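To see why rare-event prediction breaks down, here is a rough base-rate sketch. All of the numbers are illustrative assumptions (a scanned population of three million, ten genuine threats, a system that is 99 percent accurate), not figures from DARPA or the ADAMS program; the point is simply that when the behavior being predicted is vanishingly rare, even a very accurate detector buries its handful of true alarms under a mountain of false ones.

```python
# Illustrative base-rate calculation. All numbers are assumptions made up
# for the sake of argument, not figures from DARPA or the ADAMS program.

population = 3_000_000      # people whose communications are scanned (assumed)
true_threats = 10           # genuinely dangerous insiders (assumed)
sensitivity = 0.99          # probability the system flags a real threat (assumed)
false_positive_rate = 0.01  # probability it flags an innocent person (assumed)

true_alarms = true_threats * sensitivity
false_alarms = (population - true_threats) * false_positive_rate
precision = true_alarms / (true_alarms + false_alarms)

print(f"Expected true alarms:  {true_alarms:,.0f}")
print(f"Expected false alarms: {false_alarms:,.0f}")
print(f"Chance a flagged person is a real threat: {precision:.2%}")
# With these assumptions: roughly 10 true alarms against about 30,000 false
# alarms, so a flagged person has about a 0.03% chance of actually being a threat.
```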
If Google's CEO proclaims that the future of the web holds no anonymity, does that make it a fact? If we keep hearing that privacy is dead and long buried, how long before we accept that anonymity is antisocial behavior, or even a crime?
Security expert Bruce Schneier suggests that we protect our privacy when we are thinking about it, and give it up when we are not.
Schneier wrote, "Here's the problem: The very companies whose CEOs eulogize privacy make their money by controlling vast amounts of their users' information. Whether through targeted advertising, cross-selling or simply convincing their users to spend more time on their site and sign up their friends, more information shared in more ways, more publicly means more profits. This means these companies are motivated to continually ratchet down the privacy of their services, while at the same time pronouncing privacy erosions as inevitable and giving users the illusion of control."
The loss of anonymity will endanger privacy. It is unsettling to think that "governments will demand" an end to anonymous identities. Eric Schmidt may be Google's CEO, but his message that anonymity is a dangerous thing is highly controversial. Google is in the business of mining and monetizing data, so isn't that a conflict of interest? Look how much Google already knows about you.
Bruce Schneier put it eloquently, "If we believe privacy is a social good, something necessary for democracy, liberty and human dignity, then we can't rely on market forces to maintain it."
Facebook is wildly successful because its founder matched new social media technology to a deep Western cultural longing — the adolescent desire for connection to other adolescents in their own private space. There they can be free to design their personal identities without adult supervision. Think digital tree house. Generation Y accepted Facebook as a free gift and proceeded to connect, express, and visualize the embarrassing aspects of their young lives.
Then Gen Y grew up and their culture and needs changed. My senior students started looking for jobs and watched, horrified, as corporations went on their Facebook pages to check them out. What was once a private, gated community of trusted friends became an increasingly open, public commons of curious strangers. Facebook's few original, loose network controls no longer proved sufficient. The Gen Yers wanted better, more precise privacy controls that allowed them to secure their existing private social lives and separate them from their new public working lives.
Facebook's business model, however, demands the opposite. It is trying to transform the private into a public arena it can offer advertisers. In doing this, the company is breaking three cardinal cultural norms:
- It is taking back a free gift. In order to build profits, Facebook has been commercializing and monetizing friendship networks. What Facebook gave to Millennials, it is now trying to take away. Millennials are resisting this invasion of their privacy.
- Facebook is ignoring the aging of the Millennials and the subsequent change in their culture. Older Gen Yers want less sociability and more privacy as actors outside their trusted cohort enter the Facebook space in search of information and connection. These older Millennials want more privacy tools for control of their information and networks.
- Facebook is behaving as though it owned not only its proprietary technology platform but the friendship networks created on it. It doesn't. Millennials believe that ownership of their networks of friends belongs to them, not Facebook, and resist their commercialization.
Facebook, under intense pressure, is belatedly agreeing to streamline and strengthen its privacy tools. That will lower the anger of its audience but increase the anxiety of its advertisers. The brand value of Facebook has already taken a hit and competing social media platforms that promise privacy are beginning to appear.
We’re not living in an Orwellian police state; it’s all just a conspiracy theory. That, however, is not what Pennsylvania’s government is telling its citizens. In what can only be described as a mafia-style intimidation tactic, the state is telling residents that it “knows who you are”. The video shows a satellite image zooming in on an individual’s home while a computerized voice informs him that they know who he is and that he owes $4,212 in back taxes. The voice then tells him that they can make it easy for him if he pays quickly. The ad closes with a threatening message: “FIND US BEFORE WE FIND YOU”.
What is more disturbing than the ad itself is that governments now find it acceptable to announce that we are living in an Orwellian police state and that we are all being monitored. “Pay up, or we will find you. We know where you live. We are watching you.” This commercial is a chilling confirmation that we are living in an Orwellian nightmare.
I actively avoid the "conspiracy theory" rhetoric of this site, but unless this TV commercial is supposed to be humor, it's hard to argue that it's benign.