1984 was George Orwell’s chilling prophecy about the future, and the narrative is timelier today than ever. In Orwell’s 1984, a device known as the “memory hole” was a slot into which government officials deposited politically inconvenient documents and records to be destroyed in a huge hidden furnace. (from Wikipedia, paraphrased)
In the world Orwell imagined, the present was what had always been — and there were altered documents to prove it. The originals had vanished, leaving only individuals’ fallible memories to say otherwise.
The earliest prototypes of a new kind of “disappearance” are already being tested. We are closer than we might imagine to a dystopian reality once described only in futuristic novels like 1984. Welcome to the memory hole.
Most of us get at least some of our news, books, TV, and all sorts of other communications electronically. Governments in some countries are already implementing filtering mechanisms that block access to “unapproved” sites and online material. And in the U.S., the government attempts (as do many employers) to block its employees from viewing certain material or websites on their work computers. But not at home. Yet.
The UK, however, will soon take a big step toward deciding what private citizens can see on the web even in their own homes. Last summer, British Prime Minister David Cameron unveiled planned internet filtering that will block pornographic sites unless users specifically opt in by telling their ISP that they WANT access to porn. Imagine who might be interested in the names of users who say they want that access.
Internet users are opted IN to the blocking of certain websites by default. The controls will also block access to violent material, extremist and terrorist-related content, anorexia and eating-disorder websites, and suicide-related sites. The new settings will even censor sites mentioning alcohol or smoking, and block “esoteric” material. (The British government has not clarified what that category will include.)
Now consider Google, which finds new material on the web quickly, often adding it to worldwide search results within seconds. Since most people rarely scroll past the first few results displayed, being “disappeared” already has a new online meaning. Getting Google’s algorithms to place what you post high enough on the results page even to be noticed is what matters now. It would be oh-so-easy to manipulate the rankings and relegate important (inconvenient?) information so far down that it would never be noticed. Think of that as a starting point for the more significant forms of “disappearance” we may find in the future.
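To see how little machinery such burying would require, here is a minimal toy sketch. Everything in it — the term list, the penalty value, the scoring scheme — is hypothetical and reflects no real search engine’s internals; it only shows how a quiet penalty in a ranking function pushes a result off the top of the page.

```python
# Toy ranking function with a hidden demotion list.
# All names and values here are hypothetical illustrations.

SENSITIVE_TERMS = {"whistleblower", "leaked"}  # assumed demotion list

def rank(results):
    """Sort (title, relevance) pairs by score, quietly penalizing sensitive titles."""
    def score(result):
        title, relevance = result
        # A large flat penalty buries the result regardless of its relevance.
        penalty = 0.9 if any(t in title for t in SENSITIVE_TERMS) else 0.0
        return relevance - penalty
    return sorted(results, key=score, reverse=True)

results = [("whistleblower memo", 0.95),
           ("official statement", 0.60),
           ("news roundup", 0.40)]
print(rank(results))
# The most relevant item now sorts last, below everything else.
```

The point of the sketch is that nothing visible changes for the user: the result still “exists,” it is simply never seen.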
Hiding something from users by reprogramming search engines is one step. Another is actually deleting content, which requires only turning the code behind the search process into something predatory. And if Google refuses to implement such “negative searches,” the NSA, which already appears able to reach inside Google’s networks, can implant its own malicious code, as it has reportedly already done in at least 50,000 instances.
Now Google has introduced software that makes it more difficult for its users to locate child abuse material.
As company head Eric Schmidt put it, Google Search has been “fine-tuned” to clean up results for more than 100,000 terms used by pedophiles to look for child pornography. Now, for instance, when users type in queries that may be related to child sexual abuse, they will find no results that link to illegal content. Instead, Google will redirect them to help and counseling sites.
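Mechanically, that kind of redirection is trivial to build, which is part of what makes it worth pausing over. A minimal sketch follows; the blocked-term list and the help URL are stand-in placeholders, not Google’s actual terms or destinations.

```python
# Hypothetical sketch of query interception: queries matching a blocked-term
# list get a redirect to a help page instead of search results.

BLOCKED_TERMS = {"blocked-term-a", "blocked-term-b"}  # stand-in for the real list
HELP_URL = "https://example.org/get-help"             # hypothetical counseling page

def handle_query(query):
    """Return an (action, payload) pair: either search results or a redirect."""
    words = set(query.lower().split())
    if words & BLOCKED_TERMS:
        return ("redirect", HELP_URL)
    return ("search", f"results for: {query}")

print(handle_query("blocked-term-a pictures"))  # → ('redirect', 'https://example.org/get-help')
print(handle_query("cute puppies"))             # → ('search', 'results for: cute puppies')
```

Note that the same dozen lines work for any term list at all; nothing in the mechanism cares whether the blocked topic is abuse material or inconvenient news.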
I doubt that any of us favor facilitating child abusers, but think for a moment about the implications of Google (or any company, or our government) simply deciding to redirect us to help or counseling, or to somewhere else it selects for us, when we look for information on a topic it deems suspicious. To me that is scary, and it seems one more step down the road to 1984.