Privacy is paramount, but as we move closer to an IoT future, the lines between privacy and integration are blurring.
Despite its page on data protection, Google comes under fire year after year for privacy breaches. Most recently, the issue involved emails and information housed in various Google Drive apps.
Is loss of privacy and control the price we pay for interconnectivity?
The Devil is in the Terms of Service Details
When accessing a new app or service, most of us just click and go when we come to the “Terms of Service” section. Many tech users are familiar with the basics of ToS agreements, so we figure most are standard. As the old adage goes, “The devil is in the details.”
With Google specifically, the ToS allows for many activities some would consider breaches of privacy. In 2014, Google claimed a right to even sensitive data if it flowed over an open network. But that’s not all.
Consider the scanning of emails to personalize the ads you receive, a practice Google did, to be fair, stop this summer. Now-scrapped AI plans presented major privacy concerns, too. Google can track your online spending activities, as well as your real-world purchases. Most recently, Google mistakenly locked out Google Docs users for supposedly “violating ToS.”
How much control do you have over the information you put into a Google product, and are you tacitly giving Google the right to use that information? After all, is a Google Doc any different from a tweet or an Instagram post?
Protecting Privacy in an Interconnected World
Data protection and privacy in an Internet of Things environment seem like competing objectives. The lines between what we share on Instagram and Twitter vs. a private email to someone are clear to us. But to companies and computer algorithms, data may just be data, personal or otherwise.
This dense but worthwhile article from the National Institute of Standards and Technology blog highlights why security and privacy both matter in an IoT future. Breaches of user information, such as Equifax’s earlier this year, are obviously problematic. More insidious was Uber’s decision to pay a group of hackers not to reveal that they had accessed sensitive information on 57 million people.
There are real economic ramifications for insecure data and ambivalence toward privacy. And beyond economic, tech, and data-driven concerns, a lack of privacy has measurable psychological effects on people, too.
The Hidden Mental and Emotional Costs of Digital Privacy Vacuums
Shame is a powerful tool. It can cost high-profile actors their roles in a popular Netflix series. It can cause women, or any victim, to stay silent for decades about egregious abuses of power. And shame is such a powerful tool that it has evolved into a weapon wielded by abusers, vigilantes, and do-gooders alike.
Shame conditioning extends to illicit activities as a way to deter damaging or potentially dangerous behavior. But it also reaches activities some people would simply prefer to keep private, such as plucking their eyebrows or their browser history.
Glenn Greenwald’s TED Talk on why privacy matters tackles this subject head-on. He starts out by saying that proponents of mass surveillance argue that “only people engaged in bad acts have a reason to want to hide and to care about their privacy.” He elaborates that this binary worldview robs people of the freedom to be themselves.
Greenwald takes aim at Google CEO Eric Schmidt when he references a 2009 interview in which Schmidt said something unsettling.
“If you’re doing something that you don’t want people to know [about], maybe you shouldn’t be doing it in the first place.” Greenwald further pokes at Schmidt, referencing this CNET article to highlight the inherent problems with that mentality toward privacy.
Social media can be curated, but some use it as a diary, or simply a tool to connect with others. It can also be a safe space where we workshop our feelings and beliefs. At what point can a status posted only to your Facebook friends be shared publicly because of its content?
Answer Me These Questions Three
In another TED Talk, Stuart Lacey examines privacy in the context of monetization. He references location tracking, pulling up a completely accurate map and timeline of a day he spent in New York City, along with photos he took. He posits three questions:
- One: Are you being robbed?
- Two: Are you missing out?
- Three: Are you being paid?
To put things in context, Samsung shipped TVs with always-active webcams, the rationale being that the camera needed to be ready whenever you wanted to use it. The company prioritized monetary gain at the expense of user privacy.
When companies are rightfully critiqued for this, their answer is often a convoluted explanation, such as the iTunes ToS as displayed in this nifty webcomic. It all amounts to companies not treating user privacy as an important issue.
What Can We Do to Protect Our Privacy?
For Google specifically, you can turn off search and location history. You can also control how Google personalizes the ads it shows you. Browser extensions like HTTPS Everywhere can help bolster your browsing security.
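To give a sense of what HTTPS Everywhere automates, its core behavior is to rewrite plain-HTTP URLs to HTTPS before a request leaves your browser, so your traffic is encrypted whenever the site supports it. Here is a minimal sketch of that idea in Python; the `UPGRADABLE_HOSTS` allow-list is a hypothetical stand-in for the extension’s much richer ruleset, not its actual logic:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical allow-list standing in for HTTPS Everywhere's ruleset:
# only upgrade hosts known to serve HTTPS correctly.
UPGRADABLE_HOSTS = {"example.com", "www.example.com"}

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https:// when the host is on the allow-list."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in UPGRADABLE_HOSTS:
        # Keep everything (host, path, query, fragment); swap only the scheme.
        return urlunsplit(("https",) + tuple(parts)[1:])
    return url

print(upgrade_to_https("http://example.com/search?q=privacy"))
# -> https://example.com/search?q=privacy
print(upgrade_to_https("http://unknown-site.org/"))
# -> http://unknown-site.org/ (left alone: not on the allow-list)
```

The real extension consults per-site rules and handles edge cases like mixed content, but the principle is the same: the upgrade happens client-side, before any unencrypted request goes out.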
But beyond VPNs, proxies, and ad hoc solutions, we as consumers cannot solve the issue of privacy in the digital age alone. Companies like Google, Facebook, and Equifax must enact their own security measures to protect our data.
You can email, fax, and call your local government officials, too. Petitioning for greater corporate accountability to users is squarely within the purview of lawmakers and Congresspeople. On a related note, the war for Net Neutrality rages on. Though the FCC voted to gut it, Congress and the courts could still pull through.
Without it, user privacy may be in even more danger.