You Don’t Care About Your Friends’ Data, And 4 Other Things We Learned From Privacy Experts


The things we buy and use every day are increasingly connected — to the internet, and to each other — and while this new level of interconnection provides a slew of benefits, it also raises a new set of privacy problems and security challenges. Yet, as we recently learned, consumers are often self-centered when it comes to protecting their data and don’t give much thought to making their friends’ info available.

This is one of the many things we learned at last week’s PrivacyCon in D.C., hosted by the Federal Trade Commission.

The event brought together dozens of researchers and hundreds of internet and privacy professionals from around the world to talk about one of the biggest problems facing consumers at this point in the 21st century: Our data, and who can or can’t access it.

Here are just a few of the key takeaways from the event:

1. You’re kind of selfish when it comes to privacy, but so is almost everyone else.

If you’ve ever looked at the information your apps and devices have access to, you’ve probably noticed that more than a few of them don’t just know a lot about you: you’ve also given them permission to access the information you keep on your friends and contacts.

It turns out that many of us are perfectly happy to let an app do exactly that, because we don’t really value our friends’ personal data.

Researchers Jens Grossklags and Yu Pu gave a presentation they called “Your data, my decision,” covering their research [PDF] into how users value data they can access but don’t own — like their friends’ phone numbers and birthdays.

The research set out to answer the question: “Are we good stewards of others’ data?” The answer, in short, is no: the average user is a “private egoist” who places basically no monetary value on their friends’ private data.

There is hope, though. “When individuals have higher privacy knowledge, they are more likely to place higher values on friends’ data,” the researchers found.

And the more you care about and empathize with others, the more likely you are to acknowledge concerns about their privacy.

2. You don’t need a person’s actual data to figure out what they are up to.

Increasingly, data from your connected devices — fitness trackers, sleep monitors, home assistants, whatever — is encrypted.

That’s the good news. Here’s the bad: someone who really wants to snoop can still figure out which devices you’re running, when they’re active, and often what you’re up to, even if the traffic itself is encrypted.

Researchers Noah Apthorpe and Dillon Reisman from Princeton presented their paper [PDF] analyzing internet traffic patterns. In the course of their research, they figured out that anyone snooping on your WiFi — which might be a nosy neighbor, a criminal, or your home internet provider — can figure out what devices you have in your home, and when they ping the internet.

That means anyone could tell when you ask your Amazon Echo a question, or when your sleep monitor sends data back to its mothership — which means that anyone looking at it could also determine when you are home, when you are asleep, and other personal information you probably wouldn’t feel comfortable sharing with the whole world.
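To make the idea concrete, here is a minimal sketch (ours, not the researchers’ code) of how an eavesdropper could flag device activity using nothing but traffic volume and timing. The capture format and the spike threshold are assumptions for illustration only:

# Illustrative sketch: inferring device activity from encrypted traffic metadata.
# This is NOT the researchers' pipeline; the input format (timestamp, device, bytes)
# and the threshold below are assumptions for demonstration only.
from collections import defaultdict

WINDOW_SECONDS = 60          # bucket traffic into one-minute windows
SPIKE_MULTIPLIER = 3.0       # flag windows well above a device's typical volume

def infer_activity(packets):
    """packets: iterable of (timestamp, device_name, byte_count) tuples,
    e.g. grouped by MAC address from a Wi-Fi capture. Payloads are never
    read; only volume and timing are used."""
    buckets = defaultdict(lambda: defaultdict(int))
    for ts, device, nbytes in packets:
        buckets[device][int(ts // WINDOW_SECONDS)] += nbytes

    events = []
    for device, windows in buckets.items():
        baseline = sum(windows.values()) / len(windows)   # average bytes per window
        for window, nbytes in sorted(windows.items()):
            if nbytes > baseline * SPIKE_MULTIPLIER:
                # A traffic spike from a smart speaker or sleep monitor hints at a
                # real-world event: a question asked, a sync, someone waking up.
                events.append((window * WINDOW_SECONDS, device, nbytes))
    return events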

Likewise, Aleksandra Korolova presented research from the University of Southern California [PDF] highlighting the fact that the rapid spread of Bluetooth devices, which send and receive data within a 100-meter radius, leaves gaping security holes for most users.

A huge number of those devices, and the apps they can connect to, turned out to be vulnerable, and able to connect to any nearby app that asked for permission.

In a field trial using 70 volunteers, researchers were able to pinpoint 87% of the users’ IDs through devices that only tried to connect once every five hours.
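And the bar for observing nearby Bluetooth devices is low. As a rough illustration (again ours, not the researchers’ tooling), the few lines of Python below use the third-party bleak library to listen for Bluetooth LE advertisements; the library choice and scan length are assumptions, but nearly any app granted Bluetooth access could do something similar:

# Illustrative sketch: how easily nearby code can enumerate Bluetooth LE devices.
# Uses the third-party "bleak" library as an example scanner; this is not the
# researchers' tooling, and the scan duration is an arbitrary choice.
import asyncio
from bleak import BleakScanner

async def snapshot_nearby_devices(seconds: float = 10.0):
    # Listen passively for advertisement packets for a few seconds.
    devices = await BleakScanner.discover(timeout=seconds)
    for d in devices:
        # A stable address or a descriptive device name is often enough
        # to start building a picture of who is nearby, and when.
        print(d.address, d.name)

if __name__ == "__main__":
    asyncio.run(snapshot_nearby_devices())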

In short: your monitors are sharing more than you want them to. And for most users, “stop having internet things” isn’t a viable long-term solution.

3. Your sense of how creepy something is matters, even if reality disagrees.

A huge part of privacy is “what people do,” and a large part of that is “how people feel.” So behavioral researchers are starting to study how people actually think and act in order to understand how to improve privacy standards for everyone.

To that end, researcher Chandra Phelan, from the University of Michigan, presented research [PDF] on the theme of creepiness.

Phelan’s team found that overwhelmingly, users were able to feel something was creepy — but were rarely able to articulate why in any clear, analytical, logical fashion.

Sometimes the situations subjects were presented with really did pose a privacy problem; sometimes they didn’t. Either way, a feeling that something wasn’t right tended to linger, even after subjects thought it through logically.

In other words: in a war between your heart and your head over privacy, your heart might not win, but it’s sure as heck going to influence what your head decides.

That research dovetailed with findings from Yang Wang and a research team based out of Syracuse and Rome, who found [PDF] that “folk models” and conventional wisdom hold a huge amount of sway over users’ perceptions of online behavioral advertising.

We all kind of “know” how ads follow us around the web — but if you ask two dozen people how exactly that works, you’ll get two dozen different answers, Wang’s team found. The answers fall loosely into four broad categories of folk model, some of which are a lot closer to reality than others.

After having the underlying systems explained to them, users were better able to diagram how the services they interact with every day actually work. Until then, though, the things you just kind of “hear” from friends or around the internet tend, in large part, to be the things you end up believing.

4. Very few developers are actually thinking about, or centering, the needs of real people in their work.

Alethea Lange, from the Center for Democracy and Technology, presented a paper [PDF], talking not only about how algorithmic, personalized targeting on the web works, but specifically about how users feel about it.

The research and presentation were titled, “A User-Centered Perspective on Algorithmic Personalization,” and reviewed how consumers feel about ads that make assumptions about them based on characteristics inferred from their online behavior.

Largely, users hated feeling they were being targeted by gender or race, but were more open to being targeted based on presumed location and income level. Across all data types, though, users disliked being targeted based on inaccurate assumptions.

But her response to later questioning from the audience circled back to putting the user-centered perspective into design.

“Think about how consumers think,” Lange said. “Think about how people feel. It’s not that hard — talk to people.”

The room, full of human people, laughed before Lange continued. “A white lady age 35-45 who lives in a major city,” she said, gesturing at herself, “isn’t how people introduce themselves.” Being reduced to their component atoms, as it were, makes people uncomfortable.

“Something can be both relevant and offensive,” Lange stressed, pointing out that most people don’t want to see tombstone ads while looking for will-writing services.

Even if your ad is relevant, “that doesn’t mean it’s a good thing.” And businesses that want to hyper-personalize advertising need to start taking that human dimension into account.

5. It’s time for product ratings to take security and privacy into account.

Our colleague down the hall at Consumer Reports, Maria Rerecich, took the podium to tease a new initiative from the organization.

If products are increasingly not just hardware items but also software driven, CR reasons, then it’s time to start letting consumers know just how secure, safe, and updated those devices are or aren’t — otherwise, how can buyers make an informed choice?

The organization is starting from a consumer-centric viewpoint, Rerecich said, giving the example of a room. Most users aren’t going to say that a room should be 68 to 70 degrees Fahrenheit with a specific humidity level, she said; instead, they are going to say, “I want this room to be comfortable.” It’s then up to testers to break the meaning of “comfort” down into its elements — temperature and humidity among them — and test for their presence, absence, and strength.

CR has published a draft version [PDF] of the new testing protocol, showing the kinds of questions and categories the organization will be addressing when it comes to connected devices. But they’re not doing it alone: during her presentation, Rerecich called for members of the privacy community with expert knowledge or significant thoughts to reach out while the standard is still in development.

CR wants “to leverage the diverse expertise of this group and to start the process of putting together a digital standard,” Rerecich told the room. The organization plans to launch the new initiative in the first half of 2017.
