Data Ethics and Privacy

23 May 2016

Ethics Data Expert

Eleanor Saitta, the new Security Architect at Etsy, was on one of my favorite podcasts this week, Data Stories. In the episode, she talks about the trade-off between open data and privacy that I touched on in a previous post titled Transparency versus Privacy. There were so many gems in this episode that I decided to jot down some of my favorite quotes of Eleanor's, which I've reproduced here.

"In general, my work over that past 13, 14 years now has concentrated on the places where security rises above the machine."

“Not all users are at the same risk…The same tools that create some moderate risk otherwise create some serious risks in specific contexts. Risk isn’t distributed equally.”

“[There is] a common fallacy where [you think that by] reducing the friction around accessing some fact from a dataset, or kind of increasing the veracity of it, because it’s already technically out there, you haven’t taken an act.”

“If you have individuals who are trying to navigate the complex social space of protecting themselves or people they care about, or whatever is the set of things they’re worried about, the calculations they make are made on the basis of they have certain resources, they have certain outcomes, their adversaries have certain resources and outcomes, and it’s all this kind of balancing act that’s very much driven by friction.”

"One of the reasons why journalism is a profession is that they are in the business of thinking about and understanding the structures around these kinds of ethics."

“This is something that happens again and again, that reidentification is really breathtakingly easy.”
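To see why re-identification is so easy, consider a toy "linkage attack" along the lines of Latanya Sweeney's classic finding that ZIP code, birth date, and gender alone identify most Americans. This sketch is my own illustration, not from the episode; all records and names in it are fabricated.

```python
# An "anonymized" dataset: direct identifiers removed, but quasi-identifiers
# (ZIP code, birth year, gender) remain attached to the sensitive field.
anonymized = [
    {"zip": "02138", "birth_year": 1945, "gender": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_year": 1972, "gender": "M", "diagnosis": "asthma"},
    {"zip": "02140", "birth_year": 1988, "gender": "F", "diagnosis": "diabetes"},
]

# A public dataset (think: a voter roll) that still carries names alongside
# the very same quasi-identifiers.
public = [
    {"name": "Alice Smith", "zip": "02138", "birth_year": 1945, "gender": "F"},
    {"name": "Bob Jones",   "zip": "02139", "birth_year": 1972, "gender": "M"},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on quasi-identifiers alone."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["zip"], a["birth_year"], a["gender"]) == (
                p["zip"], p["birth_year"], p["gender"]
            ):
                matches.append({"name": p["name"], "diagnosis": a["diagnosis"]})
    return matches

print(reidentify(anonymized, public))
```

No cryptography is broken and no access controls are bypassed here: a plain join on a handful of innocuous-looking columns ties two of the three "anonymous" records back to named individuals.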

“Even with consent, figuring out what you can pull out of [data] is hard. That’s the next big thing, ensuring that when I give consent, what am I giving consent to? Am I giving consent to my anonymous ride data being used in a way that exposes something about the geography of the city? Maybe. Am I giving consent for my home address to be released once I’m deanonymized? Absolutely not. But the problem is that if we don’t understand what we can pull out of this, we don’t know what we’re consenting to.”

“I think we’re also learning increasingly that there are more and more things that will be used against people. I think we’re in a position right now where public understanding, but also the more general societal understanding, if we can separate those two, really haven’t caught up with what’s possible. For instance, the number of divorce cases where FitBit data is suddenly being used in court. No one who was putting on that FitBit to try and understand, ‘Hey, am I getting enough exercise?’ thought that they were, say, recording their sex life in a manner that was going to be used in their divorce, and yet, it turns out that a large number of people have done exactly that.”

"As people who work with data, we have an onus to educate."

“It would be lovely if we could simply say, ‘Well, here is how you properly anonymize data, and here is the Ethics Book.’ It doesn’t work like that. There are starting to be context specific toolkits for understanding how you should look at using data in very specific cases and how data can flow in very specific cases.”

“There’s always going to be a need for that second check of, ‘Hey wait, did you guys actually think about what is actually going on here? Have you had somebody take a real swing at the way you are anonymizing this data to see if there are any real, obvious breaks?’ To go back and have an outside researcher – and I don’t think this is an ethics board thing, because this is slightly different – say, ‘Hey, are you creating axes of social discrimination in the work that you’re doing? Does this work uniquely expose certain groups to specific harms?’”

“If you’re doing this kind of work, look at what the journalism and the data journalism world is doing because they’re the people who are pretty much on the front line of this. And then, just follow the conversations, because it’s going to be an ongoing conversation for a while.”
