Where are the ethicists?

For much of my time in privacy, I’ve heard about the coming next phase of “data ethics.” Privacy, I heard many people say, is about more than just doing what the law says you must do. Rather, privacy is about trust. Privacy is about going beyond compliance. Privacy is about doing what’s “right,” even if the law hasn’t caught up with technology yet.

You might remember this big conference the world’s privacy enforcers held last year, the International Conference of Data Protection and Privacy Commissioners (the ICDPPC, for those in the know - actually, no one can ever get all of those letters in the correct order on the first try when speaking, but whatever). The world’s biggest tech firms all lined up to agree that privacy is a human right, that they wanted a federal privacy law in the U.S., and that they truly cared about their customers’ privacy.

DPAs seemed mollified.

CNIL President and former Article 29 Working Party Chair Isabelle Falque-Pierrotin noted in her address that she welcomes “the commitments to data protection and ethics” made by Apple, Facebook, Google, and Microsoft. “I can only be pleased to hear the GDPR has been a major step for these companies in advancing toward more privacy, using it as a model.”

And yet we see, time and again, that all of that ethics talk just about always falls by the wayside in the face of profit to be made.

Today we’ve got a piece from NBC and others on Ever, a photo storage app that just happens to use the photos people upload to train its facial recognition AI and, you guessed it, profit. You could also probably guess that what they’re doing isn’t illegal in the United States. When asked about the company’s practices, whereby they encourage people to “make memories” and use facial recognition to group photos more easily, but then use that AI training to sell “Ever AI” to private companies for use in a variety of settings, the CEO had this to say:

Ever AI does not share the photos or any identifying information about users with its facial recognition customers.

Rather, the billions of images are used to instruct an algorithm how to identify faces. Every time Ever users enable facial recognition on their photos to group together images of the same people, Ever’s facial recognition technology learns from the matches and trains itself. That knowledge, in turn, powers the company’s commercial facial recognition products.

Oh, I get it. You don’t let your “commercial” customers peek at people’s photos. You just use some “non-commercial” customers as an unpaid workforce and then profit off all the personal data they’re providing you for free and trumpet it as “one of the largest, most diverse, proprietary tagged datasets in the world.”

Were they transparent about this practice, so that people could think about whether they wanted to be Ever’s unwitting AI-training drones? I mean, maybe they’d be fine with it. People will do lots of things for free stuff (though it’s different if they’re using premium services). Well, they put it in their privacy policy! Here’s what they say in their “collect and use” section (why they capitalize “Files” and “Service” is unclear to me):

To organize your Files and to enable you to share them with the right people, Ever uses facial recognition technologies as part of the Service. Your Files may be used to help improve and train our products and these technologies. Some of these technologies may be used in our separate products and services for enterprise customers, including our enterprise face recognition offerings, but your Files and your personal information will not be. Your Files, and any personal information contained in them, are not shared with or provided to third parties. 

Say you’re an average user of Ever. Let’s parse how you might interpret those sentences:

1 - Okay, you use facial recognition tech to organize my files by grouping people. Cool.

2 - Okay, my files “may” be used to improve and “train” products (like a dog?). I guess fine. What do they mean “may”? Like, they only use some files? Whatever.

3 - So, these “technologies” - like, the facial tech? - will be used in other products for “enterprise customers.” What are those? Like the Starship Enterprise? I guess just big customers or something. What big customers? Like people who upload a lot of photos? And the technologies will be used but not my personal information or files? Okay I guess.

4 - Alright, good, my files aren’t being shared. I don’t want randos looking at my photos. That’s all good. Moving on!

Does the lawyer who wrote that convoluted paragraph with the weird capitalization honestly believe they have transparently informed the customer what’s going on? Why didn’t they just write, “Every time you make use of our facial recognition technology it gets slightly better, and that allows us to profit off that technology by selling it to private companies who want to be able to use it for a variety of purposes, like identifying shoplifters when they come into their stores”?

Because maybe people would then say, “hey, that’s a little sketchy”?

These are the ethical decisions tech companies are supposedly making as part of this new frontier. Are they really asking themselves, “what’s the right thing to do here?” No, they’re not. Almost never. And they’re pretty transparent about not caring about what’s ethical.

Why did Ever get into the facial recognition business?

Aley, who joined Ever in 2016, said in a phone interview that the company decided to explore facial recognition about two-and-a-half years ago when he and other company leaders realized that a free photo app with some small paid premium features “wasn’t going to be a venture-scale business.” The shift to facial recognition boosted Ever financially: After it announced its new focus, the company raised $16 million at the end of 2017 — over half of its total investment to date.

Aley said that having such a large “corpus” of over 13 billion images was incredibly valuable in developing a facial recognition system.

When asked if the company could do a better job of explaining to Ever users that the app’s technology powers Ever AI, Aley said no.

“I think our privacy policy and terms of service are very clear and well articulated,” he added. “They don’t use any legalese.”

I mean, capitalizing random words for no reason and using terms like “enterprise customers,” which don’t make sense to people who don’t traffic in business-talk, is the very definition of “legalese.” Part of the problem, obviously, is that business heads get so far into their own little worlds that they think everyone talks about “enterprise customers” on the daily, but this is a cultural thing, first and foremost.

Silicon Valley, and the tech boom associated with it around the world in all sorts of little “tech incubators,” has trained all of its denizens to see data as an asset (it’s the new oil!) rather than as an extension of actual people with actual humanity.

Thus, Amazon is selling flawed facial recognition tech to law enforcement and Google is dissolving its ethics review board, etc. Sure, we’ve seen glimmers of ethical concern, like the group of researchers who asked Amazon to stop selling its facial recognition product, but it’s always the line-level employees who do the ethical work. It never seems to be leadership.

Where are the tech leaders deploring stunts like Ever’s, pointing out that profiting off of their customers’ free labor and personal data while purportedly running a photo-storage company is ethically dubious? Where are the lines in the sand that organizations create for themselves and rally support for?

Until I see a lot more evidence to the contrary, I say all of this ethics talk is nonsense. Other than a very few outliers, who make privacy their very business plan, “ethics” just means “doing what the law says we have to.” Not much more.

Sam Pfeifle