By Jake Snow
The information that you share with a company should not be repurposed or sold without your consent. But companies are building algorithms from massive repositories of personal information, collected from people who are never told how the company will use it.
A recent report revealed a troubling example of this invasive practice. A Silicon Valley-based company called Ever apparently used billions of private photos it collected from its users to secretly train a face surveillance tool marketed to the military and law enforcement.
Your private photos are yours and should not be the raw materials of surveillance technology.
This is an egregious violation of people’s privacy. When companies collect people’s personal information—like private photos—for one purpose, they should get permission from their users before they contort that data for an entirely different purpose.
So what happened here? Ever describes itself as “a company dedicated to helping you capture and rediscover your life’s memories.” Its website shows a photo album called “Weekend with Grandpa” depicting a young child playing outside. Ever’s “free” and “unlimited” photo storage and sharing app, called Everalbum, offers to make “it easy to share your favorite memories with the people who matter most.” Ever eventually accumulated billions of photos from millions of people on this “private” platform.
According to NBC News, Ever used these photos to create a face surveillance system that it markets to companies that want to “build a complete picture” of their customers, police who want to operate real-time face surveillance on video feeds or body cameras, and even the military. There’s no indication that Ever clearly explained to its users that it would take the faces of people in their private photos to construct such a system.
This is a privacy violation for the artificial intelligence age.
Face surveillance poses a grave threat to civil rights, free speech, and the safety of communities of color, immigrants, and activists. Ever should have asked the people who created—in Ever’s words—“one of the largest, most diverse, proprietary tagged datasets in the world” for permission to build a powerful piece of surveillance technology based on their private personal information.
Owning your personal information means deciding how it’s used. Our favorite devices and services also collect sensitive and private information ripe for misuse. Mobile phones maintain a comprehensive record of our locations that could be sold to a bounty hunter, a marketer, or a hedge fund. Apps, watches, and fitness trackers know intimate details about our bodies. Websites track what we read, what we consider purchasing, and what we eventually buy. Almost everything about us can find its way into a database.
Without legal protections, companies can use our most private information for purposes that people would not expect or approve of. And companies that gather, repurpose, and monetize our personal information will not have—and do not deserve—our trust.
This was true in 1972, when, here in California, voters added a right of privacy to the state constitution to prevent governments and businesses from “stockpiling unnecessary information about us and from misusing information gathered for one purpose in order to serve other purposes.” But modern privacy laws have not kept pace with the new ways companies can misuse technology, even though voters overwhelmingly believe that companies need to do more to protect personal information.
Whether it’s Facebook running psychological experiments on people, a company like Ever building a surveillance tool from billions of private photos, or menstrual-tracking apps being used to monitor employees, companies are exploiting our personal information, and it needs to stop. It’s long past time to update our privacy laws for the 21st century to protect people from misuse of their personal information.