Atlanta Asks Google Whether It Targeted Black Homeless People

Atlanta officials are seeking answers from Google after a news report said that company contractors had sought out black homeless people in the city to scan their faces to improve Google’s facial-recognition technology.

The New York Daily News reported on Wednesday that a staffing agency hired by Google had sent its contractors to numerous American cities to target black people for facial scans. One unnamed former worker told the newspaper that in Atlanta, the effort included finding those who were homeless because they were less likely to speak to the media.

On Friday, Nina Hickson, Atlanta’s city attorney, sent a letter to Google asking for an explanation.

“The possibility that members of our most vulnerable populations are being exploited to advance your company’s commercial interest is profoundly alarming for numerous reasons,” she said in a letter to Kent Walker, Google’s legal and policy chief. “If some or all of the reporting was accurate, we would welcome your response as to what corrective action has been and will be taken.”

Google said it had hired contractors to scan the faces of volunteers to improve software that would enable users to unlock Google’s new phone simply by looking at it. The company immediately suspended the research and began investigating after learning of the details in The Daily News article, a Google spokesman said.

“We’re taking these claims seriously,” he said in a statement.

The dust-up is the latest example of the scrutiny facing tech companies over their development of facial-recognition technology. Critics say that such technology can be abused by governments or bad actors and that it has already shown signs of bias. Some facial-recognition software has struggled to accurately identify people with darker skin.

But even companies’ efforts to improve the software and prevent such bias are proving controversial, because they require the large-scale collection of scans and images of real people’s faces.

Google said it hired contractors from a staffing agency named Randstad for the research. Google wanted the contractors to collect a diverse sample of faces to ensure that its software would work for people of all skin tones, two Google executives said in an email to colleagues on Thursday. A company spokesman provided the email to The New York Times.

“Our goal in this case has been to ensure we have a fair and secure feature that works across different skin tones and face shapes,” the Google executives said in the email.

The unnamed person who told The Daily News that Randstad sent the contractors to Atlanta to focus on black homeless people also told the newspaper that a Google manager was not present when that instruction was given. A second unnamed contractor told The Daily News that employees were instructed to locate homeless people and university students in California because they would probably be attracted to the $5 gift cards volunteers received in exchange for their facial scans.

Randstad manages a work force of more than 100,000 contractors in the United States and Canada each week. The company, which is based in the Netherlands and has operations in 38 countries, did not respond to requests for comment. Google relies heavily on contract and temporary workers; they now outnumber its full-time employees.

Several unnamed people who worked on the facial-recognition project told The Daily News that Randstad managers urged the contractors to mislead participants in the study, including by rushing them through consent forms and telling them that the phone scanning their faces was not recording.

The Google executives did not confirm those details in their email. They said that the tactics described in the article were “very disturbing.” Google instructed its contractors to be “truthful and transparent” with volunteers in the study by obtaining their consent and ensuring they knew why Google was collecting the data, the executives said in the email.

“Transparency is obviously important, and it is absolutely not okay to be misleading with participants,” they said.

A Google spokesman said that the volunteers’ facial scans were encrypted, were used only for the research and would be deleted once the research was completed.

In 2017, an Apple executive told Congress that the company developed its facial-recognition software using more than a billion images, including facial scans collected in its own research studies.

“We worked with participants from around the world to include a representative group of people accounting for gender, age, ethnicity and other factors,” the executive said.