
Image: Lola Dupre

Google “construction worker” images and you’ll see a lot of stock art: men carrying lumber and standing in front of excavators with their arms crossed—even the Village People, if you scroll down far enough. But you’ll have to search for something like “construction worker woman” to find women in hard hats, including photos that look more like inspiration for a sexy Halloween costume. See the woman pulling her pigtail, two traffic cones on her chest a la Madonna? Or the one wearing a tight, low-cut top, hammer raised, hand on her hip?

A few years ago, a team of researchers at the University of Washington wondered how image search results for occupations like “construction worker” and “receptionist” represented gender. One of the most persistent biases in the United States is against women in the workplace, the team said in a subsequent paper about their results. And featuring too few women in positions like, say, chief executive officer can perpetuate that prejudice.

Of course, few CEOs are women. In the United States, only 27 percent of this corner-office crew are female. But when researchers typed “CEO” into Google Images? Eleven percent of the people depicted were women—less than half the share of Americans who have cracked that cliché glass ceiling. In other searches, women were overrepresented, exaggerating gender stereotypes. About half of telemarketers are men, but 64 percent of the image search results depicted women.

In a keynote about bias at the Neural Information Processing Systems conference in December, Kate Crawford, a Microsoft researcher who studies the social implications of artificial intelligence, asked how many in the audience had recently searched for images of CEOs. 

“You’re going to get a lot of white dudes in suits,” she quipped. But she had wanted to know which female CEO would be pictured first. People chuckled as she drew their attention to the 70th thumbnail in the bottom right corner of a slide showing the results from a search about six months earlier.

“Can you kind of guess?” she said in a crisp Australian accent. “It’s CEO Barbie.”

Artificial intelligence—powerful, evolving, and ever expanding into our daily lives—is not immune to some very human flaws. Maps showing zip codes with large African American populations bypassed by Amazon’s same-day delivery service, for example, were “eerily familiar,” like old redlining maps, Crawford said during her keynote. “Long histories of discrimination live on in our digital systems often for very complex reasons.” 

But as principal researcher of the Fairness, Accountability, Transparency, and Ethics in AI group at Microsoft, she’s among a growing group of data scientists considering what to do about it. The Redmond tech giant founded FATE in 2015, and its four members—all women—are working to better balance fairness and functionality in the machine-learning world. 

Image: Courtesy Microsoft

Timnit Gebru, a researcher with the Fairness, Accountability, Transparency, and Ethics in AI group at Microsoft, thinks artificial intelligence needs some “guardrails” against bias.

“AI has tremendous potential to transform and provide benefits to society,” from health care to agriculture, says Timnit Gebru, one of the group’s researchers. But without “guardrails,” artificial intelligence may negatively target those who are already marginalized, like women or people of color. After reading a 2016 investigation by ProPublica that found software used across the country to predict future criminals is biased against blacks, “I realized how scary of a problem it was,” Gebru says. 

Algorithms—the mathematical instructions that let computers learn on their own—can be trained on biased data, but machines can also amplify the prejudices of the people using them, says Dan Weld, a computer scientist at UW. If people who are searching for images of CEOs click more on images of men than women, Weld says, the computer may assume those are the kinds of photos it should show in future searches. The clicks are positive reinforcement.
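The feedback loop Weld describes can be sketched in a few lines. This is a toy illustration, not how any real search engine works: the image names, click behavior, and ranking rule are all invented for the example.

```python
# Toy model of click-based positive reinforcement: a ranker that orders
# images by accumulated clicks. All names and numbers are illustrative.
from collections import Counter

def rank_by_clicks(images, clicks):
    """Order images by click count; ties keep their original order."""
    return sorted(images, key=lambda img: -clicks[img])

# A result set that already skews male.
images = ["man_1", "man_2", "woman_1", "man_3"]
clicks = Counter()

# If users keep clicking the top (male) result, it gains reinforcement
# and stays on top, locking in the initial skew.
for _ in range(10):
    clicks[rank_by_clicks(images, clicks)[0]] += 1

print(rank_by_clicks(images, clicks))  # "man_1" remains first
```

The point of the sketch is that the bias is self-reinforcing: the initial ordering determines what gets clicked, and the clicks then justify the ordering.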

Of course, those images are confined to a browser, just like the advertisements you see on social media, also controlled by algorithms. But Weld expects the boundary between online and real life to grow increasingly fuzzy as the digital and physical worlds become more similar. Consider a recent ProPublica and New York Times story that reported on Facebook job ads that targeted younger users, raising concerns about age discrimination.

Companies can act against offensive content, Crawford explained, by “scrubbing to neutral”—removing the biased data—or breaking a problematic association (in 2015, Google Photos infamously tagged images of black people as “gorillas”). 

But, she mused during her recent conference address, who gets to decide what should be removed and what’s neutral? 

“We know that less than 8 percent of CEOs in the world right now are women,” Crawford said. “Does that mean that your image search results should show 8 percent or less of women?” Or, she said, do we acknowledge studies showing women have faced discrimination in pursuit of the C-suite and try to change the image search results to have distributions we think would be fairer? 
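Crawford’s question can be made concrete with a re-ranker that interleaves results to approximate a chosen target proportion. This is a minimal sketch under assumed inputs—the two pools, their labels, and the target value are all hypothetical—meant only to show that the hard part is choosing the number, not the code.

```python
# Minimal sketch of distribution-aware re-ranking: interleave two pools so
# that roughly `target` of every prefix of the output comes from the
# underrepresented group. Pools and target are illustrative assumptions.

def rerank(majority, minority, target):
    """Greedily pick from `minority` whenever its share of the output so
    far falls below `target`; otherwise pick the next `majority` item."""
    out, m_count = [], 0
    a, b = list(majority), list(minority)
    while a or b:
        need_minority = b and (not out or m_count / len(out) < target)
        if need_minority or not a:
            out.append(b.pop(0))
            m_count += 1
        else:
            out.append(a.pop(0))
    return out

men = [f"man_{i}" for i in range(8)]
women = [f"woman_{i}" for i in range(2)]
# Target 0.27 to mirror the U.S. share of female CEOs, or a higher figure
# for a "fairer" distribution -- that choice is the editorial judgment
# Crawford is pointing at, and no algorithm makes it for you.
print(rerank(men, women, 0.27))
```

Whatever value goes into `target` encodes a normative decision: mirror the world as it is, or as it arguably should be.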

The answer isn’t straightforward, she admitted, and there’s no silver bullet.

But diversity is at least part of the solution, Gebru says—in data and researchers. 
