We Need More Women Working in AI

Author: Kim Z. Dale, CISA, CISSP
Date Published: 6 March 2020

There is a gender gap in artificial intelligence. A study by the World Economic Forum and LinkedIn found that only 22 percent of AI professionals are women. Research by the AI Now Institute found that women make up only 15 percent of the AI research staff at Facebook and only 10 percent at Google. Although the gender gap in AI echoes those in cybersecurity and information technology generally, the repercussions of a lack of diversity in AI are broader because the details of how these systems work are not fully known. As a result, it is difficult to identify and correct bias introduced by the decisions of the development teams or the data they select to train their algorithms.

There may be a gender gap behind the scenes of AI, but female voices abound in our increasingly ubiquitous digital assistants. A paper from the United Nations Educational, Scientific, and Cultural Organization (UNESCO) decries the impact of female AI personas such as Amazon’s Alexa and Apple’s Siri. As the report describes, “Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile, and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.” These interactions reinforce existing gender stereotypes and can create new ones as these systems spread into communities that do not currently share those same prejudices.

Furthermore, the UNESCO report identified multiple examples where the digital assistants responded with coy or flirtatious language when prompted with sexual comments or verbal abuse. For example, at the time of the report Alexa’s standard response to “You’re a slut” was “Well, thanks for the feedback.” One of Siri’s responses to the same comment was “I would blush if I could.” These reactions bolster sexist tropes.

Although AI systems often simulate women, they frequently perform less effectively for them. For example, voice recognition systems have repeatedly been found to be less accurate for women than for men, and hiring software has been known to penalize resumes that use language typically associated with women. Women are at a disadvantage when such systems are in use, and some AI biases can even be life-threatening.

In her book Invisible Women: Exposing Data Bias in a World Designed for Men, author Caroline Criado Perez notes that AI systems are being introduced to assist with medical diagnoses despite the fact that there are large gaps in medical data about women. Criado Perez writes, “With our body of medical knowledge so heavily skewed towards the male body, AIs could make diagnosis for women worse, rather than better.”

These are only a few examples of AI systems that misrepresent women, are not effective for women, or outright put women in danger. Reducing the gender gap in AI will not immediately fix all these problems, but it will help. As the European Commission’s Ethics Guidelines for Trustworthy Artificial Intelligence states, “It is critical that, as AI systems perform more tasks on their own, the teams that design, develop, test and maintain, deploy, and procure these systems reflect the diversity of users and of society in general.” (The guidelines go on to say that AI team diversity should include gender, culture, and age as well as different professional backgrounds and skillsets.)

As we mark International Women’s Day on Sunday, it is clear that we need more women working in AI. We need women designing, building, and testing AI systems. We need women in the venture capital firms that fund AI, the boardrooms that govern it, the institutions that regulate it, and the teams that audit it. We need women in AI because half of the world’s population should not be considered an edge case.