The gender prejudice of AI

UNESCO has issued a warning about the impact of gender prejudice coded into popular artificial intelligence applications such as digital voice assistants. In its I’d Blush if I Could report, published together with the Equals Skills Coalition and the German government, the organisation points to the gendered submissiveness and servility expressed by ‘female’ digital assistants.

The report is in line with recent research from New York University’s AI Now Institute, which finds that the AI field is characterised by flawed systems that preserve gender and racial biases.

“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices. Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves,” says Saniye Gülser Corat, director of gender equality at UNESCO. “To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

The report suggests that the teams developing these technologies must become more gender-balanced. Today, only 12% of AI researchers are women, women make up a mere 6% of software developers, and men are 13 times more likely than women to file ICT patents.

Paradoxically, the research finds that countries closer to achieving overall gender equality have fewer women pursuing the skills needed for careers in the tech sector. In Belgium, for example, only 6% of ICT graduates are women, while in the United Arab Emirates the figure is 58%.

“This paradox underscores the need for measures to encourage women’s inclusion in digital skills education in all countries,” the report states.
