
Microsoft Provided Gender Detection AI by Accident

Microsoft said it would retire its AI-powered gender classifier in 2022. Now it says some users still had access to it because of an error.
Photo by Sam Torres / Unsplash

Microsoft has been giving users access to an AI-powered image analysis model that claims it’s able to detect a person’s gender, years after promising to retire the technology.

In 2022, Microsoft announced it would phase out access to facial analysis tools that claim to detect a person’s age, emotion, and gender. This technology, and gender classification tools specifically, have been criticized for often being wrong and for being particularly harmful to transgender people.

“API access to capabilities that predict sensitive attributes also opens up a wide range of ways they can be misused—including subjecting people to stereotyping, discrimination, or unfair denial of services,” Microsoft said at the time. “To mitigate these risks, we have opted to not support a general-purpose system in the Face API that purports to infer emotional states, gender, age, smile, facial hair, hair, and makeup. Detection of these attributes will no longer be available to new customers beginning June 21, 2022, and existing customers have until June 30, 2023, to discontinue use of these attributes before they are retired.”

404 Media learned from Ada Ada Ada, an algorithmic artist whose practice involves processing photographs of herself through a variety of major AI image analysis tools and social media sites, that Microsoft was still providing the gender detection tool. Ada Ada Ada told me she set up version 3.2 of Microsoft’s Image Analysis API to detect her age and gender based on her photographs in early 2022, before Microsoft announced it was retiring gender detection. Her continued access showed that Microsoft never turned the tool off, and the company did not know people could still use it until I reached out for comment.
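For context, version 3.2 of the API exposes an Analyze Image operation whose “Faces” visual feature returned an estimated age and gender for each detected face before Microsoft deprecated those attributes. The sketch below is illustrative only, based on Microsoft’s published v3.2 REST interface; the endpoint, key, and image URL are placeholders, and it is not necessarily how Ada Ada Ada’s own setup works.

```python
# Minimal sketch of a call to version 3.2 of Microsoft's Image Analysis API.
# Based on the documented v3.2 REST interface; endpoint, key, and image URL
# are placeholders, not real credentials.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR-API-KEY"  # placeholder

# The v3.2 Analyze Image operation; "Faces" was the visual feature that
# returned age and gender estimates before Microsoft deprecated them.
url = f"{ENDPOINT}/vision/v3.2/analyze"
params = {"visualFeatures": "Faces"}
headers = {
    "Ocp-Apim-Subscription-Key": KEY,
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/photo.jpg"}  # placeholder image URL

response = requests.post(url, params=params, headers=headers, json=body)
response.raise_for_status()

# In this version, each detected face carried "age" and "gender" fields.
for face in response.json().get("faces", []):
    print(face.get("age"), face.get("gender"))
```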

Microsoft said the most recent Image Analysis API (4.0), which is in public preview, does not include age or gender detection, and that it has deprecated those functions in previous versions of the API, meaning no new customers should have access to them. While version 3.2 of Image Analysis remains generally available, Microsoft said, customers should not have had access to the deprecated age and gender function. After I reached out for comment, Microsoft concluded that an error caused a very small number of customer applications to retain their access to the age and gender functionality, and said it is taking immediate action to correct this and remove that access.

💡
Do you know anything else about Microsoft's AI tools? I would love to hear from you. Using a non-work device, you can message me securely on Signal at emanuel.404, or send me an email at emanuel@404media.co.

“My issue with this has always been that Microsoft managed to get some undeserved good press by stating that they would retire these services, citing ethical concerns,” Ada Ada Ada told me. “It's my impression that no one really took their time to make sure this harmful technology was actually retired. They did not care that it actually got taken out of commission by the time they said they would. What mattered was just that they could reap the rewards of that good press, hence only caring about new user access, not existing customer access. If it weren't for us bringing this matter to them, it would likely have been available for years.” 

“I think this issue speaks to a lot of corporate talk surrounding ‘ethical AI,’” she added. “AI companies claim to follow these vague principles, but do not actually offer any researchers, journalists or even artists the means to properly follow up on them. We just have to take their word for it.”

“You can’t actually tell someone’s gender from their physical appearance,” Os Keyes, a researcher at the University of Washington who has written a lot about automated gender recognition (AGR), wrote for Logic in 2019. “If you try, you’ll just end up hurting trans and gender non-conforming people when we invariably don’t stack up to your normative idea of what gender ‘looks like.’”

“This is exactly what I expected when they [Microsoft] deployed it [gender detection] full stop,” Keyes told me in an email. “The way that many companies, particularly vast ones such as Microsoft, develop and deploy technologies makes it inevitable that responsibilities will fall through the cracks. Perhaps if the executive team had spent less time firing employees and hawking generative AI products that don't work, they might have the continuity and reliability of staff and processes to do little things like ‘control their own products.’”

Access Now, a digital civil rights non-profit, also ran a campaign to ban AI-powered gender classifiers in the European Union.
