Ann Arbor tech researcher explains how software can do harm

This article is the first in a new series on diversity, equity and inclusion efforts in Washtenaw County’s tech sector. Support for this series is provided by Ann Arbor SPARK.

Bias is present in all decision-making processes, but in the world of technology, programmers’ biases can have particularly damaging effects on the people their products are intended to serve. Meg Green (they/them), an Ann Arbor-based senior user experience researcher for Rocket Homes, has explored this subject at length in their personal research. Green’s work focuses primarily on research, but they also work with developers and designers, sharing the challenges that biased data can create.

Green gives an example of how data can be biased in the home-buying process.

“Customers who buy or sell homes want to easily find a home in a neighborhood they’ll love living in, and there is public information about a neighborhood that can easily be searched,” says Green. “The most frequently asked question concerns crime statistics.”

Green points out that the more police officers there are in an area, the more crimes get reported there, which can skew the data and the way it is analyzed.
Meg Green.
“Anyone can access information that is in the public domain about the city they want to live in, but the way the information is used for data can perpetuate a problem that already exists,” says Green.
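To make that point concrete, here is a minimal sketch of the mechanism Green describes. The neighborhoods, rates, and detection probabilities below are invented for illustration and are not from Green’s research; the sketch only shows how uneven police presence can make two areas with the same underlying incident rate look very different in “objective” public statistics:

```python
import random

random.seed(42)

# Two hypothetical neighborhoods with the SAME underlying incident rate.
TRUE_INCIDENT_RATE = 0.05   # invented: 5% of residents involved in an incident per year
POPULATION = 10_000

# Probability that an incident is detected and reported, scaled by police presence
# (both figures are invented for this example).
detection_rate = {
    "Neighborhood A (heavily patrolled)": 0.90,
    "Neighborhood B (lightly patrolled)": 0.30,
}

for name, p_detect in detection_rate.items():
    # Roughly the same true number of incidents occurs in both places.
    true_incidents = sum(random.random() < TRUE_INCIDENT_RATE for _ in range(POPULATION))
    # Only detected incidents ever appear in the public dataset.
    reported = sum(random.random() < p_detect for _ in range(true_incidents))
    print(f"{name}: true incidents ~{true_incidents}, reported incidents = {reported}")

# A home-search tool that ranks neighborhoods by *reported* crime would label A
# far more dangerous than B, even though the underlying rates are identical;
# the skew comes from how the data was collected, not from the neighborhoods.
```

Run as-is, the heavily patrolled neighborhood reports roughly three times as many incidents, which is the kind of pre-existing problem Green says the data can quietly perpetuate.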

Bias can also play a powerful role in artificial intelligence (AI), and Green and others in the industry are reassessing its effects. For example, Green points to an AI product called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), which uses demographic information to assess how likely offenders are to reoffend. Evaluations from COMPAS and similar programs are sometimes used to determine the length of prison or probation sentences, despite evidence that they are biased against Black defendants.

“When it comes to a computer, people are more likely to trust the data,” says Green. “… Data is just statistics, but if you put in biased data, you will get biased data.”

Bias can also come into play in the machine learning algorithms used to teach AI how language works. The ways we categorize people, and the terminology we use, have changed over time, and context matters a great deal in language. Unfortunately, AI can absorb biased associations around gender and race.

“The words ‘doctor’ and ‘nurse’ are gender-neutral in English, but when translating into German, Google Translate uses the masculine term for ‘doctor’ and the feminine term for ‘nurse,’” says Green. “The AI tries to assume the context to determine whether to use the masculine or feminine word in languages that use gender signifiers.”
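A minimal sketch of the underlying mechanism, with a deliberately skewed toy corpus invented for illustration (nothing here is from Green’s talk or from Google Translate’s actual system): a model that simply counts which pronoun co-occurs with each profession will default to whatever skew is present in its training text.

```python
from collections import Counter, defaultdict

# A tiny, deliberately skewed "training corpus" (invented for illustration).
corpus = [
    "he is a doctor", "he is a doctor", "he is a doctor",
    "she is a doctor",
    "she is a nurse", "she is a nurse", "she is a nurse",
    "he is a nurse",
]

# Count how often each profession appears with each pronoun.
cooccurrence = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    pronoun, profession = words[0], words[-1]
    cooccurrence[profession][pronoun] += 1

# A system that must pick a pronoun (or a grammatical gender) for an ambiguous
# input just echoes the skew in the data it was trained on.
for profession, counts in cooccurrence.items():
    guess = counts.most_common(1)[0][0]
    print(f"{profession}: {dict(counts)} -> default pronoun '{guess}'")

# Output: 'doctor' defaults to 'he' and 'nurse' defaults to 'she'.
# The statistics are accurate about the corpus, but the corpus itself is biased.
```

In other words, the model’s “assumption about context” is nothing more than the statistics of the text it was fed, which is exactly why biased data in produces biased data out.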

Algorithms can also build negative associations around certain racial or gender identifiers such as “Black,” “female,” or “gay.” For example, a Google search for “black girls” used to return results that were primarily pornography, and including the word “transgender” in video titles has caused YouTubers’ advertising revenue on those videos to drop.

“Being gay or being Black or being a trans woman doesn’t mean these things are negative and you don’t want to read this information,” Green says. “According to some of the biased data found in AI, anything about being bisexual or gay is pornographic and unacceptable for children.”

Green suggests that tackling these biases can be as simple as sending software developers along with user experience researchers when they interview the target users of their software.

“They can empathize with people and see their situations,” says Green. “Developers are more empathetic and can now talk to their team about those situations. It’s about developing that empathy and helping people understand the biases that have come in.”

Bias in software can have harmful consequences, and Green says it’s up to programmers to reverse them.

“A machine algorithm won’t learn unless we teach it better and help reprogram the data,” says Green.

For more information on AI and gender roles, watch Green’s video presentation on “Gender and artificial intelligence.”

Monica Hickson is a freelance writer currently based in Ypsilanti. She joined Focus as a journalist in 2020 and is the author of a book, “COVID Logs.” You can reach her at [email protected].

All photos by Doug Coombe.

