Exposing the Bias Embedded in Tech

“It was penalizing résumés that had the word ‘women’ in it, such as if you went to a women’s college,” she said. “Your résumé was spit out.”

Research has shown that facial recognition is far more accurate with lighter-skinned men than with women and, especially, with darker-skinned people.

Other examples abound. A Microsoft customer was testing a financial-services algorithm that did risk scoring for loans. “As they were training the data set, the data was of previously approved loans that largely were for men,” Ms. Johnson said. “The algorithm clearly said men are a better risk.”

In the résumé-screening case, the computer models were being trained on résumés submitted over the previous 10 years, most of which came from men. The system was therefore “taught” that men were better job candidates.
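The mechanism Ms. Johnson describes, a model inheriting bias from the skewed historical data it is trained on, can be sketched in a few lines of code. The example below is purely illustrative: it uses synthetic data and a generic logistic-regression model, not the actual systems mentioned in the article, to show how training on past approvals that favored men yields a model that scores an otherwise identical male applicant higher.

    # Illustrative sketch only: synthetic data, not the systems described above.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000

    # Two features: a normalized income figure and a gender flag (1 = man).
    income = rng.normal(size=n)
    is_man = rng.integers(0, 2, size=n)

    # Historical approvals favored men independently of income, so the
    # labels the model learns from already encode the bias.
    approved = (income + 1.5 * is_man + rng.normal(scale=0.5, size=n)) > 1.0

    X = np.column_stack([income, is_man])
    model = LogisticRegression().fit(X, approved)

    # Two applicants with identical income, differing only in the gender flag.
    woman, man = [0.5, 0], [0.5, 1]
    p_woman, p_man = model.predict_proba([woman, man])[:, 1]
    print(f"P(approve | woman) = {p_woman:.2f}")
    print(f"P(approve | man)   = {p_man:.2f}")  # noticeably higher

Dropping the explicit gender column does not necessarily fix this, because other features can act as proxies for it; that is part of why the biased outcomes described here were not obvious to the teams building the systems.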

For example, she said, tenants in Brooklyn are fighting a landlord who wants to replace a lock-and-key entry system with facial recognition. In opposing the plan, the AI Now Institute backed the tenants, citing their fear of increased surveillance and the risk that facial recognition’s inaccuracy, especially with nonwhite faces, would leave them locked out.

“The people at the top look more and more the same,” she said. And fewer, not more, women are getting bachelor’s degrees in computer science.


Source: Women 2.0


Similar News: You can also read news stories similar to this one, collected from other news sources.

San Francisco Will Use AI To Thwart Racial Bias When Charging Suspects: In July, the San Francisco District Attorney’s office will implement a “first-in-the-nation” AI tool used to prevent prosecutors from being racially biased when deciding criminal charges.

‘Rafiki’ Is A Stunning Lesbian Love Story – In A Place Where That’s Forbidden: On Good Trouble, we saw a rare instance of a Black woman addressing Black men’s unspoken preference for white women, without reducing her to the stereotype of being bitter.

Genius Claims Google Stole Lyrics Embedded With Secret Morse Code: The lyrics site used two types of apostrophes that, when converted to dots and dashes, spelled out “Red Handed.”



