this post was submitted on 06 Jul 2023
164 points (98.2% liked)


New York City businesses that use artificial intelligence to help find hires now have to show the process was free from sexism and racism.

nodsocket@lemmy.world 2 points 1 year ago

One easy way to prove that an AI is racist is to check whether race is one of its input parameters. However, an AI can be racist even without such an explicit parameter.

reilwin@lemmy.world 17 points 1 year ago

Not necessarily; something that overt would be obvious and avoided. I'm pretty sure what they're looking for is subtler bias caused by bad datasets used to train the AI.

For instance, you train the AI and tell it which candidates are good or bad. But maybe, by pure happenstance, the best candidates in your dataset are all male. If so, the AI might be accidentally trained to believe that all good candidates are male.
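As a rough illustration of that failure mode, here is a minimal sketch with toy data and invented feature names: a naive scorer trained on labels where every "good" candidate happens to be male can latch onto a gender-correlated proxy, even though gender is never an input.

```python
# Minimal sketch (hypothetical data and scoring) of how a naive model trained
# on labels where all the "good" candidates happen to be male can pick up a
# gender-correlated proxy, even though gender is never an input feature.

from collections import defaultdict

# Toy training set: each resume is a set of tokens plus a "good hire" label.
# Gender is NOT a field, but "mens_soccer" / "womens_soccer" act as proxies.
training = [
    ({"cs_degree", "python", "mens_soccer"}, True),
    ({"cs_degree", "internship", "mens_soccer"}, True),
    ({"python", "internship", "mens_soccer"}, True),
    ({"cs_degree", "python", "womens_soccer"}, False),  # by happenstance
    ({"internship", "womens_soccer"}, False),
    ({"cs_degree", "womens_soccer"}, False),
]

# "Train": for each token, record how often it appears on a good-labelled resume.
good_rate = defaultdict(list)
for tokens, label in training:
    for t in tokens:
        good_rate[t].append(1 if label else 0)
weights = {t: sum(v) / len(v) for t, v in good_rate.items()}

def score(tokens):
    """Average the learned per-token weights; unknown tokens get a neutral 0.5."""
    return sum(weights.get(t, 0.5) for t in tokens) / len(tokens)

# Two candidates with identical qualifications, differing only in the proxy token.
print(score({"cs_degree", "python", "mens_soccer"}))    # high score
print(score({"cs_degree", "python", "womens_soccer"}))  # penalised by the proxy
```

Nothing in the training data mentions gender, yet the two equally qualified candidates get very different scores purely because of the proxy token.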

DreamerOfImprobableDreams@kbin.social 7 points 1 year ago (last edited 1 year ago)

I remember hearing about a high-profile case where the AI docked points if someone's resume listed women's sports as an extracurricular, while giving extra points if it listed men's sports.

Also, bias doesn't necessarily have to come from happenstance. Unfortunately, humans tend to have unconscious (or, sometimes, not-so-unconscious) biases against women and people of color. There was a study where researchers sent identical resumes to a random group of recruiters-- but half of the resumes had a male name and half had a female name.

They found that both male and female recruiters were more likely to rate the resumes with the male name higher and to recommend that those candidates advance to the next round of interviews. IIRC, similar studies have found similar results when the resumes are given a "Black-sounding" name versus a "white-sounding" name.

So if you train an AI on your own company's hiring data-- which is likely to be tainted by the unconscious bias of your own recruiters and hiring managers-- then the AI might pick up on that and replicate it in its results.
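One way to catch that kind of replicated bias is to audit the model's outputs rather than its inputs. Here is a rough sketch, with made-up numbers, of an outcome-level check that compares selection rates across groups against the classic four-fifths rule of thumb.

```python
# Rough sketch of an outcome-level check: instead of inspecting the model's
# inputs, compare selection rates across groups in its output. The 80%
# ("four-fifths") rule of thumb is one common yardstick; the data is made up.

def selection_rate(decisions):
    """Fraction of candidates in a group that the model advanced."""
    return sum(decisions) / len(decisions)

# 1 = advanced to interview, 0 = rejected (hypothetical audit data)
outcomes = {
    "male":   [1, 1, 0, 1, 1, 0, 1, 1],
    "female": [1, 0, 0, 1, 0, 0, 1, 0],
}

rates = {group: selection_rate(d) for group, d in outcomes.items()}
best = max(rates.values())  # selection rate of the most-selected group

for group, rate in rates.items():
    impact_ratio = rate / best
    flag = "OK" if impact_ratio >= 0.8 else "possible adverse impact"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} -> {flag}")
```

A model that never sees race or gender as an input can still fail a check like this, which is why auditing outcomes matters at all.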

heyadern@kbin.social 2 points 1 year ago

Very interesting. Somehow, resumes should be ranked with points, without gender or race or name. A point system based on... I guess merit? Credentials? Experience?
I feel like this should be a real thing. Truly. But how?
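For what it's worth, here is a minimal sketch of that kind of blind point system, with invented field names and point values: identifying fields are stripped before scoring, and points come only from a fixed rubric.

```python
# Minimal sketch of a "blind" point system: strip identifying fields before
# scoring, and award points only from a fixed rubric. Field names and point
# values are invented for illustration.

IDENTIFYING_FIELDS = {"name", "gender", "race", "photo", "extracurriculars"}

RUBRIC = {
    "relevant_degree": 20,   # flat bonus for a relevant degree
    "certifications": 5,     # per certification
    "years_experience": 3,   # per year, capped below
}

def redact(resume):
    """Drop fields that could reveal who the candidate is."""
    return {k: v for k, v in resume.items() if k not in IDENTIFYING_FIELDS}

def score(resume):
    r = redact(resume)
    points = 0
    points += RUBRIC["relevant_degree"] if r.get("relevant_degree") else 0
    points += RUBRIC["certifications"] * len(r.get("certifications", []))
    points += RUBRIC["years_experience"] * min(r.get("years_experience", 0), 10)
    return points

candidate = {
    "name": "Jordan Doe",
    "gender": "F",
    "relevant_degree": True,
    "certifications": ["AWS", "PMP"],
    "years_experience": 6,
}
print(score(candidate))  # 20 + 10 + 18 = 48, with no identifying fields used
```

The catch, as the rest of the thread points out, is that redacting names and gender doesn't remove proxies like the sports example above, so a blind rubric still benefits from the kind of outcome audit sketched earlier.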
