Notes from the field

Judgmental A.I. mirror rates how trustworthy you are based on your looks

In an interesting piece of research intended to highlight the biases embedded in AI systems and their training data, a team at the University of Melbourne built a system that assesses the physical attributes and emotional state of people in a photo. Embedded in those assessments are a range of biases: gender is assumed to be binary, only five ethnicities are recognised, and judgments of traits such as "responsibility" are clearly highly subjective.
https://rob.al/2vE5zdw
Would you be freaked out if a facial recognition mirror started making judgments about your age, gender, race, attractiveness, and even trustworthiness? Get ready to meet the Biometric Mirror, a…

2018-08-03

linkedin cross-post



The standard disclaimer…

The views, thoughts, and opinions expressed in the text belong solely to me, and not necessarily to my employer, organization, committee, or any other group that I belong to or am associated with.

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
© 2023 Rob Aleck, licensed under CC BY-NC 4.0