Judgmental A.I. mirror rates how trustworthy you are based on your looks

In an interesting piece of research intended to highlight the biases embedded in AI systems and their training data, a team at the University of Melbourne produced a system that assesses the physical attributes and emotional state of people in a photo. Hidden in these assessments are a range of biases: gender is assumed to be binary, only five ethnicities are recognized, and judgements of traits such as "responsibility" are clearly highly subjective.
Would you be freaked out if a facial-recognition mirror started making judgments about your age, gender, race, attractiveness, and even trustworthiness? Get ready to meet the Biometric Mirror, a…