‘Siri reinforces misogynistic prejudices’


AI voice assistants such as Apple’s Siri or Amazon’s Alexa, which often use a female voice, reinforce harmful gender stereotypes. That is the conclusion of a United Nations study, which the British broadcaster BBC reported on Tuesday.

According to the study by the UN organization UNESCO, female assistants in technological applications, such as Siri on the iPhone, are presented as ‘helpful and always willing to please’, which confirms the stereotype of the ‘submissive’ woman.

The researchers find it especially troubling that these assistant voices often respond to insults in an evasive, shallow, and barely apologetic manner.

The UNESCO report calls on technology companies not to give these voice assistants a female voice by default, but to opt for a more neutral synthesized voice.

Last year, an estimated 100 million of these smart speakers were sold worldwide. According to research by Gartner, some people will have more conversations with their assistant than with their own partner by 2020.