According to a study published by the Massachusetts Institute of Technology (MIT), researchers compared facial-recognition tools from five different companies, including Microsoft and IBM, and found that none was 100% accurate. Amazon’s Rekognition tool performed the worst when it came to recognising women with darker skin.
Amazon responded by calling the research “misleading”. The study claims Amazon’s tool had an error rate of 31% when identifying images of women with dark skin, compared with a 22.5% rate from Kairos, which offers a rival commercial product, and a 17% rate from IBM.
By contrast, Amazon, Microsoft and Kairos all successfully identified images of light-skinned men 100% of the time. In a blog post, Dr Matt Wood, general manager of artificial intelligence at AWS, highlighted several concerns about the study, including that it did not use the latest version of Rekognition. He said the findings from MIT did not reflect Amazon’s own research, which had used 12,000 images of men and women of six different ethnicities.
So the question remains: is artificial intelligence racist? (lolz)
Please don’t forget to like and follow us on Facebook and Twitter; links are down below.