Newsletter - July 2020 - AI bias
The Black Lives Matter movement, revived by the tragic death of George Floyd, has spread all over the world. This tragedy has cast a harsh light on the discrimination that still undermines our societies. Technology, and artificial intelligence in particular, is not exempt from the self-examination we must carry out collectively. Conceived by humans, these systems tend to reproduce human prejudices. This raises many questions about how AI is designed, by whom, and how we can ensure it is ethical.
In 2018, Joy Buolamwini, a Black researcher at MIT, was surprised to find that facial recognition software she was working with recognized her friends' faces but not her own. When she put on a white mask, the algorithm identified her immediately. The experiment prompted her to devote her thesis to the subject.
Video: https://www.youtube.com/embed/TWWsW1w-BVo
After she tested facial recognition systems from IBM, Microsoft and Face++, the results were clear: all of them recognize men better than women and are more accurate on lighter skin than on darker skin. There is no shortage of examples of the discrimination this type of flaw causes:
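To make the nature of this finding concrete, here is a minimal, illustrative sketch of how accuracy can be compared across demographic subgroups. It is not Buolamwini's actual methodology; the function name, group labels and data are hypothetical.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute classification accuracy separately for each demographic group.

    `records` is a list of (group, predicted_label, true_label) tuples.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# Hypothetical example: a gender classifier evaluated on two skin-tone groups.
records = [
    ("lighter-skinned", "male", "male"),
    ("lighter-skinned", "female", "female"),
    ("darker-skinned", "male", "female"),   # misclassification
    ("darker-skinned", "female", "female"),
]
print(accuracy_by_group(records))
```

A large gap between the per-group accuracies is exactly the kind of disparity reported for the systems above.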
Human, all too human
There is too often a tendency to think of AI as an entity that arose ex nihilo. Yet it learns only from the images and information it is given; in other words, it learns only in the image of those who conceive it. It is therefore not surprising that it leads to racist and sexist decision-making when the algorithms behind it are overwhelmingly developed by white men. "Algorithms are not endowed with consciousness; they are built by people," says Mouhamadou Moustapha Cissé, head of Africa's first artificial intelligence research centre. "It is, in part, the biases of these people that affect the way the algorithms work."
Intrinsic defects
In an op-ed, Nelly Chatue-Diop and Soumia Malinbaum, respectively Chief Data Officer at Betclic Group and Vice President of Business Development at Keyrus, point out three more technical weaknesses in the way these algorithms are designed:
Awareness
In recent years, there has been a growing awareness of these issues:
Yes we can
Change for the better is possible. "Noting stigmatizing differences in Google Translate's output in some languages, Google made sure to generate gendered translations for every sentence where the original model offered only a masculine version. Indicators were put in place to measure progress, and this bias has been reduced by more than 90% in translations from Hungarian, Finnish, Persian and Turkish into English. The new model is all the more relevant, and now understands that in these languages 'doctor' and 'engineer' can be female!" Nelly Chatue-Diop and Soumia Malinbaum are delighted.
This is more than ever a major issue, as AI makes its way into every corner of our daily lives. It is up to us to shape it in the image of the society we want to build.
Isahit: ethics at heart
Collaborating almost exclusively with young Black women from developing countries, isahit contributes fully to the development of a more ethical AI, one made in the image of society as a whole and respectful of all its diversity. By annotating data and supervising machine learning algorithms for our clients, these women work every day to avoid the biases those algorithms could generate. The challenge is not only ethical: diversity is also one of the essential keys to the accuracy and efficiency of AI. Very concretely:
It is through the diversity of its community that isahit can guarantee its clients accurate and valid, specific and contextualized data.
"At isahit, we also put a lot of emphasis on the formation of our community. Our goal is to help these women achieve their professional goals while giving them the keys to take full part in the ongoing digital revolution.
Isabelle Mashola, CEO of isahit
We have a wide range of solutions and tools that will help you train your algorithms. Click below to learn more!