In Kenya, ChatGPT “coaches” speak out against their working conditions

2023-10-19 18:00:12

ChatGPT is not just an algorithm with an answer for almost everything. To build the famous artificial intelligence (AI) tool released at the end of 2022, thousands of workers were hired across the planet to train it to respond properly, in particular by teaching it to recognize and set aside some of the content it finds on the Internet.

Bill Mulinya did this work for five months, between the end of 2021 and the beginning of 2022. This 30-year-old Kenyan led a team of fifteen people who trained the future ChatGPT to recognize hateful, violent or harassing comments, on behalf of Sama, a major subcontractor for Big Tech. Under this contract with OpenAI, the chatbot’s creator, their job consisted of reading, all day long, texts gathered from every corner of the Web and attaching precise descriptors, or “labels”, to them in order to flag them for the algorithm. A second team was assigned to texts of a sexual nature.

“At first, when you start reading this content, it’s okay,” says Mr. Mulinya, in cap and shirt, sipping a smoothie on the terrace of a Nairobi café. “But when you read it continuously, it starts to seep into your head. One of my colleagues was a very jovial, extroverted person. When the project ended in March 2022, he was completely changed; he was afraid of everything.” Necrophilia, suicide, child abuse… Hour after hour, text after text, several former employees described developing anxiety, sleep or sexual problems. “At one point, almost the entire team asked me for leave,” he adds. “As a boss, you know that means there’s a problem.”

“Damage to mental health”

The contract was to last one year. But Sama requested its termination “immediately” after “the teams brought it to management’s attention,” that is, after only a few months, the subcontractor explains in an email, adding that it has announced it will permanently stop working on this type of content. For its part, OpenAI says, also by email, that it recognizes “the difficulty of this work” for subcontractors: “Their efforts to ensure the safety of artificial intelligence systems for users are of immense value.”

The value of this work is, in fact, at the heart of several cases that have shaken Nairobi’s tech ecosystem in recent months. The Kenyan capital, with its inexpensive, educated and English-speaking workforce, has become a subcontracting hub for Silicon Valley. The country even styles itself, somewhat grandly, the “Silicon Savannah”: the government wants to introduce the basics of computer coding from primary school onward and touts its youth as a talent pool for the Web giants, in the manner of an African Bangalore.
