AI series #4

D@S Lab develops AI to benefit the unseen workers who make AI work

Artificial intelligence is this year’s theme at TU Delft. Delta highlights six AI studies. Part 4: how AI is backed up by millions of ghost workers.

Illustration generated by AI. (Image: New Media Center with Adobe Firefly)

Computers are getting smarter all the time because somewhere, far out of sight, there are humans feeding them with critical information. These hidden workers label products for web shops, classify pictures or texts for relevance, and moderate uploaded content for platforms. This unseen army of ghost workers puts in long hours under primitive conditions, as anthropologist and Microsoft researcher Mary L. Gray has revealed in her book Ghost Work. In America alone, some 20 million adults were already doing this work in 2016, earning an average of just under 40 euros for a 10-hour day, Gray writes. Others predict high growth in the sector in India, especially among young people with higher education.

In an ironic twist of fate, the D@S Lab now develops AI to improve the health and well-being of these unseen workers.


Hybrid intelligence

Developing healthy interventions (‘health bytes’) for anonymous information workers is one of the ways the D@S Lab (Design @ Scale) is experimenting with combining artificial and human intelligence. These forms of hybrid intelligence can scale up and accelerate design processes, argues Dr Ujwal Gadiraju, one of the directors of the D@S Lab. As other examples of the lab’s work, he cites the use of chatbots in interviews and video diaries as design tools.

If you scrutinise the shadows of AI, you will find humans working there. Gadiraju cares about these underpaid workers. “These are the golden words of Mary Gray, and they still ring true. People are finally becoming aware of the fact that these informal workers deserve fair wages. I’ve been working in this area for about 10 years, and now I finally see scientific articles arguing that paying fair wages is expected and the norm. These workers are real human beings, and it’s easy to forget that when they’re hidden behind screens.”

Thus, moderating becomes a physical kind of swiping

Take the example of one ‘Joan’ from Gray’s book. Joan reviews flagged pictures for a platform, images that the automatic censors can’t handle. Those pictures appear on her screen. Is it a dick pic that needs to be removed (click left), or is it an innocent finger so the picture can stay (click right)? Joan looks and clicks, looks and clicks.


Traumatising content

With motion recognition via the webcam, the instruction could be: stretch and bend to the left to delete, bend to the right to retain. Moderating thus becomes a physical kind of swiping. That way, information workers can keep working while also getting in a bit of movement, the idea goes.
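The left-to-delete, right-to-retain mapping could be sketched in a few lines. Everything below is an illustrative assumption, not the D@S Lab’s code: it presumes a webcam pose tracker already reports the horizontal position of the moderator’s shoulder midpoint, normalised to the range 0 to 1 across the frame, and it adds a neutral dead zone so small jitters do not trigger a decision.

```python
def classify_lean(shoulder_mid_x: float, threshold: float = 0.15) -> str:
    """Map a body lean to a moderation decision.

    A lean left of the frame centre means 'delete', a lean right means
    'retain'; offsets smaller than the threshold count as 'neutral' so
    ordinary fidgeting does not fire the wrong response.
    """
    offset = shoulder_mid_x - 0.5  # signed distance from the frame centre
    if offset < -threshold:
        return "delete"
    if offset > threshold:
        return "retain"
    return "neutral"
```

For example, `classify_lean(0.2)` returns `"delete"`, while `classify_lean(0.5)` returns `"neutral"`. The dead zone is one simple answer to the reliability worry raised below: a decision is only registered on a deliberate, pronounced lean.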

In three trials with about 100 participants each, crowdworkers expressed almost unanimous appreciation for the gesture capturing, Gadiraju says. “Most of the crowdworkers were happy that their situation was recognised and with our effort to improve it.”

Yet there are some pitfalls to this technique. The first is reliability. Does recognition also work well enough in cramped and dark spaces? And the programme must not make mistakes: gesture recognition producing the wrong response (delete instead of retain, or vice versa) would be unacceptable.

Another question is whether gesture recognition affects work rate, and how to deal with that. “Moving instead of clicking may be healthier, but if it reduces someone’s production, who pays for the difference in income?” asks Gadiraju. Thus, there are still numerous bumps to overcome before exercise can be used functionally.

Moderation of content is another field where AI can assist crowdworkers. In every content moderation setting you will see humans keeping out explicit content, abuse, and violence. But looking at traumatising content all day long can cause mental health problems such as insomnia, depression, and post-traumatic stress disorder.

The D@S lab investigates how AI can help by blurring the content intelligently. The moderator would then still get a good idea of the content, without being exposed to bloody and gory details. “An edited and blurred image is often good enough to recognise blood, weapons or other undesirables,” Gadiraju says.
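In a much-simplified form, such intelligent blurring amounts to softening only the region a detector has flagged, leaving the rest of the image sharp enough to judge. The sketch below is an assumption for illustration, not the lab’s implementation: it applies a 3x3 box blur to a flagged rectangle in a greyscale image represented as a list of lists of pixel values.

```python
def blur_region(img, top, left, height, width):
    """Return a copy of a greyscale image with a 3x3 box blur applied
    only inside the flagged rectangle, so gory detail is softened while
    the surrounding context stays recognisable."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(top, min(top + height, h)):
        for x in range(left, min(left + width, w)):
            total, count = 0, 0
            # Average the pixel with its in-bounds neighbours.
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out
```

Pixels outside the flagged rectangle are copied unchanged, which is the point: the moderator still gets “a good idea of the content” while the graphic region is smoothed away. A real system would of course run on colour frames and use a detector to find the regions to blur.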

Reflecting on what lies ahead for the D@S Lab, Gadiraju says that “we have four doctoral students, each of whom is doing their PhD at two faculties. What I would like to achieve is to make an impact in all four research areas that we are currently working in.”


Motion recognition

Of course, Gadiraju also realises that a webcam with motion recognition won’t do much to improve the condition of crowdworkers. The project can, however, help make them less invisible.

He later writes: ‘I am currently working with some leading experts in the field of crowdsourcing from around the world to spell out exactly what is needed to ensure that the future of crowd work is brighter than the mixed shades that we have observed during the foundational years of this paradigm of work. A part of that answer resides in safeguarding workers’ health and building mechanisms that can foster sustained growth for workers, not only from the standpoint of monetary rewards but also in terms of career trajectories.’

Author and researcher Mary L. Gray holds the software companies responsible: ‘Just as we need companies to be accountable for the labor practices that produce our food, clothes and computers, so should the producers of digital content be accountable to their consumers and workers.’ (From: Ghosts in the Machine).

  • Dr Ujwal Gadiraju is D@S Lab Director from the Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS). He co-heads the lab with Dr Jie Yang (EEMCS) and Dr Evangelos Niforatos (Faculty of Industrial Design and Engineering, IDE). They work with four PhD students and IDE Professor Gerd Kortuem.
  • See the D@S Lab website for more information on the many courses and master projects. Read more about the Ghost Work book by Mary L. Gray and Siddharth Suri on the dedicated website. Lastly, the 11th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2023) will be held on 6-10 November at TU Delft.
Science editor Jos Wassink
