A new, deliberately simple browser game lays bare a key limitation of Artificial Intelligence (AI). AI-powered software can identify human emotions, but its accuracy still needs a lot of work, and the game sets out to expose exactly those limits.
The Emojify Project shows that AI still has a long way to go before it can correctly identify emotions in complicated human beings:
The game asks players to look into their computer’s webcam and try to produce six simple, basic emotions: happiness, sadness, fear, surprise, disgust, and anger.
— David Hopkins (@hopkinsdavid) April 6, 2021
As players engage with the platform, they will quickly realize how easy it is to trick the system. Simply put, players can put up a fake smile to make the AI in the game “think” they are happy.
The Emojify Project’s game shows how computers attempt to read emotions through visual inputs — in this case, the webcam feed.
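To illustrate why such systems are so easy to game, here is a minimal, hypothetical sketch — not the Emojify Project’s actual code — of the kind of geometry-based heuristic an expression classifier might rest on: label a face “happy” whenever the mouth corners sit above the mouth center. A deliberately posed smile produces exactly the same landmark geometry as a genuine one, so a classifier of this sort cannot tell them apart.

```python
# Hypothetical sketch of a naive landmark-based expression classifier.
# Landmarks are (x, y) pixel coordinates; y grows downward, as in images.

def classify_expression(left_corner, right_corner, mouth_center):
    """Label an expression from three mouth landmarks.

    A smile curves the mouth corners upward (smaller y than the center);
    a frown curves them downward. The classifier sees only geometry,
    never the feeling behind it.
    """
    avg_corner_y = (left_corner[1] + right_corner[1]) / 2
    if avg_corner_y < mouth_center[1]:
        return "happy"
    if avg_corner_y > mouth_center[1]:
        return "sad"
    return "neutral"

# A genuine smile and a deliberately faked one yield identical landmarks,
# so both come back "happy" -- the system reads the face, not the feeling.
print(classify_expression((40, 98), (60, 98), (50, 102)))  # happy
```

Real systems use far richer features, but the underlying assumption — that facial movements map predictably to inner states — is the same one the game is designed to undermine.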
@LeverhulmeCFI research fellow @anthroptimist talks about her citizen science research https://t.co/Uje1JRDHE8 exploring perspectives on the ethics & impacts of ERT #AIEthics @nesta_uk @Emojify_Project @LeverhulmeTrust https://t.co/nX6Z7Mxn52
— Future Intelligence (@LeverhulmeCFI) April 6, 2021
Needless to say, the software’s readings are far from accurate. The AI has reportedly interpreted even exaggerated expressions as “neutral”. Speaking about the project, creator Alexa Hagerty, a researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk, said the project was meant “to demonstrate that the basic premise underlying much emotion recognition tech, that facial movements are intrinsically linked to changes in feeling, is flawed”.
“The premise of these technologies is that our faces and inner feelings are correlated in a very predictable way. If I smile, I’m happy. If I frown, I’m angry. But the APA did this big review of the evidence in 2019, and they found that people’s emotional space cannot be readily inferred from their facial movements.”
Can AI differentiate between a wink and a blink?
In the movie I, Robot, Will Smith’s character winks at the humanoid robot Sonny to convey his true intentions. In the real world, however, AI lags far behind on even basic interpretation of facial cues.
There’s a second mini-game available on the website that demonstrates this limitation quite well. “You can close your eyes, and it can be an involuntary action or it’s a meaningful gesture,” says Hagerty.
Besides the limitations of AI, these games have clearly revealed yet another complexity of humans that algorithms still cannot fathom. There’s a huge disconnect between the emotions people experience internally and the face they show to the world.
Humans experience emotions differently, and oftentimes strangely. The Emojify Project highlights how deeply flawed it is to claim that how we feel in a given moment can be distilled into a simple set of emojis.
Despite the obvious limitations, AI-based facial recognition and emotion readouts are becoming increasingly common. From governments to private organizations, many are actively investing in a technology that still needs a lot of work to accurately read humans.
Hagerty warns: “The dangers are multiple. With human miscommunication, we have many options for correcting that. But once you’re automating something or the reading is done without your knowledge or consent, those options are gone.”