There’s a really good reason for your business not to use emotional analysis software
Experts at the UK’s Information Commissioner’s Office (ICO) have warned against using AI-powered emotional analysis systems, fearing that the drawbacks outweigh the benefits, at least for now.
Wearable screening tools for health monitoring, and behavioral monitoring of things such as posture, speech, and eye and head movements, are given as examples where the data collected could be considered risky.
The ICO says this type of data, which can include records of subconscious behavior, may be personally identifiable, posing particular challenges for how companies store, process, and use it.
Biometric data security
“While there are opportunities, the risks are currently greater,” said ICO Deputy Commissioner Stephen Bonner. “At the ICO, we are concerned that incorrect analysis of data could lead to assumptions and judgments about an individual that are inaccurate and lead to discrimination.”
Then there is the fact that these systems “may not work yet, or indeed ever,” says Bonner. Reading another person’s emotions is hard enough for humans; distilling that information into data a computer can use to categorize its subjects is harder still.
For now, the ICO wants companies to use only technology that is “fully functional, accountable and backed by science.” The organization also says it has “yet to see” emotional AI technology that meets data protection requirements, and it raises further concerns about proportionality, fairness, and transparency.
By spring 2023, the ICO hopes to have published guidance on biometrics to help organizations understand the importance of securing this data as the technology becomes increasingly common in the financial, fitness, and education industries, and even in immersive entertainment.
In the meantime, Bonner says the ICO will “continue to research the market, identify stakeholders who want to create or implement these technologies, and explain the importance of improved data privacy and compliance, while building trust and confidence in how these systems work.”