Ethical adoption of AI is not a priority for new users, and that’s a problem
- Only 1.8% of the 45,000 people analyzed were concerned about the ethics of AI
- Consumers believe that organizations should be held accountable
- The EU AI Act carries fines of up to 35 million euros
According to Pluralsight, of the 45,000 people who wanted to learn more about artificial intelligence, only 1.8% were actively seeking to learn how to apply AI responsibly.
The survey found increased interest in generative AI, machine learning and AI for cybersecurity. However, Pluralsight Chief Content Officer Chris Herbert said the platform saw no significant interest in ethical AI.
Herbert added: “It is crucial that students understand the risks and pitfalls of AI so that they can apply it in an ethical way.”
We are not interested in ethical AI
The report highlights Google DeepMind research that shows how AI can be misused, manipulated and exploited. Herbert said we should focus on “mitigating its risks and negative consequences, while maximizing its positive outcomes.”
Chief content strategist Adam Ipsen also noted that Accenture research shows more than three-quarters (77%) of global consumers believe organizations should be held accountable for AI misuse, highlighting the need for greater awareness.
The reality is that four in five executives, and almost as many IT professionals (72%), say their organizations often invest in new technology without considering employee training. In a similar vein, only 12% of executives report significant experience working with AI.
Failing to adopt AI ethically is also expected to carry a financial cost, with the EU AI Act coming into force in August 2024 and enforcement ramping up over the coming years. Maximum fines reach 35 million euros or 7% of global annual turnover, whichever is higher.
Looking ahead, Ipsen urges companies not to treat AI as a ‘one and done’ project, but as one that requires continuous training. Those who take the time to learn will realize AI’s real benefits, rather than watching it become a burden that leaves them facing regulatory hurdles.