AI Literacy: Navigating the EU AI Act
Last week, two provisions of the EU AI Act (the “Act”) took effect. These provisions ban specific harmful AI applications and introduce a critical obligation: companies that provide or deploy AI systems must ensure “AI literacy” among their staff and other persons dealing with the operation and use of those systems. This article focuses on the latter requirement.
Territorial Application of the EU AI Act
You may be thinking, “Wait. Does the EU AI Act even apply to my company?”
Good question. Like the GDPR, the Act asserts significant extraterritorial jurisdiction. Regardless of a company’s location, the Act applies if a company is developing, selling, importing, manufacturing, distributing, or deploying AI that touches EU residents.
And that list is not exhaustive — the Act addresses many circumstances beyond the examples above.
In sum, assuming the Act does not apply to you as a non-EU company is a big mistake that could lead to steep regulatory penalties, reputational damage, and increased legal risk. Look closely at your AI systems before making a decision.
AI Literacy Requirement
So what’s this AI literacy requirement about?
“AI literacy” refers to the knowledge and understanding required to effectively use, interact with, and critically evaluate AI systems.
The AI literacy requirement aims to enable responsible AI deployment and usage within organizations. Although there are no direct fines for non-compliance, a failure to create sufficient AI literacy may increase penalties for other violations.
More importantly for your AI program, a lack of AI literacy will increase the likelihood of violations occurring. Untrained employees may inadvertently misuse AI, leading to unintended harms or non-compliance with broader governance policies.
Building AI Literacy in a Corporate Environment
Let's break it down now. What do you actually need to do?
The Act does not prescribe any format or method for creating AI literacy, but making a good-faith effort will likely go a long way with regulators. The easiest way to do this is to follow best practices already established in other compliance areas.
Start with these two prerequisites to AI governance:
1. Build an AI Map: Similar to a personal data map, you need to map where AI is being used in your company and by whom. Talk to your IT teams, survey your employees, and review a list of the applications you are paying for. Discover what’s in your tech stack to build a program that fits your organization.
2. Create an AI Policy: A key part of every AI literacy program should be educating your teams on your company’s guidelines and rules for using AI – just like your privacy policy is a key part of your privacy training.
Once you have these tools in hand, you can begin building AI literacy across your organization — starting with baseline training for all employees and adding role-specific guidance where AI use carries greater risk.
I know this sounds like a lot. AI has the potential to make us more productive, but achieving that promise will require investing in people and processes. Take it one step at a time, and you will be amazed at what you can accomplish in a few months.
The Wrap-up
For companies doing business in the EU, AI literacy is now a compliance requirement under the Act. But beyond compliance, AI literacy is essential for effective AI governance. Employees must understand what they can and cannot do with AI to avoid exposing the company to legal, ethical, and operational risks. A well-trained workforce is a safeguard against AI-related liabilities and an enabler of responsible and effective AI adoption.