AI Literacy: Navigating the EU AI Act

Last week two provisions of the EU AI Act (the “Act”) took effect. These provisions ban specific harmful AI applications and introduce a critical obligation: Companies that provide or deploy AI systems must ensure “AI literacy” among their staff and other persons dealing with the operation and use of the AI systems. This article focuses on the latter requirement.

Territorial Application of the EU AI Act

You may be thinking, “Wait. Does the EU AI Act even apply to my company?”

Good question. Like the GDPR, the Act asserts significant extraterritorial jurisdiction. Regardless of location, the Act applies to any company developing, selling, importing, manufacturing, distributing, or deploying AI that touches EU residents.

Here are a couple of examples of the Act’s breadth:

  • Any company that makes AI systems available to employees in the EU for professional use is “deploying” AI in the EU.
  • Any company that makes AI systems available to EU residents is “distributing” AI, even if the system is free.

The Act addresses many circumstances beyond these examples.

In sum, assuming the Act does not apply to you as a non-EU company is a big mistake that could lead to steep regulatory penalties, reputational damage, and increased legal risk. Look closely at your AI systems before making a decision.

AI Literacy Requirement

So what’s this AI literacy requirement about?

“AI literacy” refers to the knowledge and understanding required to effectively use, interact with, and critically evaluate AI systems.

The AI literacy requirement aims to enable responsible AI deployment and usage within organizations. Although there are no direct fines for non-compliance, a failure to create sufficient AI literacy may increase penalties for other violations.  

More importantly for your AI program, a lack of AI literacy will increase the likelihood of violations occurring. Untrained employees may inadvertently misuse AI, leading to unintended harms or non-compliance with broader governance policies.

Building AI Literacy in a Corporate Environment

Let's break it down now. What do you actually need to do?

The Act does not prescribe any format or method for creating AI literacy, but making a good faith effort will likely go a long way with regulators. The easiest way to do this is to follow best practices already established in other compliance areas.

Start with these two prerequisites to AI governance:

1. Build an AI Map: Similar to a personal data map, you need to map where AI is being used in your company and by whom. Talk to your IT teams, survey your employees, and review a list of the applications you are paying for. Discover what’s in your tech stack to build a program that fits your organization.

2. Create an AI Policy: A key part of every AI literacy program should be educating your teams on your company’s guidelines and rules for using AI – just like your privacy policy is a key part of your privacy training.

Once you have these tools in hand, here is how I would approach increasing AI literacy within an organization:

  • Baseline Training: AI usage is spreading quickly. In most industries, it’s reasonable to assume that 95%+ of your employees will be using an AI system within the next 12 months (if they are not already). Add baseline AI literacy training to your mandatory training suite. Compliance vendors offer training like this that you can plug into your learning management system. Remember that you will likely want to make some customizations to ensure that employees know your specific policies.
  • Specialized Training: Besides general AI systems, groups of employees are likely to use specialized systems. These employees will need specific knowledge about operating the systems safely, using the output, and managing related risks. The AI system vendor may have a training you could use, or you may have to create one internally. This can’t be a one-off. New employees will need to be onboarded as the use of the system expands.
  • Deep Training: A smaller subset of employees will be responsible for AI decisions, rolling out AI technology, and the daily operation of AI systems. These employees will need more extensive training. Consider sending key leaders out for external training or bringing in an expert for a targeted internal group training. As AI competencies increase within your company, you will be able to manage more of this process internally.
  • Communications: Employees will not fully internalize the need for responsible AI practices from a single training. You will need to build a cultural drumbeat around this and integrate it into your broader activities. Leverage your executives to talk about your AI policies. Include refreshers in your company communications. Consider using "AI Moments" at meetings to reinforce concepts. Get creative.

I know this sounds like a lot. AI has the potential to make us more productive, but achieving the promise of AI will require investing in people and processes. Take it one step at a time and you will be amazed with what you can accomplish in a few months.

The Wrap-up

For companies doing business in the EU, AI literacy is now a compliance requirement under the Act. But beyond compliance, AI literacy is essential for effective AI governance. Employees must understand what they can and cannot do with AI to avoid exposing the company to legal, ethical, and operational risks. A well-trained workforce is a safeguard against AI-related liabilities and an enabler of responsible and effective AI adoption.
