How can European schools innovate under the AI Act?

by Finn Patraic


Artificial intelligence is already being used in classrooms around the world. But how should schools adopt it under the EU Artificial Intelligence Act?

A recent conference on AI in European schools brought together a group of education leaders to answer this question. Their discussions spoke to educators everywhere who are navigating AI adoption. How to lead. How to adapt. How to prepare schools for a future that will look very different from today.

The EU Artificial Intelligence Act

As of June 2025, the Act's first provisions were already in force, with most obligations applying from August 2026. Its objective is to ensure that AI systems are safe, fair and transparent.

The Act classifies AI systems by risk. Tools that pose an unacceptable threat to rights or safety are banned outright. High-risk systems must meet strict requirements around transparency, data governance, human oversight and security. Even systems deemed lower risk must comply with new transparency rules.

The stakes are high. Non-compliance can cost companies up to 35 million euros or 7% of global turnover. The rules apply to any provider or deployer whose system reaches EU users, no matter where the company is based.

The law has sparked debate, especially in the United States, where companies and policymakers have raised concerns about competitiveness, innovation and data transparency. Some speak of regulatory overreach.

So how should EU schools respond? How can they move forward in a way that meets this new standard while keeping students and learning at the center?

That is exactly what the conference set out to explore.

Start with urgency

As the event's facilitator, I opened with urgency. I work strategically with schools worldwide, and with some governments, on AI adoption. I began by outlining current global developments in AI and stressing the innovative mindset education must adopt to move forward with hope.

I asked the educators in the audience to lead with purpose, but I also acknowledged that change is emotional. It is tiring. And not everyone feels ready. Leaders cannot ignore this. They must support their teams with empathy, not just plans. If we can build trust, we can build momentum.

This is not just about adding technology to what we already do, or leaping recklessly into new trends.

Unpacking the EU AI Act in education

Matthew Wemyss, author of AI in Education: An EU AI Act Guide and a leading voice on the AI Act in schools, gave a practical introduction to how schools can get started with the Act. He walked educators through what they need to understand, and do, to begin the path to compliance.

The law does not treat all AI the same way. Some AI tools carry only minimal or limited risk. AI systems used to determine students' access to educational programs, evaluate learning outcomes, assess appropriate education levels, or monitor student behavior during exams are listed as high-risk and carry stricter rules.

Wemyss was clear: compliance is not optional. But the Act is not just about avoiding fines. It is a framework for supporting the responsible, transparent use of AI in education.

He framed his message around three key actions: assess, review and comply. Schools must start by auditing what is already in use. That means knowing which tools are used, what they actually do, and who is responsible for them. This includes not only formal platforms but also tools with built-in AI features that staff use informally.
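To make the audit step concrete, here is a minimal sketch of what such an inventory might look like in code. The risk tiers come from the Act; the tool names, fields and `needs_review` rule are illustrative assumptions, not anything Wemyss prescribed.

```python
from dataclasses import dataclass

# The four risk tiers named in the EU AI Act.
RISK_TIERS = ("unacceptable", "high", "limited", "minimal")

@dataclass
class AiTool:
    name: str
    purpose: str             # what the tool actually does
    owner: str               # who in the school is responsible for it
    risk_tier: str           # one of RISK_TIERS
    formally_approved: bool  # False for tools staff adopted informally

def needs_review(inventory: list[AiTool]) -> list[AiTool]:
    """Flag high-risk or informally adopted tools for compliance review."""
    return [t for t in inventory if t.risk_tier == "high" or not t.formally_approved]

# Hypothetical audit entries for illustration only.
tools = [
    AiTool("EssayGrader", "assesses learning outcomes", "Head of English", "high", True),
    AiTool("SlideBot", "drafts lesson slides", "unknown", "minimal", False),
]

print([t.name for t in needs_review(tools)])  # → ['EssayGrader', 'SlideBot']
```

The point of even a simple record like this is that it forces the school to answer the questions Wemyss poses: what is in use, what it does, and who owns it.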

From there, Wemyss encouraged schools to review how these tools are used. Are the decisions fair? Are the outputs explainable? Is human judgment involved? Schools should not take vendor claims at face value. If an AI tool affects students' learning or access, leaders must understand how it works. If a provider falls short, the school carries the compliance risk as the deployer.

Compliance, he explained, is not a checklist. It means building systems that are ethical, safe and appropriate to each school's context. What is necessary in one setting may not apply in another. Even when using third-party systems, schools remain responsible as deployers. “Ask difficult questions,” he said. “Get the clear documentation you need on compliance measures.”

He also urged schools to appoint someone to lead AI governance. Not just someone technical, but someone who understands the ethical dimension and can translate regulation into daily practice.

Wemyss's closing message was practical: start now, but start smart. “You don't need to resolve everything,” he said. “But you should know what you are working with.” Schools should aim for compliance by August 2026. Leaving it too late risks rushed decisions and missed problems.

Strategy over hype

Next, education author and consultant Philippa Wraithmell took the conversation in a new direction. She has worked with schools from Dubai to Dublin, helping them use digital tools with purpose. Her big message: don't confuse activity with strategy.

AI is not useful simply because it exists. It is useful when it is tied to a purpose. Wraithmell showed how some schools are getting this right. They don't use AI just to speed up grading. They use it to personalize support. To build better lesson plans. To give teachers insight into how students learn.

But none of this happens by accident. It takes planning. It takes training. It takes trust. Wraithmell stressed that trust must start with teachers. If they don't feel confident, the technology won't stick. That is why she recommends starting small. Pilots. Training. Time to reflect. And always, space for teachers and students to build together.

One of the most practical tips she shared was a simple decision matrix. For each AI idea, schools should ask: does this support the learning objectives? Is the data safe? Do teachers feel confident using it? If it doesn't tick all three boxes, they wait.
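The matrix is simple enough to write down as a rule. The three questions are Wraithmell's; the function below is an illustrative encoding of them, assuming a strict all-or-nothing reading where any unticked box means wait.

```python
def decide_ai_idea(supports_learning_goals: bool,
                   data_is_safe: bool,
                   teachers_feel_confident: bool) -> str:
    """Apply the three-question decision matrix: 'go' only if all boxes are ticked."""
    if supports_learning_goals and data_is_safe and teachers_feel_confident:
        return "go"
    return "wait"

# Two of three boxes ticked is still a wait.
print(decide_ai_idea(True, True, False))  # → wait
```

The value of the rule is less in the code than in the discipline: no single yes is enough on its own.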

Her strongest point came near the end. “If your AI strategy does not include the entire school community,” she said, “then it's not really a strategy.”

Informed governance

Al Kingsley MBE spoke last. He has worked in education leadership for decades, in both technology and schools, and is a prolific author. His emphasis was different. He talked about governance: the part of school leadership that too often stays in the background.

Kingsley explained that schools need more than leadership. They need structures that support good decisions. Who approves new tools? Who monitors their impact? Who keeps policies up to date?

He presented a maturity model that governing boards can use to assess their readiness. Are they passive? Reactive? Strategic? Most sit somewhere in the middle. Kingsley challenged them to move further along. He reminded everyone that if the people making decisions don't understand AI, they will eventually let someone else decide for them.

He pushed for continuous training. Leaders and governors need time and space to learn. Otherwise, the school moves forward with a digital blind spot.

He also underlined the need to bring parents into the conversation. Families want reassurance. They want to know how AI is used. And why. Kingsley said schools should be ready to explain both. Not with jargon, but with clarity. With examples. With honesty.

Mindset over tools

What tied the sessions together was not a single answer. It was a mindset. AI is here. But whether it becomes a tool for change or a source of confusion depends on how schools respond.

This is a moment for education to ask better questions. What do our students need? What do our teachers need? What do we want learning to feel like?

The schools that thrive will not be the ones that move fastest. They will be the ones that move with intention.

That means defining goals before downloading tools. It means listening to teachers before writing policies. And it means being honest about what works and what doesn't.

So what now?

Use what you have. Learn what you don't know. Invite your whole community.

And act as if the future depends on it.

Because it does.
