Educators have the tools, but not the training or the ethical framework to use AI in education judiciously. And that's a problem

by Finn Patraic


The Trump administration wants to bring artificial intelligence into K-12 classrooms. At first glance, that is not a terrible idea. Used well, AI can be a patient tutor. It does not get frustrated. It does not lose focus. It does not roll its eyes, check the clock or give up.

AI could help personalize learning, diagnose learning disabilities, ease administrative burdens and free teachers to spend more time doing what only humans can do: connect, mentor, care. But these outcomes are not automatic. They depend on thoughtful design, clear oversight and shared values.

And yet, the federal government still does not allow much of its own workforce to use generative AI tools, citing both security concerns and the lack of clear policy. These are trained adults: scientists, analysts, engineers. Yet many are still waiting for guidance on how (or whether) they can use the very tools we now propose to put in front of third graders. We should take the time to get this right, aligning educators and technology experts around what matters most: student outcomes.

We cannot just bolt AI onto the current system. We must rethink our educational values: not just efficiency and test scores, but also the ethical use of technology and human connection. A shift of that kind must be designed deliberately, in the open, and with the people who know students best: teachers.


The truth is that AI is already in the classroom. More than half of U.S. K-12 teachers used AI tools in 2025, double the figure from the previous year. Yet a recent Pew survey found that one in four teachers believes AI in education does more harm than good. Another third say the impact is mixed.

The risks are real: biased algorithms, privacy violations, overreliance on automation. But so are the possibilities. Done thoughtfully, AI could restore something our schools desperately need: time. Time for students to go deeper. Time for teachers to be present and guide students' thinking. Time to spark curiosity. Time to build trust. Time to learn to be more human, not less.

But to reap these benefits, we must not let AI become the next Google: adopted first, questioned later, if ever. We must build an ethical framework alongside the tools. We must pilot, evaluate and revise before deploying at scale. And we need to create space for teachers, parents and students to shape these decisions, not just companies and politicians.

This is a moment for humility, not hype. The question is not whether AI belongs in the classroom. It is whether we are ready to make it serve the people in the classroom. If we let AI reshape education without purpose or care, companies will keep building the algorithms. And when they fail, our students will feel the cost.

That is disruption paired with neglect, and it leaves our teachers, not tech companies, to bear the consequences.

We have been here before. Just ask Google.

Over the past decade, the country's schools have quietly adopted Google's suite of tools. Google Docs, Gmail, YouTube: these products now form the digital backbone of American classrooms. During the pandemic, their adoption accelerated. In the United States, more than 30 million students used Google's education apps in 2017. Globally, that number has since climbed to 150 million students, teachers and administrators, according to the company itself. In many districts, Chromebooks, which run Google's operating system, are standard issue.

Related: Kids who use ChatGPT as a study assistant do worse on tests

But this embrace came with few questions asked: Who owns the data? What is tracked? Who benefits? We did not stop to ask the hard questions about letting one big technology company mediate so much of the learning experience, and now we are scrambling to catch up.

We would be wise to learn from that experience. If we fail to build guardrails for AI now, we risk an education landscape flattened by AI, one that turns schools into test labs for corporate algorithms rather than communities of human growth. Done well, AI would be designed with and for teachers, not as a shortcut around them. It would focus on tasks that free teachers to do what only humans can do.

Imagine a chatbot that gives a student real-time feedback as they draft an essay, flagging confusing sentences so that their ideas develop faster and their confidence grows, without waiting days for corrections. Or a testing platform that does not just mark wrong answers but explains why they are wrong, helping students learn from their mistakes while the memory is fresh.

In both cases, AI does not replace a teacher's work; it strengthens it, turning feedback loops into learning loops.

Take the calculator. When it entered classrooms, many feared it would destroy basic math skills. Today, we let students use calculators, even on the SAT, but with clear standards. We treat them as assistants, not replacements.

AI poses a bigger challenge than calculators ever did, but the lesson holds: the right design, the right rules and the right purpose can make any new technology a tool for deeper learning.

We remember the teachers who challenged us, who believed in us, not the calculators they taught us to use. If we get this right, AI will stay in the background and let the human moments shine.

Michael Goergen is a writer and policy professional focused on sustainability, technology and ethics. He has worked in science and in government and believes the future of learning depends on remembering what makes us human.

Contact the opinion editor at Opinion@hechingerreport.org.

This story about AI in education was produced by The Hechinger Report, an independent nonprofit news organization focused on inequality and innovation in education. Sign up for Hechinger's weekly newsletter.

