Strategies to Disarm Fears Around AI Implementation

by Finn Patraic


More than two and a half years after the launch of ChatGPT, many school districts have moved past the discovery phase of generative artificial intelligence, learning what it is and what it can do, and their implementation phase is in full swing. For those seeking to demystify the technology and win the buy-in of teachers and parents along the way, school and technology leaders at the Allen-Stevenson School in New York have some advice: emphasize professional development, participation and communication.

Co-leading a session Monday at the conference of the International Society for Technology in Education (ISTE) in San Antonio, the school's upper division technology integrator, Sam Carcamo, opened with a summary of the obstacles people in his position face. What keeps AI scary for teachers and parents, he said, are fears about its apparent potential for bias, hallucinations, eroding critical thinking, devaluing expertise, dehumanization and cheating, cheating, cheating.

His colleague Sarah Kresberg, director of library services and educational technology, said that even at a well-funded private boys' school like Allen-Stevenson, teachers didn't know what to do with ChatGPT at first. Hoping to identify the early adopters, they launched an AI club for teachers and administrators, which eventually became an AI council. For the 2025 school year, members organized trials of seven different AI tools and finally, "after much hand-wringing," Kresberg said, settled on SchoolAI for two main reasons: they wanted something K-8 students could interact with directly in an age-appropriate way, and they liked SchoolAI's privacy policy.


School technology integrator Ainsley Messina said she then developed an eight- to 10-hour professional development course.

"Our goal was to establish a shared vocabulary among our teachers, regardless of their level of comfort with AI, whether they had already used it or not," she said. "Throughout this PD course, we talked about AI literacy, we talked about AI ethics, we talked about the biases that exist in AI, and we asked: 'What should some of our concerns as educators be as AI comes our way?'"

Carcamo said the training involved teachers filling out forms about what interested them, for future reference: grading, unit planning, lesson planning or gamification. In the fall, Kresberg said, they gathered for roundtable discussions by subject. Each teacher had done a deep dive into a particular topic, so participants met to discuss their thoughts and findings, which were then shared with the wider group.

Carcamo, Kresberg and Messina said what emerged from these discussions was a rough outline of what to do, or what they did, with AI at each grade level:

  • Kindergarten through first grade: Teach students to recognize the difference between artificial and natural creations, and to understand that AI is when people make machines act intelligently.
  • Second grade: Start introducing the concept of generative AI and using SchoolAI.
  • Third grade: Have students chat with historical figures about their greatest achievements and the challenges they overcame, then gather information to write a three-paragraph essay on their historical figure.
  • Fourth grade: Go deeper into AI literacy. Use Common Sense Media resources to talk about bias in AI, how it can impact lives, how AI is trained and how it works. Use Google's Teachable Machine, which lets users upload data to train an AI to perform certain tasks.
  • Fifth grade: Using a giant teacher-designed mega-prompt, students were able to ask questions of a version of Marcus Aurelius. They were then tasked with using Canva to create comic strips based on historical stories.
  • Sixth grade: Have students talk with an AI chatbot about the rocks and minerals assigned to them, then use Adobe Express and its image generator to create geology trading cards. As other examples, sixth grade English students were asked to describe settings, then feed those descriptions to an image generator to see what it would produce. If students didn't like what they saw, they refined their descriptions to bring the AI's output closer to what they had envisioned. Sixth grade Spanish students were asked to write a story about an angel and a devil trying to convince a character to be mean or kind, then use Adobe Express to make an animated storybook, record their own audio and sync it with an animated mouth.
  • Seventh grade: Use ChatGPT for detailed feedback on essays, and use Newsela for comments on pre-assessment assignments. They started allowing students to use Newsela to get feedback on each essay before turning it in. According to Kresberg, most students used it and said they found its labeling of paragraph parts very useful, but she stressed that students needed a lot of practice.

Kresberg said the school started out by talking to parents, and it eventually launched a series of five parent engagement meetings throughout the year called "Tech Tuesdays," about an hour each. In the fall, the sessions covered how the school was using AI, and in the spring, they covered how parents could use AI at home to help their children learn and build executive function skills.

For technology integrators or anyone working with teachers, Messina recommended AI for Education's six-week AI Literacy Trainer course, as well as the Women of AI and Education community on Slack.

Carcamo said that as staff members began working on projects, it started to pique the interest of their colleagues.

The school, Kresberg said, has not yet made AI use mandatory, but that hasn't been a problem in their case.

"Obviously, not everyone is using a lot of AI right now. We don't require anyone to use it. We encourage and make it easy for people to use it. I'm not sure we're at the point of saying people have to use it for something right now, but we don't have real opponents either," she said. "I know in some schools there are people who make things difficult for those who want to use it, and fortunately, we don't have that. If people are not on board, they are very quiet about it."

Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor's degree in physiology from Michigan State University and lives in Northern California.
