
As students adopt AI, critical thinking matters more than ever

Two weeks into our 11th-grade American Revolution project, I walked over to Sofia’s group and sighed.

The students were deep into a research rabbit hole about a yellow fever outbreak. The goal was to examine public health during the American Revolution and connect it to modern issues, such as COVID-19. I expected that Sofia and her peers would eventually land on the smallpox outbreak that plagued Washington’s army. Instead, ChatGPT had suggested studying yellow fever, which broke out decades later, well after the period we were studying. I warned them that I didn’t think artificial intelligence could perform the complex historical reasoning required for this project.

Sofia and her group were confident it could and assured me that they would not rely on it alone for their project. But after days of following AI down that path, they hit a wall. Their argument didn’t fit the time period, the themes didn’t align, and the necessary connections were simply not there. They couldn’t salvage it. They had to start over.

Watching them do this reinforced for me why critical thinking in the age of AI matters more than ever. AI can generate information instantly, but it can’t tell my students whether that information is meaningful, relevant or true.

According to recent national data, 86% of students used AI during the 2024-25 school year, and I am sure that number is increasing. AI isn’t the wave of the future; it’s already here. The question isn’t whether my students will use AI; it’s whether I can teach Sofia and the other students in my classes the critical thinking skills they will need to work with it effectively. Without intentional guidance, my students will be confident in the tool, but not in their own thinking.

Kelly Torres and her students at Fenton High School examine issues with the aid of artificial intelligence tools.

I use AI almost every day as a teacher. It helps me draft rubrics, brainstorm lesson ideas and discover texts I might not have otherwise chosen. But it works for me because I already know how to think critically. I know when an answer makes sense and, more importantly, when it doesn’t. My students don’t have these skills yet, and AI makes it easier for them to skip the thinking they need to learn them. To make sure students like Sofia develop the critical thinking skills they need to thrive and to become truly AI-literate for the future, teachers like me need support.

Teachers want to navigate this new technology in ways that support learning rather than replace it, but we cannot do that without direction. My colleagues and I need clear, practical guidelines that help us teach AI literacy without losing sight of the critical thinking skills our students still need.

Much of what I know about AI has come from experimenting on my own. I am fortunate to work in a school that gives me the space to do so, but many teachers do not have that flexibility. Even in my district, there has been no structured guidance or training on how to use AI responsibly and effectively.

Teachers like me should have the freedom to explore AI alongside our students, supported with structured time and meaningful professional development so we can learn how to effectively use AI as a teaching tool, rather than relying on trial and error that risks compromising student learning. This support is urgently needed.

Students also need their own guidance. AI evolves faster than any teacher can keep up with, and right now, most student use happens without direction or reflection. As Sofia’s group showed, many students barely pause to consider whether AI can even get them to their desired outcome.

A statewide, student-centered framework created with teacher input could help students learn what responsible AI use looks like: when to trust it, when to question it, and when to walk away from it.

Last year, my colleagues in the Teach Plus Policy Fellowship helped secure passage of a bill establishing the first statewide guidance for AI in schools. We are now working on integrating student voice into the framework for schools and ensuring that this framework is both teacher-friendly and student-facing. If we don’t get this right when AI technology is still new, it will only get harder later, just as it did with cell phones. Without clear expectations and support, we risk losing the most important aspect of learning: the thinking itself.

When Sofia’s group finally started over, they approached the project differently. This time, they questioned their sources, compared perspectives and thought through their choices. Their new project wasn’t perfect, but it was theirs.

If Illinois gets AI right, teachers and students will be able to use this new technology to deepen learning, not dilute it. Together, we can build classrooms where technology sparks curiosity instead of stifling it, helping students like Sofia learn not just how to prompt a tool, but how to think deeply and question the world around them.

Dr. Kelly Torres teaches regular and Advanced Placement U.S. history at Fenton High School in Bensenville. She is a 2025-2026 Teach Plus Illinois Policy Fellow.