Schools Went After Cellphones. Now It’s Time to Ban Generative AI.
Sorelle Friedler, Nicol Turner Lee / Nov 11, 2025
"Datafication" by Kathryn Conrad (Better Images of AI / CC 4.0)
Students are back in school in the United States, but in many cases their cellphones have not gone with them. Banning the devices has been a top priority for many US school districts this year amid mounting research finding that too much screen time harms learning.
But where educators should focus their attention next is clear: the use of generative artificial intelligence products in schools. A White House executive order is encouraging tech companies to entrench AI use and adoption in K-12 schools, creating a frenzy of AI companies seeking to convince school districts to adopt AI tutors and other instructional tools.
It took years for researchers, parents and teachers to realize the risks of cellphones, and we shouldn’t wait to learn the impacts of AI on student learning to act. This time, schools should immediately ban students from using generative AI tools for essay writing, homework or studying help, coding projects and other activities until we know how they impact student learning and have sufficient safeguards to protect children’s wellbeing, privacy and opportunity to learn.
The one exception to a ban on AI use in schools should be for classes where students specifically learn about AI. Whether through computer science courses building such systems or in humanities and social science classes where students can critique and understand AI’s impact on society, students can and should learn when AI works, when it doesn't and how to tell the difference. For these lessons to be effective, teachers from pre-kindergarten to professors in higher education also need professional development on the topic.
To many readers, completely banning AI use may sound like an extreme position. Employees are already using the technology in workplaces across the economy, and some companies are even requiring the use of the tools. But in addition to readying students for the workforce, schools need to teach students how to learn, rather than having AI replace this function.
That kind of learning is different from having a machine dictate a set of possible results. Critical thinking, writing ability, problem solving and other competencies are gained through the process of writing essays, solving math problems and evaluating various scenarios, not just by reaching the correct outcome.
Will all generative AI use hurt student learning, or are there ways to use it that will help? We don’t know the answer to this question yet, and education is too important for us to wait for the research to be more conclusive. In the meantime, we must protect students’ right to learn.
Students from elementary school to higher education must learn to think for themselves and be prepared to sort through the inaccuracies, manipulation, hate speech and other inevitable outputs of large language models and other generative AI systems. These models do not necessarily respond to queries based on textbooks or vetted knowledge; they remix all the text available on the web.
If we truly believe that AI-driven jobs are the future, then students should be prepared to assess problems and learn from mistakes — to sort convincingly generated fiction from fact, to write more compellingly and create genuinely new ideas. After all, if an AI system can do the job, then our students won't be needed.
Evidence of these concerns has already emerged in higher education.
Computer science professors, including co-author Friedler, have already seen the problems of AI overuse in their own courses. In a junior-level course that teaches students to build modern AI systems, overreliance on AI coding tools in earlier introductory courses has left some students staring at blank computer screens, unsure how even to begin their assignments. Computer science educators have started to call this unfortunately common phenomenon the "junior-year wall": students who were supposed to learn how to structure their thoughts and turn them into working computer programs in their introductory courses sailed through with the assistance of AI and are now stumped by junior-level assignments where the tools aren't advanced enough to help.
AI companies have been eager for a big payday for their investors and are heavily pushing for use of their tools in K-12 and college classrooms. Yet these initiatives undermine the companies' own future workforce by threatening the core goals of learning.
However, students must be seen as more than just future workers. Education is a chance for them to develop their own thoughts and determine what they think about the world and what they believe. Learning environments must help students gain those skills. Otherwise, without a citizenry able to think freely and critically, the current global democratic backsliding will only worsen.
Another issue is equity between students from different socioeconomic backgrounds. AI companies, funders and others are building a narrative that claims "equity" is about ensuring working-class students have access to these systems, but that misses the mark.
Educational researchers have long found that kids from working-class backgrounds largely receive an education focused on preparing them to be factory workers, while kids from upper-class backgrounds learn to become managers and company leaders. These class differences have already played out with technology: upper-class Silicon Valley schools restricted cellphones to reduce the distractions of screen time long before the more recent wave of bans.
Equity should be about more than access to these tools; it should ensure that less wealthy students are equally protected from the potential misuse of technology, including generative AI.
Instilling new educational norms in the face of disruptive technology is hard, but many school districts are starting to see results from cellphone bans, including rising grades. Generative AI use should also be curtailed until future research shows whether it can support student learning.
In the meantime, schools need community norms that make clear that students’ own ideas — and not those generated by AI — hold the highest value. Future innovators, poets, politicians, scientific researchers and factory and service workers all deserve the chance to develop their own voice and thoughts in the absence of tools that seek to tell them what to think or do.