The AI Tools Used in Schools Today
AI innovation is increasingly present in K-12 education, and parents and educators need to understand its benefits and limitations to advocate effectively around its use. What to know now...
This week, we're taking a look at how artificial intelligence is already transforming the classroom—a reality that might surprise many. AI has not only landed in our schools, it’s been getting comfortable there for some time now.
One of the challenges here is how we define AI. Artificial intelligence is far more expansive than any one platform or tool that is “labeled” as an AI technology. Fundamentals such as machine learning, personalization, and automation have been built into educational apps for years and squarely qualify as “AI.”
AI Fundamentals: How to Define Artificial Intelligence
If we can’t speak plainly and clearly, or understand one another, the task of using AI, or regulating its development, will be impossible.
Undoubtedly, AI offers significant benefits in education, but as families we need to educate ourselves about these technologies and their implications. Similarly, teachers and administrators require better resources and training to properly evaluate the AI-powered tools already available to their students.
In This Week’s Edition:
The AI Applications in Schools Now
In some cases, schools may be deliberately introducing AI tools and finding ways to encourage their use to enrich the curriculum. In other cases, as mentioned above, popular apps and platforms such as Google Classroom are already introducing AI features, and some are technically using AI but branding their offerings differently. It’s important to consider every app and tool used in the classroom, regardless of how it’s categorized, because the fundamentals and privacy implications are standalone considerations.
Adaptive Learning is Popular… and is ALSO *AI*
To start, many schools are using adaptive learning platforms that personalize instruction based on student performance. Programs like DreamBox for math and Lexia for reading analyze how students interact with lessons and automatically adjust difficulty levels based on individual progress.
These platforms don't necessarily market themselves explicitly as "AI" products, but they do characterize their offerings as using machine learning, data analysis, and automation—all core components of artificial intelligence technology. DreamBox highlights its "Intelligent Adaptive Learning" system, while Lexia promotes its ability to automatically adjust to each student's needs, both leveraging AI principles without using the term prominently.
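To make the mechanics a bit more concrete, here is a deliberately simplified Python sketch of the kind of logic an adaptive platform might use to adjust difficulty. To be clear, this is an illustration of the general principle only, not DreamBox's or Lexia's actual algorithm, and every name and threshold in it is hypothetical.

```python
# Hypothetical sketch of adaptive difficulty adjustment.
# This is NOT any vendor's real algorithm -- just the general idea:
# track recent performance and move the lesson difficulty up or down.

from dataclasses import dataclass, field

@dataclass
class StudentProfile:
    difficulty: int = 3                                    # level 1 (easiest) to 5 (hardest)
    recent_results: list = field(default_factory=list)     # True/False for each problem

    def record_answer(self, correct: bool) -> None:
        """Log each attempt -- note how much data even this toy version keeps."""
        self.recent_results.append(correct)
        self.recent_results = self.recent_results[-10:]    # keep the last 10 attempts

    def adjust_difficulty(self) -> int:
        """Raise the level when a student is cruising, lower it when they struggle."""
        if len(self.recent_results) < 5:
            return self.difficulty                         # not enough signal yet
        accuracy = sum(self.recent_results) / len(self.recent_results)
        if accuracy > 0.85 and self.difficulty < 5:
            self.difficulty += 1
        elif accuracy < 0.5 and self.difficulty > 1:
            self.difficulty -= 1
        return self.difficulty

# Example: after a run of correct answers, the next lesson gets harder.
student = StudentProfile()
for correct in [True, True, True, True, True, True]:
    student.record_answer(correct)
print(student.adjust_difficulty())   # -> 4
```

Even this toy version has to log every single answer in order to work, which is exactly why the data-collection questions below matter.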
The World Economic Forum shares some optimistic data points on AI use in the classroom, suggesting that 71% of teachers feel that AI tools are essential for learning.
Specific to adaptive learning, it's easy to appreciate the benefit of personalization. However, these platforms also collect a significant amount of data about a child's learning patterns—from time spent on problems to specific mistake patterns and learning preferences. While this information helps teachers identify strengths and weaknesses, it also raises important privacy concerns.
It's critical that we not just take any vendor at face value. Families must ensure they are comfortable with both a given company's data privacy policies and the enforcement practices of schools, districts, and state education departments.
Administrative AI Applications in Schools
Beyond direct learning, schools are also implementing AI for attendance tracking, grading assistance, and even behavioral monitoring. Some districts use natural language processing to help teachers provide feedback on writing assignments, while others employ predictive analytics to identify students who might need additional support before they fall behind.
The ability to streamline administrative duties and maximize teachers' time could mean more meaningful, in-person instruction in the classroom. But there are also many questions that need to be asked. For instance: Are schools and districts equipped with robust privacy and cybersecurity policies? Do they have the infrastructure required to manage a growing list of vendors and properly address the associated privacy concerns?
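To give a flavor of what “predictive analytics” can mean in practice, here is a minimal, hypothetical sketch of an early-warning score built from attendance and grades. The signals, thresholds, and function name are all invented for illustration; no district or vendor model is this simple, and none that I know of works exactly this way.

```python
# Hypothetical early-warning score -- NOT a real district or vendor model.
# The idea: combine a few signals (attendance, grades, missing work)
# into a single flag so staff can follow up before a student falls behind.

def early_warning_score(attendance_rate: float,
                        grade_average: float,
                        missing_assignments: int) -> float:
    """Return a 0-1 risk score from three illustrative signals.

    attendance_rate:      fraction of days attended (0.0 to 1.0)
    grade_average:        current average on a 0-100 scale
    missing_assignments:  count of assignments not turned in
    """
    score = 0.0
    if attendance_rate < 0.90:        # chronic-absenteeism threshold (assumed)
        score += 0.4
    if grade_average < 70:            # struggling academically (assumed cutoff)
        score += 0.4
    if missing_assignments >= 3:      # pattern of missed work (assumed)
        score += 0.2
    return score

# Example: spotty attendance plus several missing assignments triggers a
# check-in, even though this student's grades are still okay.
risk = early_warning_score(attendance_rate=0.85, grade_average=78, missing_assignments=4)
print(round(risk, 2))                 # -> 0.6
if risk >= 0.5:
    print("Flag for counselor follow-up")
```

Real systems are more statistically sophisticated than this, but the privacy takeaway is the same: every input to a score like this is a record about your child that someone is collecting, storing, and often sharing with a vendor.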
Some highlights of the platforms in use at US schools today:
Attendance management systems that often use facial recognition or QR codes to automatically track student attendance (with facial recognition being, of course, an ethical minefield).
Platforms such as SchoolMint's Hero that use predictive analytics to identify patterns in absenteeism and send automated alerts to administrators.
Automated grading systems are already familiar to most families, but now platforms like Gradescope (by Turnitin) also integrate AI into the grading exercise. Tools like ETS's e-Rater are also used to score essays based on grammar, organization, and content relevance.
Platforms such as Bakpax that digitize assignments by reading students' handwritten work.
PowerSchool now incorporates predictive analytics to identify students at risk of falling behind or dropping out (they are also currently in the midst of the fallout from a massive breach; read more here).
Infinite Campus uses AI algorithms to analyze student performance data and generate early warning indicators.
Companies like Cognition Builders that offer AI-powered classroom monitoring tools to track student engagement levels.
ClassDojo uses behavioral pattern recognition to help teachers track and encourage positive classroom behavior.
Platforms like Chalk that integrate curriculum planning with student performance data.
Tools such as Ocelot that handle routine parent inquiries and appointments.
I could go on, but you get the picture. Do these represent great advancement? Or are they a privacy nightmare? Perhaps both. But we’ll never know unless we dig into what schools are using, formulate an opinion, opt out when warranted, and advocate for better communications and engagement on the subject.
Here is also a great presentation about AI in schools that educators recently shared with me.
Getting a Handle on AI at Home
Your children may already be using AI tools independently. Students as young as elementary school age are discovering ChatGPT and similar tools for homework help, while teens are using AI for everything from essay drafting to math problem solving—and often without guidance.
Communication is the key to getting it right. Consider this worksheet as a way to catch up with your kids about their AI use each week.
Talking About AI Should Be Like Talking About Food
Remember my “Potato Chip Test”? If you objected to, say, potato chips in your child’s lunchroom, would you hesitate to say so? It should be the same with educational technology. We need to have these discussions and find a path that meets individual family needs and school districts’ desire to innovate.
News You Can Use This Week
📲 The PowerSchool breach drama continues, with hackers making ransom demands. I wrote a couple of months ago about the PowerSchool platform breach. The breach was massive in scope, yet some cities and states (including New York) still haven’t been entirely forthcoming about its impact. Even more disturbing are the ongoing ransom demands made by hackers. Did you even know that this happens? Well, it does, and we need to pay attention.
📲 Sam Altman says Gen Z does not make big life decisions without asking ChatGPT. Well, I’d say it’s perfectly responsible of Gen Z to do research before making “big life decisions.” Maybe Sam is giving ChatGPT too much credit here. Couldn’t Google have said the same thing just five years ago?
📲 The Pope is skeptical about AI. It’s great that the Pope is weighing in, because again, if we don’t all have a point-of-view, and discuss concerns, consequences and the like, then this great AI experiment won’t work.
From the AI for Families … AI and Kids Library
This week I want to share news about another publication I’ve added to the family: AI and Kids. Friends and family have asked for more product- and age-specific content, tools that kids can access themselves, video tutorials AND in-person tutoring (for parents and kids). So I hope you’ll also subscribe and get in touch if you want to learn more.
And finally, check out the new AI and Kids guide to setting up “digital guardrails” by age. Hope you’ll find it useful!
Your Questions Answered
From Emily in Colorado: How can I have a say in what apps are in my child’s classroom?
Answer: This is a good question, because the obvious answer is to “ask them,” but it’s never that simple. Yes, start by getting a list from your child’s teacher specifically, then ask the school more generally for the list of technology platforms approved for use there. Also ask your child (sometimes they are the best source from the start). Finally, attend any and every meeting where you can raise the request (especially if you aren’t getting the answer you need). Schools are overwhelmed by new technology and under enormous pressure from app vendors, so parents need to get into the weeds here.
Also check out resources such as the Internet Safety Labs and familiarize yourself with the details of every app your kid is using — in the classroom and at home.
Coming Next Week
“Early Warning Signs: How to Spot When AI Is Negatively Affecting Your Child”
I’ve talked a lot about the risks of emotional attachment to AI. Next week we’ll look at how to identify the signs that kids are getting too emotionally attached to chatbots or are encountering unsettling content. We’ll also look at ways AI can be used to positive effect.