Educators failed students with social media. We can do better with AI.
Those of us educated before AI remember the joys of completing a hard task and the sorrows of failing at one, whether writing a beautiful essay or botching a tough math problem.
But with hindsight, we see that our struggle helped build intellectual muscle. The learning process, not the result, was the point.
How do we help our students build that intellectual muscle when AI is always at their elbows with an answer?
Currently, we educators focus much of our attention on teaching AI skills, using AI for administrative tasks, and, unfortunately, devising ways to blunt AI's labor-saving advantages so we can cling to old teaching practices.
Our chief responsibility is much more fundamental. We need to help students strengthen their critical thinking, analytical, and teamwork skills — all core human traits needed to participate in a democratic society and prosper in a dynamic economy.
In my career as a professor and academic leader, I’ve seen the advent of scientific calculators, personal computers and the Internet.
Although some feared that these developments would threaten education, it’s clear they wound up strengthening it. They reduced menial tasks. They created new tools. They opened up troves of new information. All this led to different and richer questions and new ways of asking them.
AI is orders of magnitude more powerful. But it will also enhance student learning — if we are purposeful and collaborative in making it do so.
We should start with the shared understanding that our goal as educators is to inspire and cultivate learning. And we should insist that AI assist us and not impede us in this work.
The act of learning instills humility and wonder. It feeds curiosity. It cultivates intuition, that amazing power that arises from experience and thought. It’s essential for individuals to participate in society and to connect and work with others.
The question before us is not whether to use AI in teaching; faculty are already doing so in exciting ways.
Rather, the question is, how can we ensure that our use of AI makes students better learners and, by extension, better thinkers?
Part of the answer involves selecting the right AI tool for what we are teaching, whether large language models, machine learning, digital twins, computer vision, robotics, remote sensing, or others.
At UF, where we teach AI courses to more than 12,000 students across 16 colleges, every discipline is different. Faculty are encouraged to use the AI tools that best suit their lesson, class or discipline.
Another way to make students better learners is to be flexible in our teaching.
Technological innovations have always adjusted the knowledge base and skills students are expected to acquire. AI is doing the same, and we educators need to adjust accordingly.
For example, the advent of digital calculators made it quick and easy to compute results that were impossible or impractical to verify by hand. But how do you know if you mis-entered some keystrokes and got a ridiculous answer?
Teachers began stressing the tools and techniques of estimation, since only by estimating could students “check” the results spit out by their calculators.
We need to strive for similar pedagogical flexibility with AI. We know students will skip steps in the cognitive process. Given that, how do we help them learn to assess whether AI's answers are accurate or hallucinatory?
A third way to use AI to help students become better learners is to stress teamwork.
We know that AI is here to stay. AI agents will become students’ constant companions as assistants and collaborators.
Educators have long recognized that teamwork is an important leadership skill that companies demand. As AI becomes the newest member of the team, we need to teach students how to bring out its best qualities and interact with it smoothly, just as they would with human team members.
Keeping our focus on enabling student learning will help us avoid past missteps with new technologies, such as when universities missed the opportunity to meaningfully influence how students use social media.
Had we been more proactive in helping students navigate, contextualize and understand social media, we might see fewer of the negative mental health impacts evident today.
Imagine the consequences of young people losing the sense of accomplishment that comes from tackling big challenges because they have offloaded those challenges to chatbots.
The more we help students to retain and strengthen their innate learning abilities, the better we can prevent that outcome.
The importance of getting this right goes beyond mental health.
Studies already suggest that GPS and navigation apps negatively influence people’s spatial memory and sense of direction. The more we offload human abilities to machines, the more we’ll lose those abilities, with negative impacts on individuals and society.
AI is reshaping what goes on in the classroom, our curriculum, and the public’s perception of the usefulness and relevance of higher education.
Schools and universities are on the front lines. We can and should rise to the occasion, making this powerful new tool an ally rather than an enemy of learning, with all its struggles, satisfactions and benefits for human well-being.
Joe Glover is Interim Provost of the University of Florida.