Radical Reads

AI in the classroom – the kids are alright

By Leah Morris, Senior Director, Velocity Program

Parents are recovering from back-to-school mania this week, and adding to the confusion and complexity is the unresolved question of generative AI in the classroom. In that spirit, we reflect on generative AI and the current moment in education.

Many teachers and parents feel unequipped to deal with generative AI tools. Feelings like this are not new. In the 1980s, math teachers across the United States protested the use of calculators in the classroom, particularly at the elementary level. Advocates wanted to ensure hand calculators were only used “after the child knows math facts” and proposed policies for each grade level to teach students “how and when to use the calculator.”

A handheld calculator cannot be fairly compared to the complexity and enormity of generative AI’s potential impact, but the concerns those advocates raised, and the policies they proposed, are remarkably similar to what has been heard around ChatGPT and generative AI in school settings.

Other teachers in the calculator conversation called time spent on hand-computed long division “ridiculous” and argued that the debate should turn away from the tools themselves (be they pencil and paper or computers) and toward having children solve more complex problems that require thinking beyond computation, such as statistics or mental estimation.

High-quality generative AI tools are undeniably a game-changer, but classroom pedagogy has been shifting for some time. Over the last three generations, the skills required to participate in academia have evolved from search to summary, and now toward critical thinking and contributing to a field of study through multiple kinds of output.

The search generation

Before modern search engines and the World Wide Web (WWW), a major skill for university students at all levels was search. Learning the Dewey Decimal system was a must. Many academics would even travel for specific resources or wait for texts to be shipped. This skill set also transformed as students learned to navigate early digital file directories.

While we still use these systems for some physical resources, the ability to crawl, index, and search the WWW created a new generation of students who used websites more than books, and search engines more than web directories. Yahoo, AltaVista, and Lycos dominated. This was the beginning of link-based ranking algorithms, with innovations such as RankDex followed by Google’s PageRank, which underpins modern search. While Microsoft lost that battle with MSN Search, the evolution of ChatGPT may be viewed through the lens of Microsoft working to win search in the long run.
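For readers curious what “link-based ranking” means in practice, here is a minimal sketch of the idea behind PageRank, using a hypothetical four-page web and plain power iteration (the graph, damping factor, and variable names are illustrative, not Google’s actual implementation):

```python
# Minimal power-iteration sketch of link-based ranking (PageRank-style).
# The four-page "web" below is hypothetical; 0.85 is the commonly cited damping factor.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = list(links)
n = len(pages)
rank = {p: 1.0 / n for p in pages}  # start every page with an equal score
damping = 0.85

for _ in range(50):  # iterate until the scores stabilize
    new_rank = {p: (1 - damping) / n for p in pages}
    for page, outgoing in links.items():
        share = rank[page] / len(outgoing)  # a page splits its score among its links
        for target in outgoing:
            new_rank[target] += damping * share
    rank = new_rank

# Pages with more (and better-ranked) incoming links end up with higher scores.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```

The intuition is simply that a link counts as a vote, and votes from highly ranked pages count for more – the shift that moved search beyond hand-curated web directories.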

The summary generation

Today, a common method for learning at all levels is to write a literature review, summarizing the arguments around the most convincing ideas. Writing has become less burdensome as barriers to search have fallen and tools have made writing faster and more efficient.

The essay is the obvious tool for learning. Research has shown that computers connected to the Internet lead K-12 students to “conduct more background research for their writing; they write, revise, and publish more; they get more feedback on their writing; they write in a wider variety of genres and formats; and they produce higher quality writing.”

However, the essay has become inflated, and questioning its ubiquity is a worthwhile endeavour. Today, college students are writing longer essays more frequently. The average college essay in 2006 was 1,038 words, more than double the length of the average 1986 paper, which was itself much longer than the average paper written earlier in the century (often 230 words or less).

The essay started to break down as a pedagogical tool long before generative AI surfaced. Students, facing increased pressure to write longer papers more frequently, turn to the “contract cheating” industry. Workers in contract-cheating hubs such as Kenya are now facing a drop-off in demand as US students replace those services with ChatGPT.

While writing can be an excellent tool to demonstrate the output of research and critical thought, banning generative AI tools will not eliminate the shortcuts students are already taking in the face of the overwhelming amount of information now available on any given topic.

The output generation

The current generation could start their academic inquiries with a complete literature review on any topic. If reliability, safety, and citation hurdles can be cleared, students will be able to spend less time reading individual papers and more time asking questions, applying concepts, critiquing, and eventually finding new areas in which to contribute at the frontiers of a field of research.

There is “no view from nowhere,” and the language models underlying generative AI tools may contain various biases. This is often cited as a negative, but it also opens room for assignments that focus on critique and on uncovering the influential historical texts that underpin the ideas surfaced in chatbot responses.

Future output and contribution to academic dialogues could also be “multimodal” – taking the form of writing as well as video or other ways of recording knowledge that were previously impossible to consider academically valid outside of conference materials.

Learning tools could change not only how we think about learning but also what constitutes a valid contribution to a field of study.

Some food for thought for schools adapting to AI today:

1. Humans using certain tools are prone to making certain types of errors.
An area for investigation is the type of errors students are likely to make when using generative AI tools. For example, students from 1917 to 1986 made roughly the same number of errors in their writing, and the majority were spelling mistakes. By 2006, spell-check and similar word-processing tools had fixed spelling issues but created new errors. In recent decades, the number-one error in student writing has become “wrong word choice.”

If generative AI tools are permitted, we should ask what types of errors students are likely to make when using them and how to address those errors in homework assignments and lessons.

2. What does banning generative AI in schools mean?
Students, parents, and teachers facing AI bans or acknowledgement requirements should seek clarity from policymakers on where in the learning process those requirements apply.

For example, can AI be used by students to generate ideas or support in creating an outline? Can students ask AI tools for feedback or clarification on a question they got wrong? For advice? For use as a teammate or collaborator?

3. The essay is broken. However, while there are still many unknowns about generative AI, we do have a vast amount of research on how to teach.

For example, two ideas that are resurfacing in the age of AI are “Flipped Classrooms” and “Active Learning.” In a flipped classroom, students do not follow the usual process of learning the material in the classroom and practising at home via assignments. Instead, they use tools such as video and online supports to learn the material at home, then work on problem sets with other students in class, where a teacher can help. This pedagogy challenges students to go beyond regurgitating the lecture and to apply their knowledge in real time. Another layer is the ability to use personalized AI tutors to help students engage with the material outside the classroom at their own pace. Regardless of usage, any adaptation to new technologies should lean on well-established research in education.

4. The AI of the future will likely not be the AI that students are using today.
Using tools in classes for specific tasks, rather than dedicated “how to” classes focused on AI, is likely to be more successful as students grow with the technology.

For example, many millennials sat through monotonous typing classes in elementary school, which were often spent finding inventive ways to socialize with classmates or covertly playing computer games. Anecdotally, AOL and MSN Messenger motivated many a millennial child to teach themselves to type; in-school typing classes may have been better used for adult learners, who did not have the opportunity to self-teach and learn alongside the technology’s growing applications.

5. The next few years will be messy as the line between in-class and out-of-class learning continues to blur.
Curriculums need to reflect the world children are operating in beyond the classroom. Returning to a relatable quote from our 1980s math teachers, this particular group wanted more complexity in the calculator debate and asked teachers to adjust their pedagogy rather than simply ban tools: “The kids basically learn what we teach them, and what we’ve been concentrating on is routine drills in computation. But we’ve totally ignored what’s been going on outside the schools. Our curriculum is outdated, and we’re turning out students who are not prepared for a world of computers and calculators.”

There is no doubt that education has changed, but that is a good thing. Our grandparents and parents did not learn with the same tools that we did in school because the technology in our world has changed. Even though it is uncomfortable, it is not rational to want the educational system to remain unchanged over the decades. Generative AI tools are here to stay. Leaning on the deep repository of knowledge we have on education and learning as we adapt our schools could lead to a better learning experience for more students globally.

Radical Reads is edited by Leah Morris (Senior Director, Velocity Program, Radical Ventures).