Saturday, March 28, 2026

The Quiet Classroom Takeover: AI, Learning, and the Future We're Already Living

 

There's a scene in Wall-E that I keep coming back to: humans floating in hover chairs, bodies weak, eyes locked on screens, unable to do anything for themselves. Technology didn't force this on them. It made life so easy that they stopped trying. A similar feeling runs through Good Luck, Have Fun, Don't Die, which pictures AI not as a robot villain but as something quieter and more dangerous: a slow, friendly slide into every part of daily life. It shows up as a helpful tool. It ends up as a habit you can't break. What both films get right, and what researchers are now confirming, is that AI's biggest threat isn't dramatic. It's comfortable. And what they're warning about is already happening in schools today.

The risks are real. Researchers Mary Burns and Rebecca Winthrop reviewed more than 400 studies across 50 countries and found that AI tools prioritize speed and engagement over learning and well-being. These tools are built to feel good and get things done fast, not to help students actually grow. AI's ease of use, combined with the reward of better grades for less effort, drives students to let AI do their thinking for them, weakening their grasp of basic knowledge and critical thinking. The damage isn't only academic: AI companions can create unhealthy attachments and get in the way of students building real social skills. And students who already have the least are hit hardest, as disadvantaged students are more likely to use AI in ways that replace their own thinking rather than strengthen it.

However, when used ethically and responsibly, AI can truly improve education. Tanya Milberg notes that students who receive one-on-one tutoring consistently outperform 98% of students in regular classrooms, and AI could bring that kind of personal attention to every child, not just those whose families can afford a tutor. AI can also handle routine tasks like grading and paperwork, giving teachers more time to actually connect with their students. UNESCO captures the bigger vision, calling for bold new ways for humans and machines to work together fairly, but only if we build it that way on purpose.

That's why teachers matter more now, not less. The patience, empathy, judgment, and ability to push a student past what's comfortable are things AI simply cannot replicate. Great teaching in this era means three things: teaching about AI so students understand its limits and who shapes it; teaching for AI by building the human skills - creativity, empathy, critical thinking - that will always matter; and teaching with AI by modeling wise, intentional use. As the World Economic Forum puts it, teaching students about AI is just as important as teaching them with it. The best teachers in the years ahead won't be the most tech-savvy. They'll be the ones who know what only a human can give a child, and who refuse to let that be replaced.

Disclaimer: I ran my thoughts through Claude.ai as a challenge to try this generative tool instead of ChatGPT.

Sunday, March 22, 2026

AI and School: Some Notes on my Digital Citizenship and Literacy Journey


Image created with ChatGPT

I’m nearing the end of my course in Digital Literacy and Citizenship as part of my master’s program, where I’ve spent a lot of time exploring how technology is shaping the way we learn. The cluster of articles this week was about AI and the future of education. I felt that particular mix of professional curiosity and personal unease that comes with reading about something that's actively changing the field we are studying. Howard Gardner called AI as fundamental a shift in education as anything seen in a thousand years, and as someone currently sitting inside an educational institution trying to understand digital transformation, that landed differently than it might for a casual reader. He suggested that most cognitive aspects of the mind - the disciplined, synthesizing, and creative - will eventually be performed so well by large language models that whether humans engage in them at all will become optional. That's a provocative claim for anyone in education, but for someone studying digital literacy specifically, it raises a direct question: what exactly are we preparing students for if the cognitive heavy lifting is increasingly outsourced?

Jim Shimabukuro pushed that question further for me. The argument is that AI's personalization, accessibility, and efficacy could shift the balance away from traditional schools as the dominant medium for academic learning, potentially within the next decade. The numbers cited are hard to sit with neutrally: some AI-first schools report that students master core curricula in two hours a day and score in the 99th percentile nationally. In my digital literacy coursework, we talk constantly about the difference between access to information and genuine understanding, and I find myself wondering whether those test scores reflect deep learning or something more like optimized performance. Mary Burns and Rebecca Winthrop speak directly to this tension, and it's the piece I'll probably be citing in my next assignment. Their research found that AI tools prioritize speed and engagement over learning and well-being, and that cognitive offloading and dependency can atrophy students' mastery of foundational knowledge and critical thinking, which is in essence the exact skill set a digital literacy education is supposed to build. There's something deeply ironic about that.

What keeps me from sliding into full pessimism, and what I think is actually the most useful framing I've encountered across all three pieces, is the insistence that none of this is fixed. As Burns and Winthrop highlighted, the future of AI in education is in the hands of individuals and institutions, and we all have a role to play as active participants rather than spectators. Gardner's collaborator Anthea Roberts said something that resonates with me both as a student and as someone training to work in this space: we have the chance to cognitively offload, and we have the chance to cognitively expand. It is our duty to figure out how to pursue expansion rather than replacement. That feels like the core question of digital literacy to me. It's not really about the tools themselves; it's about the habits of mind we build around them.

I came into this master's program thinking I'd learn frameworks for evaluating technology. Increasingly, I think the deeper work is learning how to protect the kind of thinking that technology can't, and shouldn't, replace.


Saturday, March 14, 2026

Redesigning Schools: But Why?

Image generated with ChatGPT AI.

Although artificial intelligence (AI) is becoming more prevalent in classrooms, there is still a discernible difference in how teachers and students use it. According to Tyton Partners, as cited by the College of Education at the University of Illinois Urbana-Champaign, students regularly use AI at considerably higher rates than their teachers do. These statistics highlight how quickly AI is entering students’ learning environments and suggest that schools may need to rethink how curriculum and assessments are designed.

Disclaimer: Please note that I ran these ideas through ChatGPT, but you can find my original notes here. 

Effects on Learning and Assessments. AI is already shaping how students learn and access information. AI tools can generate essays, summaries, and explanations almost instantly. While these tools can support learning, they also challenge traditional assignments that rely heavily on written responses. When students can easily generate work using AI, it becomes harder for teachers to assess what students truly understand. As a result, schools may need to design assessments that emphasize critical thinking, problem-solving, and real-world applications rather than simple recall or writing tasks (Rupp, 2024).

Outdated Curriculum. Many current curriculum frameworks were developed before AI technologies existed. Much of the traditional school structure still reflects a “factory model” of education that focuses on standardized content and testing. However, scholars argue that education systems must move toward learning environments grounded in relevance, rigor, and relationships. This shift would prioritize deeper learning, collaboration, and meaningful engagement rather than narrow test-based accountability systems (Darling-Hammond, 2025).

Role of Teachers. AI also raises important questions about the role of teachers in the classroom. Rather than replacing educators, AI should be seen as a tool that supports teachers’ work. Educators can act as collaborators with AI technologies, using them to enhance instruction while maintaining professional judgment and human oversight. Teachers remain essential in guiding students, providing ethical perspectives, and supporting the social and emotional aspects of learning (McRae, 2025).

Equitable Access. Another urgent concern is equity in access to technology. Access to AI tools and digital resources is not equal across schools and communities. If curriculum and policies are not redesigned thoughtfully, these differences could deepen existing educational inequities. Schools must therefore develop policies that ensure responsible and equitable use of AI while integrating these technologies into culturally responsive and student-centered learning environments (All4Ed, 2025).

The Future. Redesigning curriculum and assessment is necessary because AI will continue to shape future careers and industries. Schools have a responsibility to prepare students for a workforce where AI technologies are increasingly common. This includes helping students develop digital literacy, critical thinking skills, and the ability to evaluate and monitor AI systems. Although AI can present risks such as bias or misinformation, education systems should not avoid innovation out of fear. Instead, schools should carefully evaluate these technologies while adapting their practices to prepare students for the future (Rupp, 2024).

The urgency for schools to redesign curriculum and assessments comes from the reality that AI is already influencing education. The challenge for educators is not whether AI will be part of learning environments, but how schools can integrate it responsibly while preserving meaningful learning experiences. By rethinking curriculum goals, assessment practices, and the role of teachers, schools can ensure that students are prepared to thrive in a world increasingly shaped by artificial intelligence.


Friday, March 6, 2026

Under Surveillance: Teaching Shaped by Cameras

Image generated with ChatGPT


My classrooms in Macao and Hong Kong had CCTVs. As a new, foreign teacher in these places, the cameras made me feel uncomfortable within the four walls. While we were assured that the cameras were there to manage students' behavior, the idea of "being constantly watched" created a sense that every action in the classroom might be viewed and interpreted by someone else. But over time, I started to see this experience through the lens of power and surveillance. 


In her post entitled The Danger of Facial Recognition in Our Children’s Classrooms, Nila Bala discusses how schools often operate within clear power imbalances. As in my experience in Macao and Hong Kong, teachers have little influence over institutional policies, including decisions about surveillance. As a classroom teacher, I had little to no voice in whether cameras should be present in my classroom. Going to work, I always feared a moment when I might snap or act in a way that could raise questions or trigger an inquiry into my classroom conduct. Fortunately, I never experienced such a situation. Still, the constant awareness of being watched created a lingering sense of caution. 


I got used to CCTVs over time, but the experience changed my understanding of my teaching environment, making me more aware of how authority and surveillance shape everyday classroom practice. We teachers often work in systems where rules and policies are made without our input, but we still have to deal with the consequences in our daily work. 


Surveillance in the classroom does not come only with the installation of CCTVs. In his post entitled Are Teachers Under Increasing Surveillance?, Ross Morrison McGill summarizes the three types of teacher surveillance that Damien Page explored in his paper Conceptualising the Surveillance of Teachers: vertical surveillance, horizontal surveillance, and interpersonal surveillance. These forms of surveillance reflect the ideologies of consumerism and neoliberalism, portraying teachers as commodities: products that can be evaluated and compared, and entrepreneurial individuals who must continually improve themselves to stay competitive and employable. 


But the most concerning aspect of this surveillance is that many teachers may not fully recognize these forms of surveillance. Without this awareness, teachers may feel pressure to monitor their own work, gather evidence of their efficacy, and establish their value as professionals. Over time, this can normalize a culture in which teaching becomes performative, with an emphasis on proving productivity and accountability rather than providing meaningful learning opportunities. 


Recognizing these dynamics is important. When teachers are aware of them, they can think more critically about the systems they work in and reflect on how surveillance influences not only their professional identity but also the learning environment they create for their students.

