Sunday, March 22, 2026

AI and School: Some Notes on my Digital Citizenship and Literacy Journey


Image created with ChatGPT

I’m nearing the end of my course in Digital Literacy and Citizenship as part of my master’s program, where I’ve spent a lot of time exploring how technology is shaping the way we learn. The cluster of articles this week was about AI and the future of education. I felt that particular mix of professional curiosity and personal unease that comes with reading about something that's actively changing the field we are studying. Howard Gardner called AI as fundamental a shift in education as anything seen in a thousand years, and as someone currently sitting inside an educational institution trying to understand digital transformation, that landed differently than it might for a casual reader. He suggested that most cognitive aspects of the mind - the disciplined, synthesizing, and creative - will eventually be performed so well by large language models that whether humans engage in them at all will become optional. That's a provocative claim for anyone in education, but for someone studying digital literacy specifically, it raises a direct question: what exactly are we preparing students for if the cognitive heavy lifting is increasingly outsourced?

Jim Shimabukuro pushed that question further for me. The argument is that AI's personalization, accessibility, and efficacy could shift the balance away from traditional schools as the dominant medium for academic learning, potentially within the next decade. The numbers cited are hard to sit with neutrally: some AI-first schools report that students master core curricula in two hours a day and score in the 99th percentile nationally. In my digital literacy coursework, we talk constantly about the difference between access to information and genuine understanding, and I find myself wondering whether those test scores reflect deep learning or something more like optimized performance. Mary Burns and Rebeca Winthrop speak directly to this tension, and it's the piece I'll probably be citing in my next assignment. Their research found that AI tools prioritize speed and engagement over learning and well-being, and that cognitive offloading and dependency can atrophy students' mastery of foundational knowledge and critical thinking, which in essence is the exact skill set a digital literacy education is supposed to build. There's something deeply ironic about that.

What keeps me from sliding into full pessimism, and what I think is actually the most useful framing I've encountered across all three pieces, is the insistence that none of this is fixed. As Burns and Winthrop highlighted, the future of AI in education is in the hands of individuals and institutions, and we all have a role to play as active participants rather than spectators. Gardner's collaborator Anthea Roberts said something that resonates with me both as a student and as someone training to work in this space: we have the chance to cognitively offload, and we have the chance to cognitively expand. It is our duty to figure out how to pursue expansion rather than replacement. That feels like the core question of digital literacy to me. It's not really about the tools themselves; it's about the habits of mind we build around them.

I came into this master's program thinking I'd learn frameworks for evaluating technology. Increasingly, I think the deeper work is learning how to protect the kind of thinking that technology can't, and shouldn't, replace.


Saturday, March 14, 2026

Redesigning Schools: But Why?

Image generated with ChatGPT AI.

Although artificial intelligence (AI) is becoming more prevalent in classrooms, there is still a discernible difference in how teachers and students use it. According to Tyton Partners, as cited by the College of Education at the University of Illinois Urbana-Champaign, students regularly use AI at considerably higher rates than their teachers do. These statistics highlight how quickly AI is entering students’ learning environments and suggest that schools may need to rethink how curriculum and assessments are designed.

Disclaimer: Please note that I ran these ideas through ChatGPT, but you can find my original notes here. 

Effects on Learning and Assessments. AI is already shaping how students learn and access information. AI tools can generate essays, summaries, and explanations almost instantly. While these tools can support learning, they also challenge traditional assignments that rely heavily on written responses. When students can easily generate work using AI, it becomes harder for teachers to assess what students truly understand. As a result, schools may need to design assessments that emphasize critical thinking, problem-solving, and real-world applications rather than simple recall or writing tasks (Rupp, 2024).

Outdated Curriculum. Many current curriculum frameworks were developed before AI technologies existed. Much of the traditional school structure still reflects a “factory model” of education that focuses on standardized content and testing. However, scholars argue that education systems must move toward learning environments grounded in relevance, rigor, and relationships. This shift would prioritize deeper learning, collaboration, and meaningful engagement rather than narrow test-based accountability systems (Darling-Hammond, 2025).

Role of Teachers. AI also raises important questions about the role of teachers in the classroom. Rather than replacing educators, AI should be seen as a tool that supports teachers’ work. Educators can act as collaborators with AI technologies, using them to enhance instruction while maintaining professional judgment and human oversight. Teachers remain essential in guiding students, providing ethical perspectives, and supporting the social and emotional aspects of learning (McRae, 2025).

Equitable Access. Another urgent concern is equity in access to technology. Access to AI tools and digital resources is not equal across schools and communities. If curriculum and policies are not redesigned thoughtfully, these differences could deepen existing educational inequities. Schools must therefore develop policies that ensure responsible and equitable use of AI while integrating these technologies into culturally responsive and student-centered learning environments (All4Ed, 2025).

The Future. Redesigning curriculum and assessment is necessary because AI will continue to shape future careers and industries. Schools have a responsibility to prepare students for a workforce where AI technologies are increasingly common. This includes helping students develop digital literacy, critical thinking skills, and the ability to evaluate and monitor AI systems. Although AI can present risks such as bias or misinformation, education systems should not avoid innovation out of fear. Instead, schools should carefully evaluate these technologies while adapting their practices to prepare students for the future (Rupp, 2024).

The urgency for schools to redesign curriculum and assessments comes from the reality that AI is already influencing education. The challenge for educators is not whether AI will be part of learning environments, but how schools can integrate it responsibly while preserving meaningful learning experiences. By rethinking curriculum goals, assessment practices, and the role of teachers, schools can ensure that students are prepared to thrive in a world increasingly shaped by artificial intelligence.


Friday, March 6, 2026

Under Surveillance: Teaching Shaped by Cameras

Image generated with ChatGPT


My classrooms in Macao and Hong Kong had CCTVs. As a new, foreign teacher in these places, the cameras made me uncomfortable within those four walls. While we were assured that the cameras were there to manage students' behavior, the idea of "being constantly watched" created a sense that every behavior in the classroom might be viewed and interpreted by someone else. But over time, I started to see this experience through the lens of power and surveillance.


In her post entitled The Danger of Facial Recognition in Our Children’s Classrooms, Nila Bala argues that schools often operate within clear power imbalances. Similar to my experience in Macao and Hong Kong, teachers have little influence over institutional policies, including decisions about surveillance. As a classroom teacher, I had little or no voice in whether cameras should be present in my classroom. Going to work, I always feared the possibility that I might snap or act in a way that would raise questions or trigger an inquiry into my classroom conduct. Fortunately, I never experienced such a situation. Still, the constant awareness of being watched created a lingering sense of caution.


I got used to CCTVs over time, but the experience changed my understanding of my teaching environment, making me more aware of how authority and surveillance shape everyday classroom practice. We teachers often work in systems where rules and policies are made without our input, but we still have to deal with the consequences in our daily work. 


Surveillance in the classroom does not only come with the installation of CCTVs. In his post entitled Are Teachers Under Increasing Surveillance, Ross Morrison McGill summarizes the three types of teacher surveillance that Damian Page explored in his paper Conceptualizing The Surveillance of Teachers: vertical surveillance, horizontal surveillance, and interpersonal surveillance. These forms of surveillance reflect the ideologies of consumerism and neoliberalism, portraying teachers as commodities: products that can be evaluated and compared, and entrepreneurial individuals who must continually improve themselves to stay competitive and employable.


But the most concerning aspect is that many teachers may not fully recognize these forms of surveillance. Without that awareness, teachers may feel pressured to constantly monitor their own work, gather evidence of their efficacy, and justify their value as professionals. Over time, this can normalize a culture in which teaching becomes performative, with an emphasis on proving productivity and accountability rather than providing meaningful learning opportunities.


Recognizing these dynamics is important. When teachers are aware of them, they can think more critically about the systems they work in and reflect on how surveillance influences not only their professional identity but also the learning environment they create for their students.
