I’m nearing the end of my course in Digital Literacy and Citizenship as part of my master’s program, where I’ve spent a lot of time exploring how technology is shaping the way we learn. The cluster of articles this week was about AI and the future of education. I felt that particular mix of professional curiosity and personal unease that comes with reading about something that's actively changing the field we are studying. Howard Gardner called AI as fundamental a shift in education as anything seen in a thousand years, and as someone currently sitting inside an educational institution trying to understand digital transformation, that landed differently than it might for a casual reader. He suggested that most cognitive aspects of the mind - the disciplined, synthesizing, and creative - will eventually be performed so well by large language models that whether humans engage in them at all will become optional. That's a provocative claim for anyone in education, but for someone studying digital literacy specifically, it raises a direct question: what exactly are we preparing students for if the cognitive heavy lifting is increasingly outsourced?
Jim Shimabukuro pushed that question further for me. The argument is that AI's personalization, accessibility, and efficacy could shift the balance away from traditional schools as the dominant medium for academic learning, potentially within the next decade. The numbers cited are hard to sit with neutrally: some AI-first schools report that students master core curricula in two hours a day and score in the 99th percentile nationally. In my digital literacy coursework, we talk constantly about the difference between access to information and genuine understanding, and I find myself wondering whether those test scores reflect deep learning or something more like optimized performance. Mary Burns and Rebeca Winthrop speak directly to this tension, and it's the piece I'll probably be citing in my next assignment. Their research found that AI tools prioritize speed and engagement over learning and well-being, and that cognitive offloading and dependency can atrophy students' mastery of foundational knowledge and critical thinking, which in essence is the exact skill set a digital literacy education is supposed to build. There's something deeply ironic about that.
What keeps me from sliding into full pessimism, and what I think is actually the most useful framing I've encountered across all three pieces, is the insistence that none of this is fixed. As Burns and Winthrop highlighted, the future of AI in education is in the hands of individuals and institutions, and we all have a role to play as active participants rather than spectators. Gardner's collaborator Anthea Roberts said something that resonates with me both as a student and as someone training to work in this space: we have the chance to cognitively offload, and we have the chance to cognitively expand. It is our duty to figure out how to pursue expansion rather than replacement. That feels like the core question of digital literacy to me. It's not really about the tools themselves; it's about the habits of mind we build around them.
I came into this master's program thinking I'd learn frameworks for evaluating technology. Increasingly, I think the deeper work is learning how to protect the kind of thinking that technology can't, and shouldn't, replace.

Hi Kris,
I thoroughly enjoyed reading your thoughtful reflection on this week’s readings; your insights really gave me a lot to think about. I was also struck by Gardner’s perspective that tasks like synthesizing information and creativity may one day be performed extremely well by large language models. That said, I sincerely hope society doesn’t come to value AI-generated creativity more than human expression.
For so long, the primary purpose of schooling has been to teach students how to engage in deep cognitive work. Now that some of this heavy lifting is being outsourced to AI, I find myself questioning how the role of schools may continue to evolve. At this point, I believe our focus should remain on what AI cannot and should not replace: fostering critical thinking, building meaningful human connections, and nurturing authentic creativity.
I believe that schools will continue to evolve alongside AI tools. The question for me, though, is how fast schools are evolving. I think of AI as a booming business that attracts enormous funding, while schools and districts struggle with limited financial resources. I hope the pace of educational evolution isn't so slow that we miss crucial goals. But I am hopeful. And yes, I certainly agree with you that the best we can do for now is stay relevant as teachers, building on our core humanity: our ability to develop critical thinking and human connections.
Good luck with your project.