Saturday, January 31, 2026

Fakes and Reals in AI

I've been thinking a lot about how quickly AI is changing everything, from our beliefs to our way of life. AI is all around us now, affecting our daily lives, choices, and even what we think is true. It impacts both individuals and established infrastructures. 


Qadri’s research highlights several ways generative AI introduces serious ethical and societal risks, including political propaganda and media weaponization, social engineering and psychological manipulation, and the economic and legal impacts of synthetic media. In the Real or Fake podcast by KRQE, guest speakers Melanie Moses and Sonia Rankin of UNM describe AI as both thrilling and risky. Echoing our class discussions, Moses and Rankin note that AI can be biased and misleading, yet it sounds so convincing that it is hard to tell what is real and what is not, and they warn that AI technology cannot simply keep spreading without any control. It is up to us to understand what AI can achieve, where it falls short, and how it gradually alters what we trust, whether that is information, companies, or even each other. 


AI is not just about the technology itself; it is also about the ethics and how it all affects us as humans. AI is forcing us to rethink how we do things, from schooling to what we even consider real. Drawing an analogy to the value of money, the speakers linked AI's potential to construct fake individuals to fake realities, underlining that the real concern is not the lying itself but the erosion of trust, which is more damaging than any single piece of fake content.


There is a need for us to slow down, think things through, and point out the fakes when we spot them. How does this work in school settings? Spotting fakes is easier said than done, especially when they come like wolves in sheep's clothing. Moses and Rankin suggested that we need to experiment and discover the limits of these AI tools while still staying connected to tactile learning (hands-on learning and real-world experiences). After all, school should not just be about getting quick answers. It is about the process of learning: learning how to think, ask questions, and be creative.


At the same time, there's a lot of potential here. AI could make learning better, boost creativity, and show us new things we never thought about. But to shape that future and control how AI impacts us, we need to decide what kind of world we want to create with it. People will always want something real, something with a human touch: real voices, real art, real relationships. What we need right now is to use these tools wisely, stay informed, and speak up about how we want them to change. There are things to be careful about, but there's also a lot to be excited about. With AI, we just need to pay attention to both.


Some videos I found on the internet about misinformation, fake news and deepfakes:

Misinformation


Reflective Question:

When AI can create things that look real, what human skills and learning experiences should schools focus on to keep learning meaningful and authentic?




Friday, January 23, 2026

Ribble's Nine Elements of Digital Citizenship: In Action

Image generated using ChatGPT

Ribble’s nine elements of digital citizenship describe the skills and habits people need to use technology responsibly, safely, and ethically. 

Reflecting on my experiences as an educator, I have seen Ribble’s nine elements of digital citizenship in action, both in positive and negative ways. One example that stands out is around digital communication and etiquette. I had a class working together on a shared online document, and at first, a few students were leaving comments that came off as harsh or dismissive. Naturally, it caused tension and slowed down their progress. We took a step back, revisited our earlier discussion on “netiquette” (a word used in our textbook), reviewed rules on how to communicate respectfully online, and set some clear expectations. While it took time for the dynamic to shift, students eventually supported each other instead of tearing each other down.

I’ve also seen the impact of neglecting digital health and wellness. Like many schools, ours moved to online learning during the COVID-19 pandemic. During that stretch of remote learning, some students were online for hours at a time, and the fatigue, frustration, and disengagement were obvious. After the pandemic, my previous school in Hong Kong introduced a weekly no‑screen day and developed a comprehensive outdoor education policy to help restore balance in students’ learning. It was a thoughtful and forward‑looking approach that I still value today. Once we started building in breaks, structuring screen time, and incorporating offline activities, participation and focus improved dramatically. It was a reminder that taking care of students’ physical and mental well-being is as important as teaching them to navigate technology safely.

These experiences have reinforced something for me: digital citizenship isn’t just a set of rules. It shapes relationships, learning, and the classroom environment. When students understand and practice these skills, the classroom becomes safer, more respectful, and much more supportive of growth.







Friday, January 16, 2026

My Journey with AI

 
Image generated using ChatGPT

I first began hearing about Artificial Intelligence (AI) in 2022, when I moved to Hong Kong and taught Computer Education to Grade 5 students. At the time, my understanding of AI was fairly surface-level. It was only toward the end of a coding unit, when a mini AI-based coding project was introduced, that I began to grasp its basic concepts and the science behind it. Around the same period, my school encouraged teachers to integrate generative AI tools into classroom practice, particularly to support foreign and second-language learners in overcoming language barriers. I found this school initiative exciting. As a language learner myself, I recognize how tools like generative AI can make work more manageable, especially when access to learning resources is limited.


Although I have been using specific generative AI tools for some time now, I would say I am still becoming fully comfortable with them. Given the rapid pace of technological development, I am still building confidence and competence in applying AI meaningfully in both my personal and professional practice. Part of becoming “comfortable” with AI means consistently questioning when its use is appropriate and when it crosses ethical boundaries. As an educator, I see AI as both beneficial and a potential concern in the classroom. This is especially true for students, whose understanding of AI-related ethical principles is still developing and remains fragile. Reading Faverio and Sidoti (2025), I was fascinated by how much time teenagers spend on AI-driven social media and how frequently they use AI chatbots. With students actively engaging with AI tools and exposed to ideas that "look and sound real, yet are entirely generated by AI" (Dixon, 2025), I find it necessary to stay abreast of these developments. Staying informed allows me to guide students more effectively as they learn to navigate the opportunities and ethical challenges AI presents in today’s world.


In one of my previous courses, a question was asked about what AI might look like in the future, and it bothered me for some time. When I think about how difficult it already feels to keep up with AI developments, I can’t help but wonder how far behind I would fall if I do not start fully embracing AI today. We are already seeing how AI-powered bots are taking over certain jobs, and that reality makes the question feel even more urgent. Instead of viewing AI solely as a threat, I am learning to see it as a tool that requires intentional and ethical engagement from educators. Only by embracing AI can I better prepare myself and my students to adapt to a future where human judgment, creativity, and moral responsibility remain essential alongside advances in technology.


How do you see AI changing our role as teachers in the next 5-10 years, and what skills do you think we should start building now so we don’t get left behind? I'd like to hear your thoughts:)
