Friday, February 27, 2026

AI Is Powerful - But Are We Aware of Its Bias?



Image generated with ChatGPT

This week's reading and listening made me think about the first time I experienced AI bias in action. It was years ago in Hong Kong, when a basic search showed Filipinos mainly as domestic helpers. This wasn't totally unexpected, but it showed me how AI systems, trained on huge amounts of data shaped by human choices, can reproduce biased and harmful narratives. That experience really stuck with me, especially now that AI is quickly changing education.

AI tools like ChatGPT learn from massive amounts of human-created data. If that data contains stereotypes, gaps, or historical inequalities, those patterns can show up in AI responses. Several types of bias exist in artificial intelligence: interaction bias (how users influence systems), selection bias (who and what gets included in the data), and latent bias (hidden assumptions buried in information). In schools, this bias is concerning. It subtly influences what students see as true or normal, creating a hidden curriculum where algorithmic responses seem right and unquestionable. In this way, artificial intelligence is transforming education both positively and negatively.

The big question here is, what can we do?
  • AI developers need to carefully review training data, allow external audits, design inclusive teams, and build tools that identify false or harmful outputs. 
  • Educational institutions can devise programs that teach students how to use GenAI. They can redesign assessments to reward original ideas and thinking skills. They can also set rules that are easy to understand, so everything is fair for everyone. 
  • Educators play a huge role in identifying and countering power and bias. We can facilitate open conversations about how algorithms work, use real-world examples of bias, and teach students to question AI-generated content instead of automatically trusting it.
There is no denying that AI is powerful. But as Lucretia Fraga underscored, an informed user is even more powerful. By talking openly about bias and teaching students how to think critically about technology, we shift the balance. AI can shape education; we just have to ensure it does so for the better.

In my catalyst discussion this Monday, I will address the question, "What are teachers' responsibilities when they don't control AI?" I will explore several things educators can consider to develop their own digital literacy so they can help their students navigate AI critically and ethically. This video introduces Monday's discussions.

Reflect
Share a time when an AI tool may have quietly influenced your child's or student's beliefs, identity, or opportunities without you even noticing.

Links to the materials I explored for this post:
https://www.youtube.com/watch?v=cygpAm4ooGs
https://www.youtube.com/watch?v=eeyL9jBky68
https://www.youtube.com/watch?v=59bMh59JQDo
https://pubmed.ncbi.nlm.nih.gov/39850144/
https://www.eff.org/deeplinks/2022/05/podcast-episode-teaching-ai-its-targets





Thursday, February 19, 2026

The Teacher Advantage in an AI-driven Classroom

Image generated with ChatGPT

Personal Learning Experience with AI

In my capstone class, our professor had us pull out the main ideas from our research and use an AI application to find similar topics. The themes the generative AI tool identified were pretty shallow and did not really connect to our sources all that well. While the tool could identify patterns and repeated words, it overlooked the important ideas we found by closely reading the articles. This highlighted AI's weakness: it can see patterns, but it lacks the judgment inherent to us humans. 

Given the rapid developments in AI, educators today need to develop critical skills, including good judgment, data and ethical literacy, and the ability to facilitate learning with curiosity. With these skills, educators will be able to teach kids about and work with artificial intelligence and prepare them for a digital world.

Strong Professional Judgment
In her opinion article, Tanishia Lavette Williams highlights that AI is always behind the times and only draws on old information. AI systems are often biased, and they look to the past, so teachers have to carefully evaluate AI's answers and teach students to do the same. She notes that teaching with a human touch requires judgment, care, and cultural knowledge, which computers, though sophisticated, cannot provide. To make good judgments, teachers need to reflect on what they are doing, carefully analyze what AI gives them, and identify bias. Teachers who lack these skills may unknowingly perpetuate unfair systems and rely too much on automated solutions. But with these skills, teachers can ensure that learning is meaningful for everyone.

Ethical Awareness and Data Literacy
In the podcast with Tom Vander Ark, Doug Fisher emphasized the importance of teaching children about AI from elementary school onward, so they can use the tool critically and responsibly. Many kids start checking out AI on their own by high school, so he thinks we should introduce it as early as second grade. Teachers should get kids curious, help them understand data, and teach them how to talk to AI. If teachers and students know how to understand data, they can check information, question facts, and make good choices. Fisher believes that to make good decisions, everyone needs to know more about data. To help teachers grow in this area, they can work together and set clear classroom rules for using AI responsibly. Without these things, students might not use AI the right way or might just believe whatever it tells them. But if these skills are taught well, they can encourage responsibility, respect for privacy, and good moral judgment.

Nurturing Curiosity and Professional Agency
Fisher also points out that some students think there's just one right answer. We need to pair curiosity with good judgment in this AI age - the ability to look at results, improve them, and make them better. Teachers can show how this is done by asking questions, creating activities where AI helps solve problems, and helping students become skilled with AI instead of just using it passively. As students learn to think critically about AI, they start to see themselves as professionals in training instead of just people trying to finish tasks.

Reflective Question:
What other skills do you think teachers need to keep up with AI and guide students as they use it?


Saturday, February 7, 2026

Rethinking Our Relationship With AI

Image generated using ChatGPT

The question "Will AI replace the human workforce?" has been bothering me for a while now. I found some answers in the podcast The Latest on AI and Work in Higher Ed by Educause Shop Talk, in which the panelists explored the place, challenges, and opportunities of AI in the workplace and in higher education. 

Theme 1: The importance of collaboration. The panelists highlighted that across all fields, substantial AI policies have already been put in place, many of which directly address the concerns we keep raising. The real challenge is not drafting more rules but working together to learn how to govern data consistently across the board. As several panelists noted, we need to be honest about which policies are actually helping and which ones are falling short. 

Theme 2: The need to shift the narrative around AI and work. While headlines often warn that AI is “taking over” human jobs, data simply does not support that fear. Instead, the panelists argued that the real risk comes from our own resistance to developing digital fluency. When we avoid learning new tools, we create the very conditions that make us feel replaceable (a self‑fulfilling prophecy). 

What’s real? 

AI is a far more effective tool for increasing human productivity than it is as a replacement for human knowledge. The panelists urged us to view AI as a partner rather than a competitor. AI can help us brainstorm, generate ideas, and elevate our work to a more strategic level. One of Sarah Eaton’s tenets of postplagiarism supports this idea by underlining that human creativity is not threatened but rather strengthened and expanded by artificial intelligence. We are still the ones driving the process. Instead of giving our jobs to machines, the idea is to utilize AI to enhance the aspects of our jobs that require human interaction, creativity, and judgment. 

What’s the role of educators?

The panelists explained that our role as educators is not to shield students from AI but to create safe, thoughtful “playgrounds” where they can explore information, experiment with tools, and learn from one another. When we secure the integrity of the data students interact with, we free them to focus on the deeper work, such as developing values, strengthening collaboration, and sharpening their human skills. AI becomes a catalyst for learning, not a shortcut around it. This means moving beyond transactional uses of AI and instead embracing a strategic mindset. As AI becomes an increasingly integral part of our everyday lives, we therefore need to reskill and upskill, not because it will replace us, but because it will augment our capabilities. 

Next week's post?

I will explore the role of teachers next week, reflecting on the necessary qualities teachers need to leverage AI and the importance of viewing teachers from a humanistic perspective when discussing teaching and learning with AI. 

What teacher qualities do you think will be most beneficial in today’s AI-dominated school environment? Feel free to share your thoughts.
