Diversity Digest: AI and Bias – Safer Internet Day

Diversity Digest is a weekly reflection written by staff from different areas of our school.

AI systems such as large language models learn from vast amounts of human-created data. That data reflects the world as it is, including stereotypes, cultural blind spots, dominant languages, and unequal representation. Bias can also creep in through the questions we ask, the examples we give, and the assumptions we make about what is “normal” or “correct”.

In an international school community, this matters. We can challenge bias by slowing down, questioning outputs, comparing perspectives, and asking whose voices might be missing. AI should support learning, not replace critical thinking, cultural awareness, or professional judgement. One practical tool is the BIAS checklist below: four quick questions to ask before accepting what an AI system tells you.

B – Background

Who created the data? Which cultures or contexts dominate?

I – Inclusion

Who is missing, simplified, or stereotyped?

A – Assumptions

What is being treated as “normal” or universal?

S – Second source

Can this be checked against another perspective?
