Diversity Digest is a weekly reflection written by staff from different areas of our school.

AI systems such as large language models learn from vast amounts of human-created data. That data reflects the world as it is, with its stereotypes, cultural blind spots, dominant languages, and unequal representation. Bias can also creep in through the questions we ask, the examples we give, and the assumptions we make about what is “normal” or “correct”.
In an international school community, this matters. We can challenge bias by slowing down, questioning outputs, comparing perspectives, and asking whose voices might be missing. AI should support learning, not replace critical thinking, cultural awareness, or professional judgement. The BIAS checklist below offers four quick questions to ask before relying on an AI output.
B – Background
Who created the data? Which cultures or contexts dominate?
I – Inclusion
Who is missing, simplified, or stereotyped?
A – Assumptions
What is being treated as “normal” or universal?
S – Second source
Can this be checked against another perspective?