Nepal, May 7 -- In a classroom discussion on women's work in India, a student confidently presented the AI-generated response to her prompt "summarise women's work in India". The result was a well-structured essay on how women in the Indian economy are engaged in agriculture and how their work is underpaid.

While many dimensions of this response were correct, what it missed completely was the unpaid care work women do in Indian households. It was a textbook example of algorithmic bias: when women's unpaid work isn't measured well, it doesn't appear in the datasets AI models are trained on. And when students rely on these tools without question, they risk internalising those same silences. More and more teachers now face similar unsettling ...