The Biased Movie and Book Club, Episode 3 - Invisible Women

  • Writer: Luisa Herrmann
  • 2 days ago
  • 2 min read

Caroline Criado Perez’s Invisible Women: Exposing Data Bias in a World Designed for Men is one of those books that forever changes the way you look at the world: the products, systems, and policies designed for us, and not necessarily with us in mind. It’s essential reading for anyone working in product, data, design, or policy, and also for anyone who wonders who makes decisions that influence things like medical research, drug discovery, workplace safety, etc. I have to say, this book broke my heart.


What hit me hardest is how easy it is to default the decisions and inputs in the systems that govern our society to men. If those things affect women the same way, great. If not, we are an afterthought, if we are a thought at all, and it's tough to see how normal and accepted this has become: defaults that systematically ignore half the population because they weren't accounted for in the data we collect or the systems we build.


Perez shows how "neutral" data is often anything but: crash-test dummies designed for male bodies, voice recognition systems trained on male voices, healthcare data that overlooks female-specific symptoms for awesome reasons like "female tissues are harder to test on because they have hormonal fluctuations." Do you know who else has tissue with hormonal fluctuations? Female humans. These gaps don't just cause frustration or inconvenience; they lead to worse outcomes, missed opportunities, and in some cases, real harm.


For those of us building AI and data-driven products, Invisible Women is a critical reminder that if we're not consciously designing for inclusion, we're likely designing for exclusion by default. And if all the people in the room building a solution look the same, sound the same, and have the same kinds of hormonal fluctuations (or lack thereof), their solutions are going to default to what works for them. Not out of malice, but out of convenience. This is why representation in data isn't a "nice-to-have"; it's foundational to building equitable, effective systems.


Inclusive data collection isn't just a technical challenge; it's a cultural and ethical imperative. Science has always had a male bias, and that gap has only widened as science has become more influential in health, safety, and education, now supercharged by AI. The best time to change that was 200 years ago; the second best time is now.


Have you read it? What stuck with you?
