Flawed Data Perpetuates AI Bias Against Minorities and Disabled People
- Motion-capture data used to build AI systems often comes from outdated and flawed studies that used only white, male, "able-bodied" subjects.
- These flawed assumptions get baked into the standards and benchmarks that many researchers rely on to build new technologies, such as fall-detection systems.
- Historical errors thus inform today's supposedly "neutral" technological systems, producing bias against people who don't fit the preconceived "typical" body type.
- Engineers must interrogate their data sources and confirm they accurately represent diverse populations, or risk building unsafe technologies.
- Technologists need a socio-technical lens to ensure their systems don't cause undue harm by excluding certain groups.
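One concrete way to interrogate a data source, as the points above urge, is to compare a dataset's demographic makeup against reference population shares and flag underrepresented groups. The sketch below is illustrative only: the record schema, the `mobility` attribute, and the reference shares are all hypothetical assumptions, not figures from the article.

```python
from collections import Counter

def audit_representation(records, attribute, reference_shares, tolerance=0.5):
    """Flag demographic groups underrepresented in `records`.

    records: list of dicts, each carrying a demographic attribute
             (hypothetical schema for illustration).
    reference_shares: dict mapping group -> expected population share.
    tolerance: flag a group whose observed share falls below
               tolerance * expected share.
    Returns {group: (observed_share, expected_share)} for flagged groups.
    """
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    flags = {}
    for group, expected in reference_shares.items():
        observed = counts.get(group, 0) / total if total else 0.0
        if observed < tolerance * expected:
            flags[group] = (observed, expected)
    return flags

# Hypothetical motion-capture subject metadata: 98 non-disabled,
# 2 disabled participants out of 100.
subjects = (
    [{"mobility": "non-disabled"}] * 98
    + [{"mobility": "disabled"}] * 2
)
# Reference shares chosen for illustration, not real census data.
reference = {"non-disabled": 0.84, "disabled": 0.16}

print(audit_representation(subjects, "mobility", reference))
# → {'disabled': (0.02, 0.16)}
```

A check like this is only a first pass: equal headcounts do not guarantee that a benchmark built on the data behaves equitably, but a glaring gap in who was measured is a cheap and early warning sign.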