Former Google Employee Alleges Bias Risks Across Products
-
A former Google employee alleges that "terrifying patterns" of bias and discrimination appear across Google products such as Search and YouTube. Google disputes these claims.
-
The employee claims that a core algorithm powers many Google products, so a problem in one place can spread to others. Google denies that a single underlying algorithm exists.
-
The employee suggests that Google's efforts to improve diversity, such as skin tone palettes, may have unintended consequences if implemented poorly.
-
Bias can enter AI systems when training data underrepresents certain groups, which can cause a model to "hallucinate," or fabricate information, about them. Auditing such massive datasets for these gaps is difficult.
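The underrepresentation problem described above can be illustrated with a toy sketch. All data here is synthetic and assumed (it is not Google's data, and this shows skewed classification accuracy rather than hallucination): a simple classifier trained mostly on one group learns a decision boundary tuned to that group and performs measurably worse on the underrepresented one.

```python
import random

random.seed(0)

# Assumed synthetic data: one numeric feature, two classes, two demographic
# groups. Group B is badly underrepresented in training, and its feature
# distribution is shifted relative to group A's.
train = (
    [(random.gauss(0.0, 1.0), 0) for _ in range(500)]    # group A, class 0
    + [(random.gauss(3.0, 1.0), 1) for _ in range(500)]  # group A, class 1
    + [(random.gauss(1.0, 1.0), 0) for _ in range(5)]    # group B, class 0
    + [(random.gauss(4.0, 1.0), 1) for _ in range(5)]    # group B, class 1
)

# Toy classifier: threshold at the midpoint of the two class means.
# Because group A dominates the data, the threshold reflects A's distribution.
mean0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
mean1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
threshold = (mean0 + mean1) / 2

def group_accuracy(shift, n=2000):
    """Accuracy on a test set whose feature distribution is shifted by `shift`."""
    test = [(random.gauss(shift, 1.0), 0) for _ in range(n)] + [
        (random.gauss(shift + 3.0, 1.0), 1) for _ in range(n)
    ]
    return sum((x > threshold) == bool(y) for x, y in test) / len(test)

acc_a = group_accuracy(0.0)  # group A matches the training distribution
acc_b = group_accuracy(1.0)  # group B is shifted, like its sparse training data
print(f"group A accuracy: {acc_a:.3f}")
print(f"group B accuracy: {acc_b:.3f}")
```

The model is not told which group a point belongs to; the gap emerges purely because group B contributed too little data to shape the learned threshold.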
-
The employee says Google may use "post-processing fairness," also described as "fairness gerrymandering," to edit AI outputs after the fact, making products appear less biased than they actually are.
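The article does not describe how Google's alleged mechanism works, but in the machine-learning fairness literature "post-processing" usually means adjusting a model's outputs after training rather than fixing the model itself. A minimal sketch of one common form, using made-up scores: separate per-group decision cutoffs chosen so both groups are approved at the same rate.

```python
# Assumed model scores per demographic group (hypothetical numbers).
scores = {
    "A": [0.9, 0.8, 0.7, 0.4, 0.3, 0.2],
    "B": [0.6, 0.45, 0.4, 0.3, 0.2, 0.1],
}

# With one global cutoff, group A receives three positive decisions
# while group B receives only one.
GLOBAL_CUTOFF = 0.5
raw_positives = {g: sum(s >= GLOBAL_CUTOFF for s in v) for g, v in scores.items()}

def group_cutoff(vals, rate):
    """Pick a per-group cutoff so that `rate` of the group scores at or above it."""
    k = round(rate * len(vals))
    return sorted(vals, reverse=True)[k - 1] if k else float("inf")

# Post-processing step: a separate cutoff per group so both groups are
# approved at the same rate, equalizing outcomes by construction without
# changing the underlying model or its scores.
TARGET_RATE = 0.5
cutoffs = {g: group_cutoff(v, TARGET_RATE) for g, v in scores.items()}
adjusted_positives = {g: sum(s >= cutoffs[g] for s in v) for g, v in scores.items()}

print("global cutoff:", raw_positives)           # {'A': 3, 'B': 1}
print("per-group cutoffs:", adjusted_positives)  # {'A': 3, 'B': 3}
```

This makes the headline numbers look equal while leaving whatever produced the skewed scores untouched, which is the kind of surface-level fix the employee's allegation points at.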