Machine-learning software didn’t just mirror those biases; it amplified them

From “Machines Learn a Biased View of Women” | WIRED

…[That got] Ordóñez wondering whether he and other researchers were unconsciously injecting biases into their software. So he teamed up with colleagues to test two large collections of labeled photos used to “train” image-recognition software.

Their results are illuminating. Two prominent research-image collections—including one supported by Microsoft and Facebook—display a predictable gender bias in their depiction of activities such as cooking and sports. Images of shopping and washing are linked to women, for example, while coaching and shooting are tied to men.

Machine-learning software trained on the datasets didn’t just mirror those biases; it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
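The amplification effect described here can be made concrete. One common way to quantify it, roughly in the spirit of the bias-amplification measure used in this line of research, is to compare how strongly an activity co-occurs with a gender label in the training annotations versus in the trained model’s predictions. The sketch below illustrates only that comparison; the activities, gender labels, and counts are invented for the example and are not the study’s data.

```python
# Minimal sketch of measuring bias amplification: compare the
# activity-gender association in training labels with the same
# association in a model's predictions. All data here is invented.
from collections import Counter

def bias_toward_women(labels):
    """Fraction of each activity's examples labeled 'woman'.

    `labels` is an iterable of (activity, gender) pairs.
    """
    totals, women = Counter(), Counter()
    for activity, gender in labels:
        totals[activity] += 1
        if gender == "woman":
            women[activity] += 1
    return {activity: women[activity] / totals[activity] for activity in totals}

# Hypothetical training annotations: 'cooking' is 66% women in the data.
train = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34

# Hypothetical predictions from a model trained on that data: it tags
# 84% of cooking images as women, a stronger association than the data.
predicted = [("cooking", "woman")] * 84 + [("cooking", "man")] * 16

train_bias = bias_toward_women(train)
pred_bias = bias_toward_women(predicted)

for activity in train_bias:
    amplification = pred_bias[activity] - train_bias[activity]
    print(f"{activity}: data bias {train_bias[activity]:.2f}, "
          f"model bias {pred_bias[activity]:.2f}, "
          f"amplification {amplification:+.2f}")
```

Under these invented numbers, a 66 percent association in the training data becomes an 84 percent association in the predictions; a positive amplification score is exactly the “even stronger association” the quote describes.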

Bias in artificial general intelligence may lead to catastrophic outcomes, but even bias in “weak AI”, designed merely to assist and augment human intelligence, poses a significant risk.

The worldview of humans augmented by such biased systems might be more distorted than ever.