Bad programming causes self-driving cars to not see black people

This is a big problem for the fledgling autonomous-vehicle industry: not being able to avoid hitting black people. A report by researchers at the Georgia Institute of Technology found that state-of-the-art detection systems, such as the sensors and cameras used in self-driving cars, are better at detecting people with lighter skin tones, making them less likely to spot black pedestrians and stop before crashing into them.

The researchers launched the study after observing higher error rates for certain demographics in such systems. Their tests on eight image-recognition systems confirmed the bias: accuracy was 5% lower on average for people with darker skin. The scientists divided a large pool of pedestrian images into lighter- and darker-skinned groups using the Fitzpatrick scale, a standard way of classifying skin tone. Even when they accounted for the time of day or for objects partially obstructing the detectors' view, the disparity remained.
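To illustrate the kind of comparison involved, here is a minimal Python sketch of how detection accuracy might be measured across Fitzpatrick skin-tone groups. The data and field names are hypothetical; this is not the researchers' actual evaluation pipeline, which tested eight published detection models.

```python
# A minimal, hypothetical sketch of the comparison described above;
# not the Georgia Tech researchers' actual evaluation code.
from collections import defaultdict

# Each record is one annotated pedestrian: a Fitzpatrick skin type (1-6)
# and whether the detector successfully found them in the image.
annotations = [
    {"fitzpatrick": 2, "detected": True},
    {"fitzpatrick": 5, "detected": False},
    {"fitzpatrick": 1, "detected": True},
    {"fitzpatrick": 6, "detected": True},
    # ... in practice, thousands of labeled pedestrians
]

def recall_by_skin_tone(records):
    """Detection recall for lighter (types 1-3) vs darker (types 4-6) skin."""
    found = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        group = "lighter" if r["fitzpatrick"] <= 3 else "darker"
        total[group] += 1
        found[group] += int(r["detected"])
    return {group: found[group] / total[group] for group in total}

print(recall_by_skin_tone(annotations))
# A persistent gap between the two groups, across times of day and
# levels of occlusion, is the disparity the study reports.
```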

This is not necessarily intentional, or so one would hope; it more likely reflects the unconscious assumptions of the software engineers, who are primarily white and Asian. The error can be corrected with much more study and the inclusion of more dark-skinned people, both in the research and in the images these systems learn from.

The author publishes op-ed columns in various global media houses and has written numerous books.