I was under the impression that commercial self-driving software was deeply proprietary and confidential, and there is no way to know that this study will generalize if run on state of the art detectors. Tesla and Cruise are name-checked in the article - how do we know this isn’t a problem they have worked extensively on and made great improvements to, relative to the open source components?
Feels like a case of outrage-for-clicks.
The BI article is definitely outrage for clicks. I wouldn't be surprised if the actual journal article was more measured in its conclusions and this is just typical bad science reporting.
> how do we know this isn’t a problem they have worked extensively on and made great improvements to, relative to the open source components?
They are a private, for-profit entity with a strong incentive to mislead people about their products. I see no reason to assume they've addressed this issue.
Which isn't to say the answer can't turn out to be "never": I remember studies in my own childhood that said human drivers were also bad at recognising how far away children were, and I've never heard of human perception of skin colour being tested in this way, so it might just turn out that melanin is unfortunately good camouflage against tarmac…
…but unless and until that suggestion turns out to be correct for all humans, I default to assuming we're an existence proof of the capability, and that means I still wouldn't say "never" to sufficiently advanced AI doing at least as well.
Like 99% of these “AI discrimination” articles.
>human-detecting AI is developed in a western country with ~60% white population. Most of the training data is collected there
>the AI performed slightly worse in Uttar Pradesh, where the people and everything else in the background look different
>AI is prejudiced! Get outraged!
Every time.
What is wrong with this statement in your opinion?
I am a big fan of Scandinavian-style pedestrian safety reflectors. Attach one to your bag or jacket if you are walking late at night; it might save your life. But if you don't have a reflector, wear at least one piece of bright, light-colored clothing; this is particularly important if your skin color is dark!
There's a saying: racists are people who think about race, talk about race, and act based upon race.
> The detection systems were 19.67% more likely to detect adults than children, and 7.52% more likely to detect people with lighter skin tones than people with darker skin tones, according to the study.
While they all had a harder time with children than with adults, that 7.52% figure comes from averaging 2 algorithms that performed abysmally with 6 that had no statistically significant differences.
The conclusion is kind of weird: apparently their "findings reveal significant bias in the current pedestrian detectors" despite the bias being almost entirely within the single-pass general object detectors. And where it's statistically significant in the other models, the miss rate is low in both cases, and the effect is reversed! (Dry-weather Cascade-RCNN does better on dark-skin than light-skin, among others.)
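A toy illustration of that pooling problem, with made-up numbers rather than anything from the paper: six detectors with essentially no gap plus two outliers still produce a sizeable average gap.

```python
# Hypothetical per-detector miss rates (NOT the paper's numbers), just to show
# how a pooled average lets two bad outliers dominate the headline figure.
miss_light = [0.05, 0.06, 0.05, 0.04, 0.06, 0.05, 0.20, 0.25]  # lighter skin tones
miss_dark  = [0.05, 0.05, 0.06, 0.04, 0.05, 0.06, 0.45, 0.55]  # darker skin tones

gaps = [d - l for l, d in zip(miss_light, miss_dark)]
print("per-detector gaps:", [round(g, 2) for g in gaps])
print("pooled average gap:", round(sum(gaps) / len(gaps), 3))
# Six detectors contribute roughly zero each, yet the pooled average reports a
# sizeable overall difference driven entirely by the last two detectors.
```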
RE: the 28% miss rate, I think this is meaningless as it's looking at single images/data points, while self-driving cars get a continuous stream of data
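To put rough numbers on that (the 30 fps frame rate and the independence assumption below are mine, not from the article): a per-image miss rate shrinks very fast once you look at a whole burst of frames.

```python
# Sketch: per-image miss rate vs. missing a pedestrian in every frame of a burst.
# Assumes misses are independent across frames, which is surely optimistic --
# failures on consecutive frames of the same scene are correlated in practice.
per_frame_miss = 0.28   # the 28% figure quoted above
fps = 30                # assumed camera frame rate
seconds = 1.0           # a short window in which the car must react

frames = int(fps * seconds)
miss_whole_window = per_frame_miss ** frames
print(f"P(missed in all {frames} frames): {miss_whole_window:.2e}")
# ~3e-17 under independence; the real risk depends on how correlated the
# per-frame failures are (fog, backlighting, occlusion affect every frame).
```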
There is no telling whether these results are valid or applicable at all, but they purport that there is statistically significant unfairness based on gender and skin color. At best, this feels misleading.
What is it that makes it so hard for all these algorithms to work on people with darker skin? This has been an issue for more than ten years, surely someone has started adding various skin colors into the training data. Is it a case of lack of training material, or is it just faster to focus on one skin type?
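One standard mitigation (sketched below with invented labels and proportions; I have no idea what any vendor actually does) is to reweight or oversample the underrepresented group so the model sees both groups roughly equally often during training.

```python
import random

# Toy rebalancing sketch. The dataset, labels and 900/100 split are invented
# purely for illustration -- real datasets annotate skin tone (if at all) with
# something like the Fitzpatrick scale.
dataset = [{"img": f"a_{i}.jpg", "skin_tone": "light"} for i in range(900)] + \
          [{"img": f"b_{i}.jpg", "skin_tone": "dark"} for i in range(100)]

# Weight each sample inversely to its group's frequency.
counts = {}
for sample in dataset:
    counts[sample["skin_tone"]] = counts.get(sample["skin_tone"], 0) + 1
weights = [1.0 / counts[sample["skin_tone"]] for sample in dataset]

# Draw one "epoch" worth of samples with those weights.
resampled = random.choices(dataset, weights=weights, k=len(dataset))
print({tone: sum(1 for s in resampled if s["skin_tone"] == tone) for tone in counts})
# Expect roughly 500/500 instead of the raw 900/100.
```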
It is nowhere near small or cheap enough for self-driving car applications, but will be one day.
Another challenge is affordable real-time processing of the data. Churning through 3,200MB/s of phase-history data is expensive but again that will solve itself given time.
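Back-of-envelope on that figure (just arithmetic on the 3,200 MB/s number, nothing vendor-specific):

```python
# What 3,200 MB/s of raw phase-history data adds up to.
rate_mb_per_s = 3200
gb_per_minute = rate_mb_per_s * 60 / 1000
tb_per_hour = rate_mb_per_s * 3600 / 1_000_000
print(f"~{gb_per_minute:.0f} GB/minute, ~{tb_per_hour:.1f} TB/hour")
# ~192 GB/minute, ~11.5 TB/hour -- far too much to store raw, so it has to be
# reduced to detections (or discarded) in real time on the vehicle.
```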
Yes
> Surely someone has started adding various skin colors into the training data.
What has to occur for this to happen:
* Someone has to take the time and effort to measure things, to identify that there is a problem.
* They have to get that message out so that it's heard.
* That message needs to:
* hit the public hard enough that people demand intervention from their elected representatives
* or, alert the company directly, and hope that the incentives align. (Will the company make more money by fixing this?)
There are plenty of easier alternatives:
* Call the problem too hard to solve
* Call it bad science
* Call it ragebait
* Call it woke
* Make up a bunch of equivalences and channel it into inertia:
  * If people are wearing winter coats, then they won't show enough skin for the cars to be racist. And if the cars aren't racist in cold places, then it isn't a problem in warm places.
  * People don't have radar/lidar either, and they're allowed to drive

I've yet to see self-driving cars successfully navigating during bad winter conditions. They can't even avoid killing pedestrians in California.
- edit -
Sorry, I read the article too quickly and assumed it was talking about the countries UK and China. Perhaps they only bothered testing the cars in UK, Silicon Valley and China, Silicon Valley.
From what I can see, a couple of the detectors used really seem shit overall, making the combined data of questionable value.
Conclusion - we call on lawmakers to make this technology illegal. We would rather have more people die at equal rates than fewer people die at unequal rates.
I am not sure I agree with the ethics that underlies this way of seeing the world.
Drunk drivers kill more Americans every few months than terrorists on planes have in the last 25 years. Yet every airline passenger must prove they aren't a terrorist, but no one driving a car faces a default presumption that they are drunk. Unless you've been convicted previously, then maybe sometimes.
I wouldn't be surprised if a better model exists for object detection and we aren't using it to save pennies. Politics and ethics in automobile safety is asinine. Fair point.
I think we have a precedent for that in the testing of drugs. The majority of drugs are primarily tested on white men, meaning that their effects and dosages may be problematic for women or people of color.
There's also the issue of the majority of tools being designed for right-handed people, so anyone left-handed either needs to spend more on tools or accept a certain risk when operating a chainsaw.
The way I read it is something like this...
Some researchers got their hands on software that purports to do similar stuff to what self-driving cars might also do, but crucially isn't the same as what the cars actually use, and then extrapolated the results into the headline-like title of the research paper: "Dark-Skin Individuals Are at More Risk on the Street: Unmasking Fairness Issues of Autonomous Driving Systems". That's justified, isn't it? After all, all software in a category is more or less the same program, and the car company software and their research subject software all run on computers? Right? Must be valid... clearly you can make factual assertions on that kind of extrapolation about computer systems and software.
Then some bright-eyed-bushy-tailed reporter comes along and applies the criticality of the typical college educated/professional journalist, which is to say they carefully considered the headline they could write, but otherwise just took the word of the researchers that something resembling knowledge was actually gained by the study. News is delivered! Job done!
Look, sarcasm aside, could I have read/understood things incorrectly? Sure... I'm not an expert in this field. Could this be a problem in production, used-in-the-real-world pedestrian detection systems? Sure. But as far as I can tell, the best the paper could be telling us is that racial bias in pedestrian detection systems is a viable possibility: not the assertion that "Dark-Skin Individuals Are at More Risk on the Street". It might be true, but I don't think these researchers know that any better than I do. Of course, "Dark-Skin Individuals Could Be at More Risk on the Street" isn't nearly so catchy or attention-grabbing, is it?
And who knows... maybe this research team should pick up the search for low temperature/low pressure super-conductivity... sounds like they have the right temperament.
I had one dark-skinned kid in dark clothes casually crossing the road in front of me during the dark months here where I live.
I didn't have to slam the brakes or anything because he was a bit ahead of me, but it was scary because of how hard he was to detect.
Pedestrians (and cyclists) should wear bright clothing and/or retroreflectors at night! (Ideally both. Retroreflectors are mostly useless if a car has its lights off.)
Or they are simply less visible.
The second line in the article:
>bias towards dark-skin pedestrians increases significantly under scenarios of low contrast and low brightness
However, the article leads with a picture of a Cruise car, which uses lidar. Those systems should, afaik, recognize people with the same accuracy regardless of skin color.
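For what it's worth, the low-contrast effect in the quoted line is easy to illustrate with a toy Weber-contrast calculation (the luminance values below are invented for illustration, not measurements from the study):

```python
# Weber contrast: |L_target - L_background| / L_background.
def weber_contrast(target_luminance, background_luminance):
    return abs(target_luminance - background_luminance) / background_luminance

tarmac = 10.0          # dim background luminance at night (arbitrary units)
lighter_skin = 40.0    # invented values for illustration only
darker_skin = 15.0

print("lighter skin vs tarmac:", weber_contrast(lighter_skin, tarmac))  # 3.0
print("darker skin vs tarmac:", weber_contrast(darker_skin, tarmac))    # 0.5
# A camera-only detector gets far less signal in the second case, whereas lidar
# measures range rather than reflected visible light, so in principle it should
# be much less sensitive to skin tone.
```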