No.16164
> As scientists work to design and test robots and their perceived intelligence, Dr. Joy Buolamwini is dedicated to fighting algorithmic bias in order for AI systems to recognize individuals from marginalized communities.
https://www.allure.com/story/joy-buolamwini-coded-bias-interview
What a traitor. Black women need to take her down.
How is she pretending she's doing a good thing? Is she for real?
No.16166
>>16164
>What a traitor.
What did she betray?
She's a labor aristocrat helping surveillance capitalism track people to get paid.
She made up a bunch of rationalizations to pretend she is doing something virtuous, and that is a little annoying.
No.16170
Yeah, this will be used to identify black women in footage, and so that Teslas don't run them over.
The fact is that datasets in computer vision research are crammed with white people, while industry wants models that generalise across racial categories. Building better datasets that support that is itself an AI problem.
The social problem is that big tech is trusted to make the ethical decisions about where to draw the boundaries around harmful opinions, racial bias and privacy in the first place, and then hands those requirements to engineers. Someone has to tune the algorithm design and decide exactly how much racial bias is acceptable and necessary. The fact that these decisions are made by companies motivated by profit, then obfuscated and hidden from the public in proprietary software, and then put into the justice system, CCTV cameras and social networks is the problem here.
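The dataset skew described above can be shown with back-of-the-envelope numbers (all figures below are made up for illustration, not from any real benchmark): averaging accuracy over an unbalanced evaluation set makes a large per-group error gap almost invisible in the headline number.

```python
# Hypothetical per-group error rates and test-set sizes. The numbers
# are illustrative only -- the point is the arithmetic, not the values.
groups = {
    "majority": {"n": 900, "error_rate": 0.01},
    "minority": {"n": 100, "error_rate": 0.15},
}

total = sum(g["n"] for g in groups.values())
errors = sum(g["n"] * g["error_rate"] for g in groups.values())
overall_error = errors / total

# Overall error looks small (2.4%) even though the minority group's
# error rate is 15x the majority's: averaging over a skewed test set
# hides exactly the disparity the post is talking about.
print(f"overall error: {overall_error:.3f}")
for name, g in groups.items():
    print(name, g["error_rate"])
```

This is why per-group evaluation, not just a single accuracy figure, is what audits like Buolamwini's push for.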
No.16172
I wonder how much more difficult the masks have made such technology to use.
No.16173
>>16172
The reason AI beats the human baseline on ImageNet is that it can distinguish dog breeds by the texture of their fur. It's not a problem.
No.16174
>>16172
>>16173
During the Hong Kong protests there was a lot of mask wearing, but it didn't help them. It's mainly humans who rely so much on facial features, because that's how we communicate. A computer vision model only has to be trained on datasets with masks and it will encode subtle features we don't notice.
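A minimal sketch of the mask-augmentation idea, assuming face crops as numpy RGB arrays. The `add_synthetic_mask` function and the mask colour are made up for illustration; real pipelines warp an actual mask texture onto facial landmarks, but the training principle is the same: occlude the mouth/nose region so the model learns from eyes, brow and head shape.

```python
import numpy as np

def add_synthetic_mask(face: np.ndarray) -> np.ndarray:
    """Cover the lower half of a face crop with a flat 'mask' colour.

    Crude stand-in for mask augmentation: occluding the lower face at
    training time forces the model to rely on the features a real mask
    leaves visible.
    """
    h, w, _ = face.shape
    out = face.copy()
    out[h // 2:, :] = (200, 200, 210)  # pale grey, like a surgical mask
    return out

# Toy 4x4 RGB "face": the augmented copy keeps the top half and masks
# the bottom half; the original array is left untouched.
face = np.zeros((4, 4, 3), dtype=np.uint8)
masked = add_synthetic_mask(face)
print(masked[0, 0], masked[3, 0])  # [0 0 0] [200 200 210]
```

Training on a mix of original and masked copies is the cheap version of "trained on datasets with masks" from the post above.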
No.16328
>>16175
>gait analysis
Put something in your shoe, or wear a heavy belt (even a homemade one), or weigh your ankles down.
There are many ways to avoid this.