[ home / rules / faq ] [ overboard / sfw / alt ] [ leftypol / siberia / hobby / tech / edu / games / anime / music / draw / AKM ] [ meta / roulette ] [ cytube / git ] [ GET / ref / marx / booru / zine ]

/tech/ - Technology

"Technology reveals the active relation of man to nature" - Karl Marx


File: 1659607582389.jpg (43.9 KB, 569x320, b.jpg)


> As scientists work to design and test robots and their perceived intelligence, Dr. Joy Buolamwini is dedicated to fighting algorithmic bias in order for AI systems to recognize individuals from marginalized communities.

What a traitor. Black women need to take her down.
How is she pretending she's doing a good thing? Is she for real?


>What a traitor.
what did she betray ?
She's a labor aristocrat helping surveillance capitalism track people so she can get paid.
She made up a bunch of rationalizations to pretend she is doing something virtuous, and that is a little annoying.


Yeah this will be used to identify black women in footage, and so that Teslas don't run them over.

The fact is datasets in computer vision research are crammed with white people, and for industry they want to produce models that generalise across racial categories. It's an AI problem to create better datasets that do that.

The social problem is that big tech is trusted to make the ethical decisions about where to draw the boundaries around harmful opinions, racial bias and privacy in the first place, and then hands those requirements to engineers. Someone has to tune the algorithm design and decide exactly how much racial bias is acceptable and necessary. These decisions are made by companies motivated by profit, obfuscated and hidden from the public in proprietary software, and then put into the justice system, CCTV cameras and social networks. That is the problem here.
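To make it concrete what "deciding how much bias is acceptable" looks like in practice: somewhere an engineer computes a per-group error metric and management decides if the gap ships. A minimal sketch with made-up audit records (the groups, data, and threshold are all hypothetical, not any real company's process):

```python
# Hypothetical bias audit: per-group false positive rates for a
# face-matching system, on made-up (group, predicted, actual) records.
from collections import defaultdict

records = [
    # (group, predicted_match, actual_match) -- fabricated toy data
    ("A", True,  False), ("A", False, False), ("A", False, False),
    ("B", True,  False), ("B", True,  False), ("B", False, False),
]

def false_positive_rates(rows):
    fp = defaultdict(int)   # false "matches" per group
    neg = defaultdict(int)  # actual non-matches per group
    for group, pred, actual in rows:
        if not actual:
            neg[group] += 1
            if pred:
                fp[group] += 1
    return {g: fp[g] / neg[g] for g in neg}

rates = false_positive_rates(records)
# Here group B is falsely "matched" twice as often as group A --
# and whether that gap is "acceptable" is exactly the decision
# that happens behind closed doors.
print(rates)
```

The numbers are trivial to compute; the point is that the threshold for "acceptable" is a policy choice made in private, not a technical constant.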


File: 1659713851681.jpg (109.25 KB, 659x659, pottery.jpg)

>face recognition AI has a bias because AI researchers are racist and/or the face samples available to them are biased
>bougie pigs malding that the minority groups most likely to be poor are hardest to identify because their enforcers (human or AI) literally can't tell them apart
how poetic


I wonder how much more difficult the masks have made such technology to use.


The reason AI beats the human baseline on ImageNet is that it can distinguish dog breeds by the texture of their fur. It's not a problem.


During the Hong Kong protests there was a lot of mask wearing, but it didn't help them. It's mainly humans who rely so much on facial features, because that's how we communicate. A computer vision model only has to be trained on datasets with masks and it will encode subtle features we don't notice.
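The point above can be shown with a toy sketch (not a real face model): if the training samples are masked, the model simply learns to weight whatever stays visible. The "faces" here are made-up 4-number vectors, and the nearest-centroid "classifier" is a stand-in for a real network:

```python
# Toy illustration: a "face" is [eye1, eye2, mouth1, mouth2].
# A mask hides the lower half, so mouth features carry no signal.

def mask(face):
    return face[:2] + [0.0, 0.0]

train = {
    "alice": [[0.9, 0.1, 0.5, 0.4], [0.8, 0.2, 0.6, 0.3]],
    "bob":   [[0.1, 0.9, 0.5, 0.4], [0.2, 0.8, 0.4, 0.5]],
}

# "Train" on masked samples: per-person mean of the masked vectors.
centroids = {
    name: [sum(col) / len(col) for col in zip(*map(mask, faces))]
    for name, faces in train.items()
}

def identify(face):
    # Nearest centroid by squared distance on the masked input.
    masked = mask(face)
    return min(centroids, key=lambda n: sum(
        (a - b) ** 2 for a, b in zip(centroids[n], masked)))

# A new masked photo of alice is still recognised from the upper face.
print(identify([0.85, 0.15, 0.0, 0.0]))  # alice
```

Same idea at scale: retrain on masked data and the model shifts its attention to the eye region, so the mask buys much less anonymity than it feels like it should.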


Unique IPs: 7
