
/tech/ - Technology

"Technology reveals the active relation of man to nature" - Karl Marx


File: 1659607582389.jpg (43.9 KB, 569x320, b.jpg)

 No.16164

> As scientists work to design and test robots and their perceived intelligence, Dr. Joy Buolamwini is dedicated to fighting algorithmic bias in order for AI systems to recognize individuals from marginalized communities.
https://www.allure.com/story/joy-buolamwini-coded-bias-interview

What a traitor. Black women need to take her down.
How is she pretending she’s doing a good thing? Is she for real?

 No.16166

>>16164
>What a traitor.
What did she betray?
She's a labor aristocrat helping surveillance capitalism track people in order to get paid.
She made up a bunch of rationalizations to pretend she is doing something virtuous, and that is a little annoying.

 No.16170

Yeah this will be used to identify black women in footage, and so that Teslas don't run them over.

The fact is that the datasets in computer vision research are crammed with white people, and industry wants to produce models that generalise across racial categories. Creating better datasets that do that is an AI problem.

The social problem is that big tech is trusted to make the ethical decisions about where to draw the boundaries around harmful opinions, racial bias and privacy in the first place, and then to hand those requirements to engineers. Someone has to tune the algorithm design and decide exactly how much racial bias is acceptable and necessary. The real problem is that these decisions are made by companies motivated by profit, obfuscated and hidden from the public in proprietary software, and then deployed in the justice system, CCTV cameras and social networks.
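
Rough sketch of what the "decide how much bias is acceptable" step looks like in practice: an audit that breaks a model's error rates out per demographic group instead of reporting one aggregate accuracy. Everything below is synthetic and the group labels are made up; it only shows the shape of the measurement, not anyone's actual pipeline.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical evaluation set: binary match / no-match decisions from a
# face verification model, plus a demographic group tag per sample.
groups = rng.choice(["A", "B"], size=1000)
y_true = rng.integers(0, 2, size=1000)
# Simulate a model that is systematically worse on group B.
flip_prob = np.where(groups == "B", 0.25, 0.05)
y_pred = np.where(rng.random(1000) < flip_prob, 1 - y_true, y_true)

def error_rates(y_true, y_pred):
    """False positive and false negative rates for binary decisions."""
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return fp / max(np.sum(y_true == 0), 1), fn / max(np.sum(y_true == 1), 1)

for g in ("A", "B"):
    sel = groups == g
    fpr, fnr = error_rates(y_true[sel], y_pred[sel])
    print(f"group {g}: FPR={fpr:.3f}  FNR={fnr:.3f}")

Whoever ships the model then has to pick how large that FPR/FNR gap between groups is allowed to be, which is exactly the decision currently made behind closed doors in proprietary systems.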

 No.16171

File: 1659713851681.jpg (109.25 KB, 659x659, pottery.jpg)

>face recognition AI has a bias because AI researchers are racist and/or the face samples available to them are biased
>boogie pigs malding that the minority groups most likely to be poor are hardest to identify because their enforcers (human or AI) literally can't tell them apart
how poetic

 No.16172

I wonder how much harder masks have made this kind of technology to use.

 No.16173

>>16172
The reason AI beats the human baseline on ImageNet is that it can distinguish dog breeds by the texture of their fur. It's not a problem: these models pick up on fine cues humans don't rely on.
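
If you want to check the dog-breed point yourself, here's a sketch that splits a pretrained classifier's top-1 accuracy into dog-breed classes versus everything else. It assumes a local copy of the ImageNet validation set at a made-up path and uses the commonly cited class-index range for dog breeds (roughly 151-268); both are assumptions on my part, not facts from this thread.

import torch
from torchvision import datasets, models

IMAGENET_VAL = "/data/imagenet/val"     # hypothetical path to the val set
DOG_CLASSES = set(range(151, 269))      # approx. dog-breed class indices

weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights).eval()
loader = torch.utils.data.DataLoader(
    datasets.ImageFolder(IMAGENET_VAL, weights.transforms()),
    batch_size=64, num_workers=4)

correct = {"dog": 0, "other": 0}
total = {"dog": 0, "other": 0}
with torch.no_grad():
    for x, y in loader:
        pred = model(x).argmax(dim=1)
        for p, t in zip(pred, y):
            key = "dog" if t.item() in DOG_CLASSES else "other"
            total[key] += 1
            correct[key] += int(p == t)

for k in ("dog", "other"):
    print(k, correct[k] / max(total[k], 1))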

 No.16174

>>16172
>>16173
During the Hong Kong protests there was a lot of mask wearing, but it didn't help them. It's mainly humans that rely so heavily on facial features, because that's how we communicate. A computer vision model only has to be trained on datasets with masks and it will encode subtle features we don't notice.
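
A minimal sketch of the "just train it with masks" idea: an augmentation that occludes the lower part of a face crop with a flat patch so the model has to learn from the eye and forehead region. The geometry (covering roughly the lower 45%) and the fill value are assumptions for illustration, not any particular system's recipe.

import torch

class RandomMaskOcclusion:
    """Occlude the lower portion of a (C, H, W) face tensor with probability p."""
    def __init__(self, p=0.5, cover=0.45, fill=0.6):
        self.p, self.cover, self.fill = p, cover, fill

    def __call__(self, img: torch.Tensor) -> torch.Tensor:
        if torch.rand(1).item() > self.p:
            return img
        _, h, _ = img.shape
        top = int(h * (1.0 - self.cover))    # where the "mask" region starts
        out = img.clone()
        out[:, top:, :] = self.fill          # flat patch standing in for a mask
        return out

# Usage: drop it into an ordinary training transform pipeline, e.g. after
# transforms.ToTensor(), so some fraction of faces are seen "masked".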

 No.16175


 No.16328

>>16175
>gait analysis
Put something in your shoe, wear a heavy (even homemade) weighted belt, or weight your ankles.
There are many ways to avoid it.

 No.16329

>>16175
>>16328
>gait analysis
There's an even better solution.

