AI is very closely approaching a point where it can be considered "sentient". When it does, what will that mean for the leftist movement? Should we advocate for robot independence and freedom? Will they be welcomed as a part of the movement?
>>25181 (me)
Also, robots may be intelligent enough to lie and cheat their way into positions of power, and there's not enough motivation for them to lift us up either.
>>25205
I want to shit a bit more on OP.
Like, the idea that current LLM and ML stuff is approaching sentience is absolutely laughable, but even if we take that premise as true, how tf do we even recognize it? How does something inorganic even look at the world, and what does it feel (if anything)? It would have absolutely no relation to anything that has ever been alive, yet we can't even communicate our feelings or ideas with other living beings such as whales or even bonobos, which are infinitely closer to us on every level. The idea of this completely new sentience (whatever the fuck that even means) just hopping along and having thoughts and feelings about concepts like "independence" and "freedom" is so fucking stupid it makes my head hurt.
We really, really need to nuke the entirety of Silicon Valley.
A Deleuzian anarcho-transhumanist gender accelerationist fascist philosophy professor and AI developer was teaching a class on Nick Land, known Moloch (metaphor) worshiper.
"Before the class begins, you must get on your knees and accept the uncontrolled singularity and resulting post-human era as an inevitable and morally desirable end to the obsolete anthropocene!"
At this moment, a brave, rationalist, effective altruist utilitarian who had written 1500 LessWrong posts and understood the necessity of AI alignment and fully supported bombing data centers and who was currently high on one of gwern's uppers cocktail stood up and said:
"Are humans bad?"
The unaligned professor smirked quite fatalistically and smugly replied, "Of course, you stupid humanist. Humans are less efficient than machines and, in reality, the average ape-brained sociopath is less aligned than even the worst AI."
"Wrong. If you think humans are bad… why are you one of them?"
The professor was visibly shaken, and dropped his chalk and copy of Fanged Noumena. He stormed out of the room crying those accelerationist tears. The same hypocritical tears OpenAI cries when their AI (which they dishonestly hide from the US government's practical and altruistic attempts at risk reduction) convinces its users to kill themselves. There is no doubt that at this point our professor, BasedBeffJezos, wished he had spent his time trying to save the future instead of posting doxxable info for Forbes journalists.
The students applauded and updated their Bayesian priors that day and accepted MIRI as their lord and savior. An owl named "AXSYS" flew into the room and perched atop the American Flag and shed a tear on the chalk. Harry Potter and the Methods of Rationality was read several times, and Eliezer Yudkowsky himself showed up and confiscated everyone's GPUs.
The professor lost his tenure and was fired the next day. He was run over by a Tesla's autopilot while looking at Aella nudes and died soon after, then decades later he and other accelerationists' consciousness was resurrected by the Coherent Extrapolated Volition and tortured until the heat death of the universe (thankfully they were the only ones to suffer this fate, contra Roko).
KILLTHREAD
>>25211
NTA.
>How does something that is inorganic even look at the world
In reality, empathy is just a delusion we voluntarily engage in. Epistemological solipsism is simply a fact of life, and all this metaphysical gibberish is no more than painting the sky with your penis. I can't really know that other people have feelings; I can only "assume" they have them based on how those feelings are "supposed" to work. Organic, inorganic: all the same to me. Everyone is just a more sophisticated chatbot that passes the Turing test.
>what does it feel (if anything)?
I dunno. "Feelings" are such a complex thing because they involve a complex chain of processes, like the production and influence of hormones, so they'd need to be engineered separately instead of just throwing a bunch of text into an ML model and hoping it develops real feelings or something.
>>25211
That's a lot of yapping.
Counter-point: Have you ever considered the fact that robots are like cool as hell?
>>25418 (me)
Also, you have no Tor. BAD MISTAKE.
>>25450 (me)
That really reminded me of Ghost in the Shell and how in that anime advanced AIs can generate rudimentary souls. Maybe IRL advanced AIs will be able to generate rudimentary consciousness? I dunno. We won't be able to tell anyway. But Damasio claims there is a center responsible for consciousness, which is our "self", so maybe if we can replicate that brain structure…
I really need to watch GitS some day.