>>2538535
>The use of commercial AI has allowed many people to bring their fantasies to reality through the use of "AI Slop."
It didn't "allow" anything. It just made the "what if what I make isn't good enough" hurdle weirder by giving an instant option that will leave them unsatisfied, yet further paralyzed to take any other route of implementation, because the AI can emulate the things people consider their own beginner art to lack: rendering, backgrounds, greebles, etc. The effort barrier is out of whack whenever it's an option.
>This has created a divide in the artistic sphere by sparking debates over AI art and whether it should be considered art.
On twitter, maybe. No one really debates that anymore. Art is just a form of communication, and AI is just a very vague and indirect one that doesn't actually need a sender. That raises a problem for the internet, a massive communication machine: senderless, and thus incommunicative, "communication" is better at search engine optimization. So building the internet around search engines, and around supermassive, unmoderatable platforms that curate on the same principle as search engines, was a mistake, and we'll need to adapt to an internet that has neither.
>AI has made accessing knowledge more convenient
If you have to do the same research to vet it anyway, then it's just a pointless extra step. All it did was make "it came to me in a dream" socially acceptable among twitterites.
>Not to mention that AI systems can be of great importance to future socialist experiments, as we know that the quantitative leaps produced by AI can create qualitative changes.
This paragraph does not convey information.
That said, machine learning has always been useful and will continue to be; there's a reason the USSR dabbled in it. "AI," however, is a marketing term, and there's no place for commercial advertisement under socialism. So no, AI won't survive the bubble.
—
Since machine learning can be used for weather forecasting (https://www.nature.com/articles/s41586-024-08252-9), maybe it could be useful for brain-to-computer interfaces
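The model in that Nature paper (GenCast) is a large generative system trained on global reanalysis data, but the underlying idea of forecasting is just fitting a model to past measurements and predicting the next step. A minimal toy sketch of that idea, assuming synthetic temperature data and plain linear regression (nothing here is taken from the paper itself):

# Toy illustration only: predict tomorrow's temperature from the last week.
# Synthetic seasonal data + noise stands in for real observations.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
days = np.arange(200)
temps = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 1.5, days.size)

window = 7  # use the previous 7 days as features
X = np.array([temps[i:i + window] for i in range(len(temps) - window)])
y = temps[window:]

model = LinearRegression().fit(X[:-30], y[:-30])  # hold out the last 30 days
print("held-out MAE:", np.mean(np.abs(model.predict(X[-30:]) - y[-30:])))
print("next-day forecast:", model.predict(temps[-window:].reshape(1, -1))[0])

Swap the synthetic series for whatever signal you care about (weather fields, EEG channels, etc.) and the loop stays the same; the hard part is the data and the model class, not the recipe.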