Dead Internet Theory
Is the internet still real? Are we real? Are reddit, Youtube and Tiktok... real?
While I wouldn't go so far as to say that there are no real people on the internet, for some years now we have been rapidly approaching a potential event horizon in the mismatch between the effort we socially expect content to take and the mechanics by which it can actually be produced. The conspiracy theory version of this is the Dead Internet Theory: the hypothesis that the vast majority of all content on the internet is fake, produced by autonomous, non-sentient systems based on Markov chains and (in the more recent incarnations) Large Language Models.
There are a few variants of this, of course: email spam has been a thing forever, but that doesn't mean the internet is dead; it means a huge amount of traffic is just useless and being blackholed in the competition between systems.
The one I worry about is the supplanting of the zeitgeist - the idea that there's a feel, vibe or overall sentiment to the present moment or discourse which pervades our social interactions. The question, quite rightly, is: is that real? While any given person will happily say "don't believe what you see on the internet", they'll then turn around and believe exactly what they see on the internet, and even if they don't, their media sources will.
How much of what we see and believe is being influenced by the broad strokes of autonomous content systems, with or without human directors at the helm?
The part that got me thinking about this is the current "vibe" driver of the internet today: Tiktok. You've seen Tiktok videos, and people reacting to Tiktok videos, even if you haven't ever actually used Tiktok. It's pervasive, and it's everywhere. But... how much of it has to be real?
Because when you get right down to it, the classic Tiktok video is essentially: a person dancing, text pops up on screen with some sort of message... and that's it. That's how you construct something that is either meaningless, funny or ragebait. But... does any of that have to be real? How would we know if it's not? There must be tens of thousands of hours of this exact content that gets almost no views. OpenCV and other software can separate out the text components. Software like FaceSwap can replace the faces of the people involved with people who do not exist. The amateur quality of these videos is treated as a proxy for "effort to fake", and the default assumption is "it's not worth it": and yet the pipeline is accessible to a hobbyist, let alone anyone willing to front money and hire full-time employees to do this.
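To put some weight behind "accessible to a hobbyist", here's a minimal sketch of just the first stage of such a pipeline: using stock OpenCV to find and scrub the overlay text in a frame so a different message could be composited in. This is an illustrative sketch only; the sample filename and the thresholds are assumptions, not a tested recipe.

```python
# Illustrative sketch: isolate the high-contrast overlay text in one exported
# video frame with stock OpenCV, then inpaint it away so a new message could
# be composited in. "frame.png" and every threshold here are assumptions.
import cv2
import numpy as np

frame = cv2.imread("frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Overlay text is usually near-white and sharp-edged; a hard threshold plus a
# wide dilation joins the letters into word-sized blobs.
_, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
mask = cv2.dilate(mask, np.ones((5, 15), np.uint8), iterations=2)

# Keep only blobs with roughly text-like shapes, then remove them.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
text_mask = np.zeros_like(mask)
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w > 2 * h and w > 40:  # wide, word-shaped regions only
        text_mask[y:y + h, x:x + w] = 255

cleaned = cv2.inpaint(frame, text_mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite("frame_no_text.png", cleaned)
```

None of this is state of the art; every line is a documented, decade-old computer vision operation. The face side of the pipeline is just as commoditized, which is rather the point.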
There is a potential video pipeline here which would be almost undetectable: not state of the art, not the advanced text-to-video services, just basic, regular bits of off-the-shelf software. Splice that together and how productive could you be in messaging manipulation, in shifting the average person's view of what they think the "average" is? Particularly when a service like Tiktok has been overtly designed to dark-pattern its users?
In short: I feel like there's a baked-in assumption that "big corporations and governments are slow and don't move with the times", and it's being used to underestimate the speed of technical change and adoption now governing our social media and internet landscape. If fake, manipulative content pipelines haven't fully spun up yet, they will soon. Headline developments like "AI can't do hands right" give people undue confidence in their ability to spot fake content, while plausible pipelines could exist today that would let a moderately funded actor manipulate public opinion on a vast scale.
Posts
But what's really fucked up is that the vast majority of YouTube is just... nonsense that's eerie and weirdly harmful? I don't have links at the moment but everyone here knows what I'm talking about, I hope - vaguely Stuff For Kids except it's like, AI-generated and thus Very Off and Very Sick. The vast majority of stuff for kids is like, real weird shit.
First of all, someone's named Caroline Busta.
Second of all, there's definitely a sensation of marginalization / alienation of humanity online, but I feel like there's an underreported or under-investigated issue of troll farms just as much as bot farms.
I feel like a bunch of the kids stuff on Netflix is AI generated
Whoa! Is there an Echo in here?
Throw out assumptions with too small a sample set, and start thinking in terms of memetic & stochastic hyper-sets throughout society(s).
The internet isn't 'dead', just vaster than any one person's Overton window can reasonably keep up with in this exponential age of human activity and advancement, IMO.
I think what we're experiencing is simply thermodynamic & chiral constraints pushing up against the current limits of the human psyche/neuro-physical/societal systems. 🧠🔍
And/or: It's a SNR paradox for a nascent hypothetical emerging AGI to deal with at some later point in human history... 🤔💭
Distinct in its capability and speed.
The issue we are facing with LLMs is that they will rapidly be indistinguishable from a human online, and capable of generating a fully convincing, attractive persona to attach their voice to. Right now we aren't quite there, but there seems to be no argument that we won't soon be. This isn't to say that these bots will be truly intelligent, or capable of novel activities, but they will certainly be capable of:
Looking sexy while reviewing a handbag
Telling you Joe Biden once killed their dog
Just asking questions about the fact that Australian Immigrants commit 100x more crimes than other people
Unlike modern content farms etc, which require you to enslave thousands of people in Malaysia, you can create 10,000 narratively coherent AI bots to sell this message in minutes. Each can pump out a post an hour, and soon you've got yourself an astroturfed revolution on your hands.
Unlike with real people, where it might take me months to spin up an effective online mob, now I can do it in minutes.
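The scale part is just arithmetic. Here's a rough back-of-the-envelope where the bot count and posting rate come from above, and the human troll farm numbers are made-up assumptions for comparison:

```python
# Back-of-the-envelope: posting volume of an automated persona farm versus a
# paid human troll farm. The human-farm numbers are illustrative assumptions.
bots = 10_000
posts_per_bot_per_hour = 1
bot_posts_per_day = bots * posts_per_bot_per_hour * 24
print(f"bot farm:   {bot_posts_per_day:,} posts/day")    # 240,000 posts/day

humans = 1_000                  # assumed headcount
posts_per_human_shift = 50      # assumed quota per 8-hour shift
human_posts_per_day = humans * posts_per_human_shift
print(f"human farm: {human_posts_per_day:,} posts/day")  # 50,000 posts/day
```

Even with generous numbers for the humans, the operator on the bot side scales by editing a single line instead of hiring anyone.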
Because it's all driven by automated algorithms and you can make money off it, people are constantly looking for ways to automate the creation of sludge to game the algorithm and make that money. And all AI does is make it easier and faster to make sludge, to the point that it's easier to make it than to remove it. At least, as the internet currently functions. It's always been there, though, as long as the algorithm has been a thing.
And it may only get worse:
https://arstechnica.com/ai/2024/03/researchers-create-ai-worms-that-can-spread-from-one-system-to-another/
I think part of it is a difference in delivery system. Limbaugh was only there if you chose to tune in to him. His opinions were spread and laundered to an extent, but I rarely encountered Limbaugh back in the day, and it was the sort of propaganda that wanted you to know where it came from, largely.
Random bullshit on Tiktok finds me even though I've never had Tiktok installed and generally try to avoid it. And it's all reactions to reactions to posts about reactions remixed to some song that sounds vaguely familiar.
And it doesn't feel like the introduction of other media, like film or comics or video games or whatever. It feels specifically designed to seem weird and alien. Like it's not supposed to look like something created by humans.
I don't necessarily think it's all created by bots, but there often seems to be a fundamental lack of sincerity to it. Everyone seems to be chasing the algorithm to a greater extent than whatever trend chasing existed in the past.
Just a regular one.
https://youtu.be/LKp2gikIkD8?si=Lp7mGIOgf3YcRRjp
This covers that topic some, and was created by a human who researches things and makes reasonably well-informed statements on YouTube.
Just as an example, Father Coughlin was mentioned, and there's an interesting paper from a few years back analyzing the link between changes in public policy and the spread of Father Coughlin's radio show across the US. They found that it seemed to have an effect: as his show got into more and more people's homes, it was changing people's opinions and, through that, public policy.
I think you'd be hard pressed to say that the right-wing media system of the last 50 years didn't have an effect either. Or that Youtube hasn't shaped culture. People have said these things about all these new inventions as they've come down the pipe and they haven't exactly been wrong.
I don't think the internet is or will die, but what I think IS happening is that we have already passed the high watermark for the internet as we have come to understand it... and it is now doing something that I think could be fairly described as receding.. although that's not quite what it's doing
but almost any major """"platform"""" is exposed heavily to artificial content in some way.. I am an avid reddit user, and it's pretty remarkable what percentage of the content posted to the larger reddits is just automated reposts of popular content... even if the automation isn't outwardly offensive it still leaves the impression that you are perusing some sort of artificial content mill
I think/hope that there will eventually be an embrace of "old internet" concepts... much like, say, this forum.. which apart from the occasional online viagra offer seems to be unscathed by that kind of junk content.. it doesn't attract it because there's basically no way you can possibly profit from it (yes, reddit bots are for profit... somehow.. jesus...)
the relationship between platforms and quality will come to be known as similar to that of network television and quality... as in.. not very good, but very appealing to a massive amount of people thanks to a formula derived at the speed of light
we also talk about other random shit and clown upon each other
Dancing Baby was funny because it was obviously fake and kind of creepy, though. It was a novelty. Kind of the exact opposite of the situation we're in now.
I'm not sure that the internet was ever really great for content creation or high quality... anything. We got some by accident, just like some high-quality communities exist by accident, but overall, lots of eyeballs and uncurated content are a recipe for underwhelming results. As soon as communities become tolerant of low quality, things plummet. Stackoverflow has help vampires, Reddit has an awful front page, Youtube has algorithm chasers.
There's nothing stopping Youtube consumers from curating lists of good creators. I don't really get that issue, unless the desired use case is to mindlessly scroll AND get good content. Idk about that ever being... a thing?
Test for Echo...
https://www.youtube.com/watch?v=_zfIYInFMhg&t=22
More to the point, Dancing Baby wasn't a real baby, but it was put in the internet by a real person who thought it was funny and wanted to share it.
Versus something chosen by an AI and hosted on a platform in order to farm eyeballs and generate $0.0003 of ad revenue per engagement.
The fact that most internet content is shit doesn't bother me. The vast majority of, say, shit movies still involve people who are trying to make something good and want to succeed. Even the most shameless cash grab movie still has cast and crew that care, even if they produce the worst movie of all time.
The disturbing thing to me is how soulless a lot of it is. How nakedly devoid of not just heart but even intent. Someone doesn't sit down and attempt to make a story about talking dogs, they sit down and intend to make Content and some eldritch process spits out something that kinda seems doglike if you squint.
And then came dancing baby.
I am regular echo.
You say he's a real human, but that's just what a not-real non-human would say. And I have concrete evidence that this very YouTuber used to be a talking paper bag puppet.
It's a shame that he seems to have the issue that a number of science communicators have, where he's pretty allergic to the idea of touching the political implications here. One reason we're not ready for this is that we have a very punishing economic system, where people have to somehow appease empty suits to get money so they can afford the basic necessities of just surviving.
Now someone can trivially spin up bots that will effortlessly generate content, usually subpar, and this can be done in great numbers for very low cost. It really does fuck over a ton of people who rely on content creation to make a living. Sure, most of it is pretty fucking subpar, but it is slowly getting better and sometimes you get decent stuff. A flesh-and-blood human cannot keep up with the pace at which a single generative AI can shit out content. So once the AI can sort of hit "decent", we run into the issue where it can piss out enough stuff to really start fucking over content creators.
Sometimes that's just giving shitty corporations a cheaper option to get hold of shit for their typical operations. Why pay an artist a living salary to make decent art, hire a writer to write decently enjoyable content, and employ a developer to build a reasonable website, when you can make a one-time purchase of bots that compile decent AI art, decent AI writing and decently AI-constructed websites? Sure, some people will insist on human-made, but there are a shit ton of consumers and businesses that don't give a rat's ass; they're fine with good enough if it gives them their dopamine hit and makes revenue go up.
There is also how this really fucks with the other shitty aspect of content creation, where a fair bit of it is getting a lucky break. It's not enough to be able to make something good; you also have to have people find it and talk about it before they find someone else doing something similar, or, worse, before some hacky fucker decides it's worth ripping off. Generative AI can just flood platforms with a ton of slop, making it that much harder to find the gems. Depending on how things are done, it could also find the gems and then just start shitting out derivatives of them, ensuring that the original creators never get found and rewarded for their efforts. Finally, when businesses decide that the AI is good enough, that's one less avenue creators could have used to be discovered.
Then there is also how this makes doing certain things so much worse. I'd argue the job search was already a shit show before 2023, but it seems like it has gotten so much worse. It's become even easier for scammers to bury job listing searches under all their fucking scams. It's gotten easier for less-than-honest businesses to post bullshit job listings they don't intend to fill, listings that only exist either to make people stop asking about hiring more staff to lessen the workload, or to gather business intelligence on competitors (aka how many people are unhappy with their job and looking to get out, and what they're getting for compensation). Also, we're getting the shit show where the people who can afford to pay for signal-boosting services to get noticed by an employer are probably also buying bot programs to flesh out their resumes and easily get them past the bots gatekeeping the application process.
Not to get into another thread, but this is a good prelude to how generative AI is going to fuck us under our current economic setup, because it's not hard to see how it'll be used to suck down a shit ton of the jobs that people currently work in the real world.
I also have to concede that I got the order of automation eating jobs wrong here. I had predicted that the first major hit would be figuring out driverless cars, which would eat a fuck ton of jobs. I didn't anticipate that the first hit would actually be in content creation: writing beyond just stating the simplest facts in the most concise manner, art, and programming. I had figured those would actually be a bit more complex than getting a vehicle to navigate around an area. Granted, I did underestimate how fucking cheap the elites are, because I assumed they'd be pairing driverless cars with a system that would let the cars communicate with each other better, making the ability to see a little less necessary.
Also, there is how this really does undermine the idea of using the internet to form connections, which is going to do a real number on people in setups where they can't easily socialize out in the real world. As Kyle points out, people are likely to retreat into internet silos, and the only way to get into those is to actually meet real people in the real world, so that there is confirmation that you are an actual human before you're even allowed in. It's a bit mind-numbing to realize that our species has built high-density cities and created an information network that can put you in touch with anyone on this rock who has an internet or phone connection, yet we've also managed to set things up so that people can be extremely isolated despite all of that.
Just wanted to touch on this real quick.
While AI is absolutely a major part of this problem with search engines turning to shit, a lot of this is also that people are gravitating toward mediums that are unsearchable. One of the big ones that I run up against constantly in both work and play is Discord. Your web searches don't cover Discord at all and every motherfucker has got a Discord. They don't have BB systems anymore, they barely use reddit, and those were the big sources of publicly sharing problems and results and feedback for the rest of the world to see and search for. Now everything is in millions of containerized chat servers that search engines have zero access to.
And it's not as simple as just joining the Discord for the subject you're searching for, and then searching that Discord. Because for every official Discord there's 10 unofficial Discords that might have the information you need, and so you gotta go scour those as well.
I'm at the point where, between my hobbies and my job, I'm subscribing to 10-15 discords each week just to search for shit, and at least in terms of my job, I'm rarely unsubscribing because I'll probably be back to these 120 or whatever Discords to search for some other fucking thing.
Discord has become a massive fucking menace. I would argue more of a menace than Twitter.
Also it's fun because, if I'm bored, I can go in and pull out all the ad code (it doesn't work offline anyway) and make the page look cleaner.
The internet in 2024 is absolutely less useful and usable to me than it was in 1994, 2004, and 2014.
I'm backing up shit like it's physical media these days.
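For what it's worth, stripping the ad code back out of a saved page is only a few lines of scripting. A rough sketch assuming a locally saved HTML file and BeautifulSoup; the filename and the ad-class heuristic are made up, so tune them per site:

```python
# Rough sketch: remove scripts, iframes, and ad-flavored elements from a saved
# page. "saved_page.html" and the class heuristic are illustrative assumptions.
from bs4 import BeautifulSoup

with open("saved_page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

# Scripts and iframes are where most ad/tracking code lives in a saved copy.
for tag in soup.find_all(["script", "iframe"]):
    tag.decompose()

def looks_like_ad(css_class):
    # Crude per-class-name heuristic; adjust for the site in question.
    if not css_class:
        return False
    name = css_class.lower()
    return name in {"ad", "ads", "advert", "advertisement"} or name.startswith("ad-")

for tag in soup.find_all(class_=looks_like_ad):
    if not tag.decomposed:  # skip elements already removed with a parent
        tag.decompose()

with open("saved_page_clean.html", "w", encoding="utf-8") as f:
    f.write(str(soup))
```

Crude, but for a page you've already saved for your own use it's usually enough.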
Why do you have any expectation that a searchable online space exists to get that sort of information from, though, with indexing etc? In the BB days, if someone didn't have the info you were just SOL; it wasn't exactly a huge pool.
Doc: That's right, twenty-five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
Because, I mean, here we are