Not only that, but subreddits like r/AmITheAsshole are full of AI slop. Both in the comments and in the posts. It's a huge karma mining operation for bots.
This is sort of funny. Given how common it is to spot bots on Reddit now, it seems like they are likely to completely overwhelm the site and drive away most of the actual humans.
At which point the bots, with all of their karma, will be basically worthless.
Kind of extra funny/sad that Reddit’s primary source of income in the past few years appears to be selling training data to AI labs, to train the models that are powering the bots.
> At which point the bots, with all of their karma will be basically worthless.
Not really, it will still be kind of valuable for influence campaigns. A lot of people don't get it when there is a bot on the other side. Hell, a lot of times, I don't get it.
I know a fair number of people ("normies") who get some value out of smaller niche Reddit communities — for advice, and things like product recommendations.
If suddenly all the posts are coming from bots who are trying to push a product or just farm karma, I assume (perhaps naively) that those folks will get a lot less value and stop showing up — even if they don’t realize it’s bots on the other side of the conversation.
Even before the advent of AI, Reddit was notorious for obvious bullshit being posted for karma farming. r/aita is even more famous for people making up stories for known and unknown purposes (known in the old days as "bait").
Plus, there's the disproportionate ratio of posters:commenters:lurkers. The tendency to comment rather than keep one's thoughts to oneself is a selection bias in and of itself.
Great insight, didn't think about it even anecdotally. I had been lurking on Reddit since 2008 and finally created an account in 2012 when someone was really 'wrong on the internet' and I had to step in.