Why Is the Web So Monotonous? Google (2022) (reasonablypolymorphic.com)
20 points by throwup238 on Jan 25, 2024 | hide | past | favorite | 22 comments


I feel like every article about this misses the real problem.

Internet users do not want a "we are moots" relationship with their search results. They want a content creator to content consumer relationship.

It is literally insane to expect a professional review or recipe site to exist in such a way that it doesn't make money. Professional, by definition, means income. If you want a site that isn't littered with ads and subscription prompts, what you want is an amateur site, which, as the word implies, is going to be low quality, since they are literally not in the business of making reviews or recipes.

It simply makes no sense for somebody's random blog to rank high on Google. Most people who search are looking for resources, and a resource implies a collection of similar stuff on the site; that means the whole site has to be dedicated to reviews or recipes, not a recipe from your grandma followed by photos of your vacation.

If someone makes such a site and puts in all that effort, they likely intend to monetize it. You cannot escape this fact, especially not when the web is as large as it is now. This isn't a tech problem. This is a people problem. People simply aren't motivated to create the ideal web world.

You could try to make a website that collects random posts from third party blogs to create this web paradise that you want, but unfortunately that already exists: it's called Tumblr and it's bleeding money.


Agreed, but that's a two part issue.

#1 - The modern web has a much lower ratio of citizenCreators : consumers than the early web.

#2 - The utility of information is professionalUnbiased > proHobbyist > professionalBiased > randomPerson >> SEO spam.

Google's current ranking is SEO spam >> professionalBiased > professionalUnbiased >> proHobbyist > randomPerson.

That's a clear misalignment.


The reason for this misalignment in my opinion is painfully simple.

Imagine you are making Youtube videos for money. You write a clickbait title. You make a funny face in the thumbnail. You talk about your sponsors. You tell your viewers to like, share on Facebook, and subscribe.

A nerd would hate this. They use an ad blocker and a browser extension to skip sponsor segments in Youtube videos. They deleted their Facebook. They might not even have a Google account to subscribe with. They think anyone who writes clickbait titles and makes funny faces in thumbnails is a moron who makes low-quality videos not worth watching.

But the nerd's father, mother, grandma, sister, uncle, wife, and child couldn't care less. They all have Facebook accounts. They get baited by the titles all the time. They feel some deep, irresistible attraction to shocked faces in tiny clickable images. They share their funny videos constantly and shamelessly. And they even watch ads by accident sometimes thinking it's the actual video.

Metrics are a popularity contest and nerds just can't win a popularity contest.

There is just no way to quantify that professionalUnbiased and proHobbyist are better when SEO spam and professionalBiased are getting all the likes, shares, and links. And this is partly because the nerd audience of the first two deliberately wants to stay out of the metrics in the first place.

By hating and avoiding the modern system so hard, they failed to influence it in their favor.

Now that I think about it, I just realized that if nerds stop using Google because Google results suck, that will make Google suck even more for them because they are no longer influencing Google's algorithms.


A future in which Google degrades further because intelligent people are using Kagi et al. and Google's feedback loops further degrade their own product is a future I want to live in.

Eventually, superior products win.

That's how Google beat Yahoo.


Yeah, about that... reread my entire post.

>Metrics are a popularity contest and nerds just can't win a popularity contest.

Even if Google SERPs are all garbage, they still win if most people are fine with garbage results. It doesn't matter if a loud minority of nerds bitch about it in their blogs and podcasts and research papers and hacker forums. Most people will still use Google and they will be "okay" with spammy results.

I know this sounds crazy, but if spammy results weren't drawing real numbers, they wouldn't be a viable business model in the first place.


> I know this sounds crazy, but if spammy results weren't drawing real numbers, they wouldn't be a viable business model in the first place.

I would agree... in a competitive marketplace open to new entrants. I don't think we've had that after ~2010.

Defaults or pushed adoption on dominant platforms are powerful.

Look at Chrome adoption once Google started using the homepage to push it.

So nowadays, I would say we can be stuck in a local optimum by virtue of marketing dollars and platform defaults/controls.

Or to put it another way -- if Google delivered search results half as good (by some objective metric) as a free version of Kagi, do you honestly think search share (including mobile) would shift? And how long would that take?


The main argument is this:

"I’d always thought SEO was better at selling itself than it was at improving search results, but my god was I wrong.

SEO techniques are extremely potent, and their widespread adoption is what’s wrong with the modern web."

Detailed explanation follows.

I think the most important revelation is the description of feedback loops that punish long-tail content.

But I agree with the comment here https://news.ycombinator.com/item?id=39132404

Perhaps our expectations have also increased?

But I must agree that without the SEO spam it would be more distinctly obvious that the answer perhaps is not out there - e.g. nobody has written about Koh Lanta-like places in Croatia.


My understanding is that SEO has 4 tiers. This is due to how search engines rank results. They want relevant results first, but they also want results that seem high quality (i.e. from authorities on the subject) first. They have algorithms to judge relevance and authority based on e.g. keywords and the number of inbound links. The idea is that organic results should be on top, but when you think about it, there is no way for that to happen.

So the 4 tiers are:

1. Pages that don't optimize keywords and don't get links. This is most pages.

2. Pages that optimize keywords but don't get links. This is some introverted nerd who knows SEO technically but sucks at marketing.

3. Pages that don't optimize keywords but get links. This is a real product or service or thing that is popular made by someone who has no idea about SEO.

4. Pages that optimize keywords and get links. This is some serious internet marketing stuff.

"SEO spam" falls into the 4th tier. Optimizing keywords only gets you in front of unoptimized pages that nobody links to. A real SEO strategy includes getting as many links as possible, in some cases by doing "black hat" SEO: buying links, spamming links in the comments of WordPress blogs, and so on. Essentially, genuine 4th-tier SEO means establishing a presence on the web.
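The four tiers fall out naturally from any ranking function that combines keyword relevance with link authority. A toy sketch of the idea (all page data and scoring weights are invented for illustration; this is not Google's actual algorithm):

```python
import math

def rank(pages, query_terms):
    """Order pages by a toy relevance-plus-authority score."""
    def score(page):
        # Relevance: fraction of query terms the page optimizes for.
        relevance = sum(t in page["keywords"] for t in query_terms) / len(query_terms)
        # Authority: log-scaled inbound link count, a crude stand-in for
        # PageRank-style link analysis. Links dominate keywords here.
        authority = math.log1p(page["inbound_links"])
        return relevance + authority
    return sorted(pages, key=score, reverse=True)

pages = [
    {"name": "tier1-unoptimized", "keywords": [],                 "inbound_links": 0},
    {"name": "tier2-keywords",    "keywords": ["recipe", "best"], "inbound_links": 0},
    {"name": "tier3-popular",     "keywords": [],                 "inbound_links": 500},
    {"name": "tier4-seo",         "keywords": ["recipe", "best"], "inbound_links": 500},
]

for page in rank(pages, ["recipe", "best"]):
    print(page["name"])  # tier4-seo, tier3-popular, tier2-keywords, tier1-unoptimized
```

With links weighted above keywords, the output reproduces the tier ordering: tier 4 beats tier 3, which beats tier 2, which beats tier 1.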

But this is why the whole idea is fundamentally broken.

Imagine someone is a Nobel Prize winner who writes extremely technical posts on their blog. They are the authority of all authorities in the subject, but their content is by no means popular. Only a handful of people in the world could appreciate what this multi-PhD monster of knowledge is talking about. Their posts will never get links, and their keywords probably sound like made-up gibberish to Google.

Basically, if your job is being an authority on something, you are not going to waste your time building a presence on the web. It's like search engines constructed such a high bar on "quality web results" that it will never get cleared by actual quality results that aren't deliberately trying to clear that bar.


The web has become easier and easier to publish on for decades now. The low barrier to entry plus unscrupulous people is what makes the web worse, not a site designed to help you find things. You have millions of people actively working to artificially increase their rankings for money. That's not Google's fault. That's like blaming Verizon for spam phone calls, or Comcast for spam email. I don't like any of those companies, but I'm not going to blame them for things that aren't their fault.


Google's search market share is to blame for fostering an SEO monoculture that makes specific Googly optimizations so attractive.

Absent Google/Chrome dominance, with a wider array of search engines, optimizing for "What Google wants" would be less important. Because you'd have 5+ other engines to optimize for too.

And maybe, at that competitive intersection of traits, we'd encourage something that was closer to actual page quality.


> If I need a bus schedule, I know to talk to my local transit authority

If this were the case, public transportation usage would go down so much.


Google is so annoying for doing this. They could have curated a better web, but they've consistently done what puts their profits first. Similar things have happened in other companies, and it always leads to their downfall (Kodak, GM, Polaroid, Blockbuster, etc).


Google doesn't curate the web. They create feedback loops for their ad revenue.


Curation is the action or process of selecting, organizing, and looking after the items in a collection or exhibition.

Google selects and organizes websites that comply with their ad infestation guidelines. Looking after them is arguable.


Doing something assumes intent. Like I said, Google has no such intent; its intent is to create positive feedback loops that increase its revenue.


A search engine with even the slightest hint of flavor by definition is a curation of the web.


They could create a positive loop if they wanted to.


> Anyway, forgetting that Google is bad at long tails, I search for what is the koh lanta of croatia?

Is it just me, or is that a very hard query? Koh Lanta is pretty obscure. Croatia, while not as obscure, won't have the tourist pages of, say, Italy, Greece, China, etc.

The chances of a non-spam web page containing both of those terms seem pretty low to me.

You would have to do some type of word vector for "Koh Lanta" and then try to find a location in Croatia that matches the word vector.
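The word-vector idea can be sketched with cosine similarity over hand-picked trait vectors (every place name, trait, and score below is invented for illustration; a real system would use learned embeddings, not hand-tuned numbers):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Trait dimensions: [island, low-key vibe, beaches, food scene]
koh_lanta = [1.0, 0.9, 0.9, 0.8]

# Hypothetical trait scores for some Croatian destinations.
candidates = {
    "Hvar":      [1.0, 0.3, 0.8, 0.7],  # island, but known for nightlife
    "Vis":       [1.0, 0.9, 0.8, 0.7],  # remote, quiet island
    "Dubrovnik": [0.0, 0.1, 0.5, 0.6],  # crowded mainland city
}

best = max(candidates, key=lambda name: cosine(koh_lanta, candidates[name]))
print(best)  # Vis
```

Under these made-up scores, the quiet island wins over the nightlife island and the crowded city, which is roughly the "find me the X of Y" behavior the query is asking for.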

LLMs might actually be better at this. Bing chat gives me the following in response to

"what is the koh lanta of croatia?":

"I assume you are asking for a place in Croatia that is similar to Koh Lanta, Thailand. Koh Lanta is an island in the Andaman Sea south of Krabi, Thailand, known for its low-key vibe, beautiful beaches, and amazing food 1. Unfortunately, I could not find any direct comparison between Koh Lanta and any place in Croatia. However, Croatia has many beautiful islands with stunning beaches and crystal-clear waters. Some of the most popular islands in Croatia are Hvar, Brac, Korcula, and Vis 2. Each of these islands has its own unique charm and atmosphere. If you could provide more information about what you are looking for, I could help you find a place in Croatia that suits your preferences. "


I thought the same; this query is not really compatible with how Google works. Either there's an article about this or there's none.

LLMs are indeed great for these types of queries, where we expect knowledge that should exist to be synthesised.


Agree in part.

Fixing it. Have the Web site code, using Microsoft's ASP.NET, etc., running as intended.


Should be (2022), I think.


Added. Thanks!



