You could have a semi-manual system with user input: websites stuffed with SEO spam get manually flagged by users. Of course, you then need to trust your users as well. This can be done with a Web of Trust-style reputation system: users endorse each other, and you can build a reputation graph/tree that traces where reputation came from, making it easy to cull bad subtrees. If the endorsement system conserves reputation (you give away a fraction of your own reputation with each endorsement, and new reputation is never created), it becomes sybil-proof: it is no longer advantageous to create, say, millions of users to inflate your reputation.
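To make the conservation argument concrete, here's a minimal sketch of that mechanic. The model and names (`endorse`, the starting balances, the 0.1 fraction) are all illustrative assumptions, not a real system:

```python
# Illustrative in-memory model of conservative reputation transfer.
reputation = {"alice": 10.0, "bob": 5.0}

def endorse(endorser: str, endorsee: str, fraction: float = 0.1) -> None:
    """Move a fraction of the endorser's reputation to the endorsee.

    Reputation is conserved: an endorsement never creates reputation,
    so spinning up sybil accounts (which start at 0) gains nothing.
    """
    amount = reputation.get(endorser, 0.0) * fraction
    reputation[endorser] = reputation.get(endorser, 0.0) - amount
    reputation[endorsee] = reputation.get(endorsee, 0.0) + amount

total_before = sum(reputation.values())

# A sybil army starts with zero reputation...
for i in range(1000):
    reputation[f"sybil-{i}"] = 0.0

# ...and endorsing the sybils only drains alice's own balance.
for i in range(1000):
    endorse("alice", f"sybil-{i}", fraction=0.001)

total_after = sum(reputation.values())
assert abs(total_before - total_after) < 1e-9  # total is unchanged
```

The attacker's combined reputation after the attack equals what alice had alone before it, so there is nothing to gain.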
Why can't all the users being paid to promote spam endorse each other? And won't distributed trust systems just make some people "trust billionaires" and others "trust impoverished" through no fault of their own? If everyone trusts Oprah, she'll have the same kind of power people complain about billionaires having now -- basically influence. Also, anyone who's close to her gets the blessing of her influence, whereas if you're far removed you get nothing. It seems like this just reproduces the exact thing so many are trying to counter.
Well, you need to define your objectives. No system is robust to a failure of all of its actors: if every user (and even every developer) is ill-intentioned, no system will give good results. So we need some "hopeful" (and accurate) assumptions.
One assumption might be that the typical user can sensibly elect a few individuals to trust -- from developers (a natural choice for trust) to activists and publicly visible figures (even close friends). The user then adopts their trust model: such individuals become roots of independent, conservative trust webs/graphs. A very large number of such webs might be computationally expensive, but hopefully you'd be able to find someone you trust, or start your own independent graph (if you trust no one, you'd effectively lose all anti-SEO measures). This leads very naturally to a decentralized reputation system!
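One way to sketch those per-user webs: seed reputation at the roots the user picked and let it decay with distance, so independent graphs carry no weight for each other. Everything here (the graph, the decay factor, the names) is a made-up example, not a proposed spec:

```python
from collections import deque

def reputation_from_roots(endorsements: dict[str, list[str]],
                          roots: set[str],
                          decay: float = 0.5) -> dict[str, float]:
    """Walk outward from the user's chosen roots; trust decays with each
    hop, so anyone far removed from every root ends up near zero."""
    scores = {r: 1.0 for r in roots}
    queue = deque(roots)
    while queue:
        node = queue.popleft()
        for neighbour in endorsements.get(node, []):
            candidate = scores[node] * decay
            if candidate > scores.get(neighbour, 0.0):
                scores[neighbour] = candidate
                queue.append(neighbour)
    return scores

# Two independent webs: a user whose only root is "dev" never sees
# "oprah"'s web at all, however influential it is elsewhere.
graph = {"dev": ["friend"], "friend": ["blogger"], "oprah": ["fan"]}
scores = reputation_from_roots(graph, roots={"dev"})
```

Since each user computes scores from their own roots, there is no single global "trust billionaire": Oprah's influence only reaches users who (transitively) chose to include her web.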
Mutual endorsement, if reputation is conserved, amounts to giving each other reputation. If both users' reputations are equal, there's no net gain (total reputation is always constant).
If one user's reputation is higher, there's redistribution, but as long as algorithms weight reputation linearly in all decisions, redistribution confers no advantage, which addresses sybil attacks[1].
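A quick worked example of both cases (the numbers and the 0.1 fraction are arbitrary, chosen just to show the arithmetic):

```python
FRACTION = 0.1  # assumed endorsement fraction, purely illustrative

def mutual_endorse(a: float, b: float) -> tuple[float, float]:
    """Both users endorse each other simultaneously with the same fraction."""
    give_a, give_b = a * FRACTION, b * FRACTION
    return a - give_a + give_b, b - give_b + give_a

# Equal reputations: a wash, both keep 5.0.
x, y = mutual_endorse(5.0, 5.0)

# Unequal: reputation is redistributed (roughly 9.2 and 2.8), but the
# total stays 12.0, so any decision weighted *linearly* in reputation
# is unaffected by the pair colluding.
p, q = mutual_endorse(10.0, 2.0)
assert abs((p + q) - 12.0) < 1e-9
```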
[1] Previous suggestion here: https://news.ycombinator.com/item?id=31585340