I'm surprised how often this comes up in YC startups, actually. I'm not sure if this is true of founders in general (it might be) but practically everyone we've funded not only wants to not be evil, but actively wants to make the world better.
This was certainly true of the Reddits. And in fact I think they succeeded. If you make a site where everyone can say what they want, some people are going to say things other people don't like. But isn't this a net improvement over the preceding model, where there were a few narrow channels for the distribution of news, and the companies that controlled them controlled the news? I'm not sure what you mean by "hate speech," and I doubt you are either, but I think we're net ahead if we have a world in which it's harder for the powerful to suppress news, even if a few people take advantage of this new openness to say things that offend others. In fact, some of the best ideas started out that way.
I chose to join my current startup precisely because of the moral implications. It's a "change the world" (i.e., the real world) kind of idea, the linchpin of which is technology-based. I wish I could say more about it than that, but we're hush-hush about it for the time being.
It's interesting that I'm not alone here, that other founders here (YC-funded or otherwise -- we're not) also aim to improve the world somehow with their companies.
I saw an article recently in the WSJ [1] about how young people now are more philanthropic than previous generations. Whereas in the past, philanthropy was the domain of the rich (think Andrew Carnegie), the Internet now allows individuals, even children, to each contribute effectively in small ways. Combined together in large numbers, these small contributions can be significant.
Of course, I'm not only talking about money here. By "contribute", I also mean knowledge (e.g., Wikipedia), resources (e.g., OLPC "buy one get one"), and so on. The "new philanthropy" is all about the sum of small, individual contributions. I think the startups around here are part of this trend.
It doesn't surprise me that this is a common concern in YC startups. It also doesn't surprise me that the Reddit guys gave the issue consideration.
I agree that limiting the ability to suppress news, even if some people take advantage of increasing openness to advance harmful agendas, is a positive trend. I used to regularly derive enjoyment from Reddit, and occasionally, I still do. Nonetheless, I think Reddit's struggle with various forms of abuse (by idiots, racists, spammers, etc.) is relevant to those of us working on community-driven web-based projects (even if "community-driven" and "web-based" are the only traits in common with Reddit, as is the case with my startup). If I were one of Reddit's founders, I'd be troubled by its current state, and I want to do whatever I can to prevent similar problems from happening to our site.
As to what I meant by hate speech, I'll lean on Justice Potter Stewart's words regarding the definition of pornography:
"I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it..."
You said in that essay "like every other era in history, our moral map almost certainly contains a few mistakes." I agree with you. Nonetheless, I'm sure that it isn't a mistake to consider white supremacist propaganda, for example, to be harmful to society.
I'd be surprised if you haven't encountered hate speech on Reddit, but perhaps you haven't. I imagine it wouldn't take you long to find, if you tried, as I have stumbled across it quite regularly by accident. Regardless, the particulars of my critique of Reddit aren't relevant. Figuring out how to make a community good, and keep it good, is my goal. The consensus here appears to be that an ounce of bad sprinkled in with tons of good is still worth it. I agree, but I'm still going to think about how to reduce that ounce to a gram, or less. Any ideas would be appreciated.
If you'd read the essay carefully, you'd understand that in every era "right thinking people" are sure that the things they want to ban are bad. In 1700, after explaining how broad-minded you were, you'd be saying "Nevertheless, I'm sure it isn't a mistake to consider atheistic propaganda to be harmful to society." And you'd be wrong.
The whole point of the essay is that you have to step out of yourself to have any hope of seeing beyond the prejudices of your time, and that this is extremely hard. Your casual use of blanket labels for forbidden ideas is a sign you don't appreciate the difficulty of the problem here.
You'll notice I have never said what kinds of speech I think should be banned. That's because I've seen enough to know that the second clause following "I'm pretty open-minded, but..." is very likely to be mistaken. Like someone saying that some open mathematical problem will never be solved, you're setting yourself up to look like a fool to future generations.
So your use of "sure" to me is very convincing evidence that your filters will generate a lot of false positives. I spent a whole month thinking about this problem. WYCS took the longest of any essay I've written. And I would be very reluctant to use that word "sure" in this kind of situation. So either you understand this stuff so much better than me that you've passed through uncertainty and back into certainty, or you simply have the confidence in your opinions that everyone is born with.
I enjoyed the hell out of your WYCS essay, and the care with which you wrote it shows. I'm not contesting the arguments made in that essay. If, in this thread, however, you're contending that all ideas are completely relative, and that we don't have any right to pass judgments on them, then I have to disagree. If you're arguing that people should be allowed to speak freely in society at large, no matter how strange or offensive their ideas might be, then I agree. I don't think speech of any kind should be banned, and I fully appreciate how difficult it is to come to genuine truth. Aside from the existence of the self, what is truly knowable, anyway?
Be that as it may, in order to assess the quality of a community-driven site, one has to place value judgments on its content. Sometimes these value judgments fall short of perfection, but they're necessary and unavoidable.
It is interesting that you raised the prospect of filters yielding false positives. If you were talking about my judgment as an individual, then I'll admit that I'm blinded by the human condition, but I'd like to point out that I relied on blanket labels to avoid getting into a discussion of the minutiae of specific posts on Reddit. If you were referring to filters integrated into the software we'll use for our startup, our approach to avoiding the "garbage" (we can agree that there is garbage out there, and avoiding it is a good thing, right?) is to narrow the focus of the site, rather than to build in karma-based restrictions and enhancements of individual users' influence.
Hopefully I'll be posting a link here fairly soon, so all of my vague mumbo-jumbo will make more sense, and it'll take less imagination to see how our site could be used for negative purposes and how that might be limited.