
If only the company is put in danger and they stubbornly refuse to resolve the issue, I'm not exactly sure why anyone would work so hard to convince a company to do this. The job of reporting the issue is done, a corporate decision has been made. If that decision is to remain vulnerable, as long as it does not affect users directly, why bother?

Unless, as others suggested, you can legally make a profit out of it, then by all means! Otherwise, just let it go...



I think this raises two issues:

1) It can be difficult to know whether customers are (or could be) affected. Just because the author can't find such a case doesn't mean someone else can't.

2) If the company refuses to fix this broken window, it may decide other broken windows aren't worth fixing either, and those may affect users. By releasing the vulnerability, one can force the company to become more security-conscious in the long term.


> If that decision is to remain vulnerable, as long as it does not affect users directly, why bother?

Because if that company is storing sensitive information belonging to others (emails, credit cards, etc.), it would be irresponsible not to disclose it. Chances are someone else has already found the vulnerability and is actively exploiting it.


It appears he wants to publish the vulnerability without getting sued (he might be a novice security researcher).


He is very, very unlikely to be sued provided that (i) he didn't explicitly agree to a contract forbidding security research when he acquired the application, (ii) he acquired the application lawfully, (iii) he at no point solicited business from the vendor of the application, (iv) he didn't exploit the vulnerability in any way that could be construed as having caused direct damages to the vendor, and (v) he is scrupulously honest and careful about how he writes the finding up.

Contrary to popular opinion on HN, finding vulnerabilities in software you yourself run on your own computer is rarely fraught. We hear about the exceptions in the news because they're exceptional. In reality, people publish vulnerabilities all the time.

The same thing obviously CANNOT BE SAID about finding vulnerabilities in other people's web applications. Finding web vulnerabilities without permission is highly fraught. You can easily find yourself both civilly and criminally liable for doing so.


I would adjust "other people's web applications" to be "in other people's deployments."

For example, it is fine to take someone else's commercial web app, install it on your own server, and beat it up.


That is a good point, thanks for amending.


I agree with you. Given some of the stories we've seen lately, my approach, after disclosing the vulnerability once, would be a three step process:

1) Do nothing.

2) Fuck 'em.

3) Not my problem.



