
Sorry, but this is just a "movie plot terrorism" entry for Schneier's annual competition, not reality as we live in it.

In what way is acquiring the skills required to download, compile, and configure this software, then integrate it with an electrically detonated bomb, more likely to be undertaken by "the bad guys" than hooking the detonator up to the backlight of a burner phone, standing a block away, and texting it? (Just like every reported IED from the latest war-torn country being bombed into democracy and freedom.)

It makes me mad when intelligent people dream up "bad things" that could only be done via methods with extremely high barriers to entry, when way simpler, easier-to-achieve methods for the same "bad stuff" are obvious.

Case in point - one of my local councils has just blanket-banned "drones" (without even bothering to define what a "drone" is) on the pretext that "there is a concern about people taking unauthorised photos of children in public areas" - See more at: http://www.ausleisure.com.au/news/safety-fears-see-leichhard....

Watch this video of a $600 point-n-shoot camera (at least past the 37 sec mark) and tell me you're more at risk from someone with a drone invading your privacy: https://www.youtube.com/watch?v=Csp6asxf00o

If people want to take your (or your family's) picture, they will. Probably with their cell phone without anyone noticing, or with a $600 camera on a tripod so far away you can't even see them. They _won't_ buy a $1,200+ drone and learn to pilot it, then fly it up close where you can see it. (And they _certainly_ won't be learning to assemble and tune their own quadcopter for a few hundred dollars of Chinese-sourced parts. Not just to be a creep with.)

Same if they want to blow something up - they're not going to clone some open source code from github, learn how to use its Python bindings, and build a Raspberry Pi powered auto-detonator to trigger off your numberplate. There are _way_ lower barrier-to-entry methods to achieve that goal (which are also way more reliable).



Yes, who ever thought that aircraft could be used against skyscrapers right? Only in Hollywood.


And hence we get groped or porno-scanned at every airport check-in, and secret no-fly lists which are good enough to stop people with names vaguely similar to possible terrorists from flying, but which are not accurate enough for use as lists of people who shouldn't be permitted to buy guns.

Do you think that's an appropriate response? Especially since it seems to be almost universally true that every time the TSA is tested, weapons still get through the checkpoints with startling regularity.

Sorry, but I still see this as kneejerk reactions to spectacularly unlikely scenarios of "bad things happening" being proposed and regulated by people who don't care about reducing other people's freedom because it won't affect them personally.

I'm still unsure what you're suggesting "shouldn't be allowed" here? Open sourcing computer vision projects? Publishing on github? Should all hobbyists leave face detection algorithms to Facebook and Apple and Google, because someone else might misuse the results (worse than Zuckerberg already does)? It's all extremely reminiscent of the "crypto wars" and Homeland Security's new "House Un-American Mathematics Committee": https://twitter.com/puellavulnerata/status/67290345222221824...

Me? I'm 100% for publishing this (and similar) projects - because the tech is already out there and being used. Pretty much every towtruck and repo man has had this tech running for 5+ years, and almost nobody knows. Why is it a problem now that sufficiently motivated geeks can roll their own for ~$100 and a weekend's futzing around? Same with using promiscuous wifi adapters or TV-tuner SDRs to sniff MAC addresses or TMSIs - shopping malls and law enforcement are routinely using that tech to track you, and I reckon more art projects showing how simple and creepy it is would be a good thing.
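The "roll your own in a weekend" point really does hold even at toy scale. A hypothetical sketch (not from the project under discussion, just the textbook idea) of template matching, the simplest detection primitive in computer vision, using nothing but the Python standard library:

```python
# Toy illustration of template matching: slide a small patch over an image
# and score how well it fits at each position. Real hobbyist projects just
# swap this loop for a library call (e.g. OpenCV), but the idea is the same.

def match_template(image, template):
    """Return (row, col) where template best matches image (lowest SSD)."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            # Sum of squared differences between this patch and the template.
            score = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th)
                for j in range(tw)
            )
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A 5x5 grayscale "image" with a bright 2x2 blob at row 2, col 3.
img = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 0, 0],
]
blob = [[9, 9], [9, 9]]
print(match_template(img, blob))  # -> (2, 3)
```

That's the whole barrier to entry: a nested loop and a difference score. Pretending the capability can be contained by discouraging one github repo misunderstands how little is actually being "protected".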

There's another movie-plot bomb detonator for you - an UberTooth One (or $5 Chinese counterfeit wifi adaptor in promiscuous mode) listening for the MAC address of your phone/smartwatch/tablet. What're we going to have to ban in response to that idea?

(I know, let's ban _ideas_!... (Sorry, that's way snarkier than intended...))


> Do you think that's an appropriate response?

No because it is not going to make much difference.

> Sorry, but I still see this as kneejerk reactions to spectacularly unlikely scenarios of "bad things happening" being proposed and regulated by people who don't care about reducing other people's freedom because it won't affect them personally.

Fully agreed on that one.

> I'm still unsure what you're suggesting "shouldn't be allowed" here?

This software has a ton of possibilities for bad uses; I just threw out the first one I could think of. There's a whole raft of others.

> Open sourcing computer vision projects? Publishing on github?

No, it's inevitable. But there is currently no framework for how to deal with these things. Just because you can doesn't always mean that you should. There are a ton of things I could do that are legal, but that doesn't mean they all have a net-positive effect on the society we live in, and I think the ability to build these systems comes with some responsibility.

> Me? I'm 100% for publishing this(and similar) projects - because the tech is already out there and being used. Pretty much every towtruck and repo man has had this tech running for 5+ years, and almost nobody knows.

Yes, but they are limited in quantity and enough of a quantitative change is a qualitative change.

> (I know, let's ban _ideas_!... (Sorry, that's way snarkier than intended...))

I think I beat you to that:

https://twitter.com/jmattheij/status/670367390828535808



