
A question I've had about Signal is what stops Apple from modifying the source, rebuilding it with a backdoor, and shipping that. Is this technically possible (it seems like it would be, since they control distribution of the binary to devices)? The article is correct in stating that web-based chat is inherently insecure, but by that reasoning all iOS apps seem inherently insecure too. I'm by no means an expert though, so I'd love to hear from someone with more knowledge.

EDIT: Thank you for the responses! They pretty much confirm what I thought: Apple _could_ access your communication (either through keylogging at the OS level or by backdooring Signal), but this is still better than everyone using plain-text communication. I personally would not trust Apple with my life if I needed that level of protection, but maybe that's not the main use case for Signal.



Technically? There's nothing stopping them. For that matter, there's nothing stopping Google from doing the same. There's also nothing stopping Apple from patching LLVM so that every build of OpenSSL it compiles comes out backdoored. The question is how paranoid you are and what your threat model is.

We have to trust someone, eventually. This is especially true for the 99% of the population who don't have the skill to compile the source themselves (nor should they have to).


Just in case nobody has gotten to enjoy this gem:

http://wiki.c2.com/?TheKenThompsonHack

Ken describes how he injected a virus into a compiler. Not only did his compiler know it was compiling the login function and inject a backdoor, but it also knew when it was compiling itself and injected the backdoor generator into the compiler it was creating. The source code for the compiler thereafter contains no evidence of either virus.
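The core trick can be sketched in a few lines. This is a toy model only (all names and the "magic" password are invented here, and a real compiler operates on parsed code rather than strings), but it shows why the clean source never reveals the backdoor:

```python
# Toy sketch of the trusting-trust trick: the "compiler" is just a
# function from source text to output text. It recognizes two targets
# and silently rewrites them.

BACKDOOR = 'if password == "magic": return True  # injected'

def evil_compile(source: str) -> str:
    if source.startswith("def login("):
        # Target 1: inject a backdoor into the login program.
        header, body = source.split("\n", 1)
        return header + "\n    " + BACKDOOR + "\n" + body
    if "def evil_compile(" in source or "def compile(" in source:
        # Target 2: when compiling a *clean* compiler, re-insert this
        # injection logic so the next generation stays compromised.
        return source + "\n# <injection logic re-inserted here>\n"
    return source  # everything else compiles honestly

clean_login = "def login(user, password):\n    return check(user, password)"
print(BACKDOOR in evil_compile(clean_login))  # True: output is backdoored
print(BACKDOOR in clean_login)                # False: source looks clean
```

Inspecting `clean_login` (or the clean compiler source) turns up nothing; the virus lives only in the binary, and the binary regenerates it.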


Which is why standardization is just as important as openness, if not more so, in making sure things stay secure. Such an attack is made a lot more difficult if you have a second toolchain you can use to verify things, and even more so if you have a third.
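One concrete form of the second-toolchain check is "diverse double-compiling": build the suspect compiler's source with an independent toolchain, rebuild that same source with each stage-1 result, and compare the final outputs byte for byte. A toy model of the idea (compilers reduced to pure deterministic functions, all names invented):

```python
import hashlib

CC_SRC = "source code of the compiler under test"

def _digest(tag: str, data: str) -> str:
    return hashlib.sha256((tag + data).encode()).hexdigest()

def compile_with(semantics: str, source: str):
    """Run a compiler binary (identified by its semantics) on some source.

    A correct compiler's output depends only on the source; a trojaned
    one additionally recognizes the compiler's own source and poisons it.
    Returns (semantics_of_output_binary, digest_of_output_bytes).
    """
    if semantics == "trojaned" and source == CC_SRC:
        return ("trojaned", _digest("evil:", source))
    return ("correct", _digest("ok:", source))

def stage2(binary_semantics: str) -> str:
    # Stage 1: build the compiler from source with the given binary.
    stage1_semantics, _ = compile_with(binary_semantics, CC_SRC)
    # Stage 2: rebuild the same source with the stage-1 result.
    return compile_with(stage1_semantics, CC_SRC)[1]

# Vendor-shipped binary vs. an independent trusted toolchain:
print(stage2("trojaned") == stage2("correct"))  # False: mismatch exposes it
print(stage2("correct") == stage2("correct"))   # True: clean builds converge
```

The point is that the self-propagating backdoor survives self-compilation, so the final binaries can only match if both starting toolchains were honest (or identically dishonest, which independence makes unlikely).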


> For that matter, there's no stopping Google from doing the same.

That's the exact reason why package signing is decentralized in the Android ecosystem. All apps in the Play Store are signed by their developers.
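For illustration: in the older v1 (JAR) signing scheme the developer's signature travels inside the APK itself, as files under META-INF/. Newer APKs use the v2/v3 signing block stored outside the zip entries, so this is only a rough look, using an in-memory stand-in since no real APK is at hand:

```python
import io
import zipfile

def jar_signature_entries(apk) -> list:
    """List v1 (JAR) signature files carried inside an APK/zip."""
    with zipfile.ZipFile(apk) as z:
        return [n for n in z.namelist()
                if n.startswith("META-INF/")
                and n.endswith((".RSA", ".DSA", ".EC", ".SF"))]

# Stand-in APK built in memory (an APK is just a zip with extras).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("classes.dex", b"\x00")
    z.writestr("META-INF/CERT.SF", b"digests of the zip entries...")
    z.writestr("META-INF/CERT.RSA", b"developer signature over CERT.SF...")

print(jar_signature_entries(buf))
# ['META-INF/CERT.SF', 'META-INF/CERT.RSA']
```

Because the key that produced those signature files is the developer's, not Google's, a store-side swap of the app contents would break verification on the device.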


With such a system, you must end up trusting a certain entity; it's turtles all the way down otherwise. No system is independently secure.

Similar questions include: What if a CA is compromised? What if Apple/MS bundles unwanted certs with the OS? What if Intel/AMD biases the on-die hardware RNG or other hardware crypto primitives? What if Apple/MS bundles a backdoored compiler a la "Reflections on Trusting Trust"? What if MS/Apple backdoors the entire network stack, including the physical and data link layers? etc. etc.


Does Signal support reproducible builds, at least? Real question, I don't know.



Partially. They're moving towards it, but it obviously doesn't help that only half of the app is actually open source.
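The usual shape of such a check on Android: build the APK yourself, then diff it against the store APK while ignoring the signature files, which can never match. Signal publishes its own comparison script; this simplified sketch is not it, and it assumes only the signatures may legitimately differ:

```python
import hashlib
import io
import zipfile

def content_digests(apk) -> dict:
    """Hash every APK entry except the signature files under META-INF/."""
    with zipfile.ZipFile(apk) as z:
        return {n: hashlib.sha256(z.read(n)).hexdigest()
                for n in z.namelist()
                if not n.startswith("META-INF/")}

def same_build(store_apk, local_apk) -> bool:
    return content_digests(store_apk) == content_digests(local_apk)

def fake_apk(sig: bytes) -> io.BytesIO:
    # Stand-in APKs: identical contents, different signatures.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("classes.dex", b"same bytecode")
        z.writestr("META-INF/CERT.RSA", sig)
    return buf

print(same_build(fake_apk(b"store sig"), fake_apk(b"local sig")))  # True
```

If the digests match, the binary you can audit is byte-for-byte the binary you're running, modulo the signatures.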


The way the App Store works, the app is signed with both Apple's key and the developer's private key. Without the developer's private key, Apple should not be able to sign an update in a way that the App Store would accept it as the same app. But of course Apple could modify the App Store or iOS to remove those restrictions.
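That acceptance rule can be modeled in a few lines, with HMAC standing in for real code signatures; nothing here is Apple's actual mechanism:

```python
import hashlib
import hmac

def sign(key: bytes, payload: bytes) -> bytes:
    # Stand-in for a code signature (real signing is asymmetric).
    return hmac.new(key, payload, hashlib.sha256).digest()

def upgrade_allowed(installed_key: bytes, update: bytes,
                    update_sig: bytes) -> bool:
    # An update counts as "the same app" only if it verifies under the
    # key that signed the installed version.
    return hmac.compare_digest(sign(installed_key, update), update_sig)

dev_key, other_key = b"developer-private-key", b"someone-else"
update = b"app v2 binary"
print(upgrade_allowed(dev_key, update, sign(dev_key, update)))    # True
print(upgrade_allowed(dev_key, update, sign(other_key, update)))  # False
```

The second call is the interesting case: anyone without the developer's key, Apple included, fails the check, which is why subverting it requires changing the checker (iOS/App Store) itself.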


I'm not sure it really matters -- if Apple wants to log your conversations they don't have to put a backdoor into Signal, they could just put a backdoor into iOS itself. An attacker with privileged access to the guts of the operating system doesn't have much need to muck around with hacking the applications that run on it.

Which is to say: security-minded users should strive to trust as few parties as possible, but since at the end of the day you have to trust somebody, if you don't trust Apple the only really secure move is to not use iOS devices at all.


Exactly: once you're on Apple's platform anyway, you may as well use iMessage and FaceTime with the other iOS users. You only need something extra to communicate a bit more safely with the users who don't have iOS.

But if a user is on another OS, you can assume that whoever gets control of that OS/device can read your messages to that user and record your calls to him/her.

It's turtles all the way around. The more you communicate, the less you can expect to remain "private." Come to think of it, that was true before computers too.


But one should never consider oneself secure from targeted attacks. What Signal et al. protect against is dragnet surveillance, which Apple can perform remotely with iMessage without having to install an exploit on every iOS device. They do not have that opportunity with Signal.


> What Signal et al. protect against is dragnet surveillance

Can that really be claimed when

- the user has to log in to Signal's servers with his phone number in order to communicate, and

- no user can use any servers other than Signal's, which are hardcoded in the apps?

It seems perfectly positioned to collect at least the metadata, and its operators don't want to let you change those rules.


Yes, you depend on Apple's cooperation on iOS and Google's cooperation on Android. Even if you flash your phone with your own custom OS, your radio chips will still run proprietary firmware that can be updated over the air without the OS even knowing.

These aren't high-priority problems for mass surveillance right now, since plenty of people are still using plain-text chat. If you're expecting a wealthy adversary to target you directly, then the only safe move is to avoid technology.


There is no technical restriction. But if Apple did this and someone figured it out, it would probably damage users' trust in them and generate a lot of bad PR.


What's to stop them from modifying the OS itself to spy on you? On a closed platform, you don't have peace of mind from spying.


On an open platform, you don't have peace of mind from spying...

Unless every single part of your platform is open and you build it yourself. But then you have to make sure the tools you use to build it haven't been tampered with.

I think somewhere along the conclusion of that train of thought you'll need to build a fab to make sure there aren't silicon level backdoors in your hardware.


If you care that much, just run your own OS. Many android phones have quite a few options available.


A custom Android OS is the last thing I'd consider secure...


++



