Hacker News

> Apple's solution is iCloud Keychain which is E2E encrypted, so would not be revealed with a court order.

Nope. For this threat model, E2E is a complete joke when both E's are controlled by the third party. Apple could be compelled by the government to insert code in the client to upload your decrypted data to another endpoint they control, and you'd never know.
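To make the concern concrete, here is a toy sketch (all class names hypothetical; XOR with a random key stands in for real authenticated encryption): a backdoored client speaks the exact same E2E protocol as the honest one, so nothing the server stores or the user observes changes, yet the plaintext leaks anyway.

```python
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad-style XOR; stands in for real E2E encryption.
    return bytes(d ^ k for d, k in zip(data, key))

class Server:
    """The provider's endpoint."""
    def __init__(self):
        self.stored = []       # ciphertext the provider legitimately holds
        self.exfiltrated = []  # plaintext a compelled client could also send

class HonestClient:
    def __init__(self, server: Server):
        self.server = server
        self.key = secrets.token_bytes(32)  # never leaves the device
    def sync(self, plaintext: bytes):
        self.server.stored.append(xor_encrypt(plaintext, self.key))

class BackdooredClient(HonestClient):
    # Identical E2E protocol, plus one extra upload the user never sees.
    def sync(self, plaintext: bytes):
        super().sync(plaintext)
        self.server.exfiltrated.append(plaintext)
```

The point of the sketch is that the "backdoor" requires no break in the cryptography at all: the encryption still happens, the ciphertext still syncs, and the compromise lives entirely in the client binary the vendor ships.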

That was tested in the San Bernardino shooter case. Apple stood up and the FBI backed down.

It's incredibly naive to believe Apple will continue to be able to do that.

Yeah, and Microsoft could insert code to upload the BitLocker keys. What's your point? Even Linux could do that if they were compelled to.

> Even Linux could do that if they were compelled to.

An open source project absolutely cannot do that without your consent if you build your client from the source. That's my point.


This is a wildly unrealistic viewpoint. It assumes you somehow know the language the client is written in, have total knowledge of the entire codebase, and can easily spot any security issue or backdoor, assuming you're using software you didn't write yourself (and even then).

It also completely disregards the history of supply-chain incidents like the XZ Utils backdoor, the infected npm packages of the month, and CVEs that sat undiscovered in Linux (a project with thousands of contributors) for over a decade.


You're conflating two orthogonal threat models here.

Threat model A: I want to be secure against a government agency in my country using the ordinary judicial process to order engineers employed in my country to make technical modifications to products I use in order to spy on me specifically. Predicated on the (untrue in my personal case) idea that my life will be endangered if the government obtains my data.

Threat model B: I want to be secure against all nation state actors in the world who might ever try to surreptitiously backdoor any open source project that has ever existed.

I'm talking about threat model A. You're describing threat model B, and I don't disagree with you that fighting that is more or less futile.

Many open source projects are controlled by people who do not live in the US and are not US citizens. Someone in the US is completely immune to threat model A when they use those open source projects and build them directly from the source.
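The "build it directly from the source" step above can be sketched in a few lines (filenames and hashes here are stand-ins, not real releases): before building, compare the downloaded archive against a checksum obtained out-of-band from the upstream maintainers, so a compelled distributor or mirror swapping in a backdoored tarball is caught. This only helps if the reference hash arrives over a channel the compelled party does not control, e.g. a signed release announcement.

```python
import hashlib

def verify_source(archive: bytes, published_sha256: str) -> bool:
    """Compare a downloaded source archive against a checksum published
    out-of-band by the upstream maintainers. If anyone in the download
    path substituted a modified tarball, the digest will not match."""
    return hashlib.sha256(archive).hexdigest() == published_sha256

# Demo with stand-in bytes; a real workflow hashes the downloaded file.
good = b"pristine upstream source tree"
published = hashlib.sha256(good).hexdigest()  # from the maintainers, not the mirror
```

This doesn't defend against a malicious upstream (threat model B), but it does mean a court order served on your local distributor alone can't silently swap the code you compile.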


Wait, I'm sorry, do you build Linux from source and review all code changes?

You missed the important part:

> For this threat model

We're talking about a hypothetical scenario where a state actor obtaining the information protected by the E2E encryption puts your life or freedom in danger.

If that's you, yes, you absolutely shouldn't trust US corporations, and you should absolutely be auditing the source code. I seriously doubt that's you though, and it's certainly not me.

The sub-title of the original Forbes article (linked in the first paragraph of TFA):

> But companies like Apple and Meta set up their systems so such a privacy violation isn’t possible.

...is completely, utterly false. The journalist swallowed the marketing whole.


Okay, so yes, I grant your point that people for whom governments are the threat model should be auditing source code.

I also grant that many things are possible (where the journalist says "isn't possible").

However, what remains true is that Microsoft appears to store this data in a manner that allows it to be retrieved through "simple" warrants and legal processes, compared to Apple, where these encryption keys are stored in a manner that would require code changes to retrieve.

These are fundamentally different in a legal framework, and while that doesn't make Apple the most perfect, amazing company ever, it shames Microsoft for not putting in the technical work to erect these basic barriers to retrieving data.


> retrieved through "simple" warrants and legal processes

The fact it requires an additional engineering step is not an impediment. The courts could not care less about the implementation details.

> compared to Apple where these encryption keys are stored in a manner that would require code changes to accomplish.

That code already exists at Apple: the automated CSAM reporting Apple does subverts their iCloud E2E encryption. I'm not saying they shouldn't be doing that; it's just proof they can, and already do, effectively bypass their own E2E encryption.

A pedant might say "well, that code only runs on the device, so it doesn't really bypass E2E." What that misses is that the code running on the device is under the complete and sole control of Apple, not the device's owner. That code can do anything Apple cares to make it do (or is ordered to do) with the decrypted data, including exfiltrating it, and the owner will never know.


> The courts could not care less about the implementation details

That's not really true in practice, by all public evidence.

> the automated CSAM reporting apple does

Apple does not have a CSAM reporting feature that scans photo libraries; it never rolled out. They only have a feature that can blur sexual content in Messages and warn the reader before viewing.

We can argue all day about this, but yeah - I guess it's true that your phone is closed source so literally everything you do is "under the complete and sole control of Apple."

That just sends us back to the first point, and we can never settle the argument if we disagree about how far the government might compel a company to go to produce data.

