Saagar Jha

Finally got around to reading about Private Cloud Compute and let’s just say I’m not super impressed
Frog is holding a box (presumably filled with cookies) as he converses with Toad.

Frog ran the inferencing in Private Cloud Compute. “There,” he said. “Now we can cryptographically attest that we do not see the request.”
“But you control the hardware root of trust,” said Toad.
“That is true,” said Frog.
4 replies →

Tim Mahoney

(replying to Saagar Jha)

@saagar what stuff did you read? I haven’t looked into it, but I’m super interested in what it can be used for.

Tim Mahoney

(replying to Saagar Jha)

@saagar so you’re saying that since Apple controls the hardware, Apple could potentially violate the privacy guarantees in a way that isn’t verifiable/visible externally?

1 reply →

Saagar Jha

(replying to Tim Mahoney)

Tim Mahoney

(replying to Saagar Jha)

@saagar

So even if the software guarantees/attestation/etc remain secure, some sort of hardware change could compromise the system?

I find it hard to believe that a vulnerability like that would be overlooked. I guess I assume the flow would also include hardware attestation.

Or is it something else I’m missing?

Saagar Jha

(replying to Tim Mahoney)
@_tim______ Overlooked by whom? I don’t see the server. Apple can do whatever they want to it. Like, for example, it could have a little attachment on it that waits to pull decrypted data out of RAM (don’t take this example too seriously)
1 reply →

Tim Mahoney

(replying to Saagar Jha)

@saagar overlooked by Apple. Unless the insinuation is that Apple purposefully left a vulnerability, which I’d find hard to believe.

You’re saying some sort of physical attachment to the hardware? I’m out of my area of expertise here, so I’m not sure what’s actually possible.

Saagar Jha

(replying to Tim Mahoney)
@_tim______ Not a vulnerability. This would be Apple acting intentionally to peek inside their servers
1 reply →

Ben!

(replying to Saagar Jha)

@saagar @_tim______

“Our threat model for Private Cloud Compute includes an attacker with physical access to a compute node and a high level of sophistication — that is, an attacker who has the resources and expertise to subvert some of the hardware security properties of the system and potentially extract data that is being actively processed by a compute node.”

Saagar Jha

(replying to Ben!)
@enefekt @_tim______ Yes it’s good that they understand the actual threats well but the problem is their threat model starts off with an attacker with problematic capabilities. Like, “we model an attack that can already access data” is…not what Craig is saying in interviews

Saagar Jha

(replying to Saagar Jha)
@enefekt @_tim______ Normally you have a threat model that’s like “we think an attacker has arbitrary read” (not a desirable end goal) and it ends with “we prevent them from getting code execution” (desirable end goal). They start from an unfortunate place

Saagar Jha

(replying to Saagar Jha)
@_tim______ @enefekt If someone said “our threat model is that an attacker has access to your iMessage conversations” and they went on to say “we think it’s hard for them to get a *specific* message from your phone, without exfiltrating all of them” you’d go “wtf”

Saagar Jha

(replying to Saagar Jha)
@_tim______ @enefekt You’d think that was an awful threat model–attackers have access already! But that’s exactly what they’re starting from here. And mind you, their protection is “we think we’d find them if they did a broad attack” not “it’s technically infeasible to do this”

Saagar Jha

(replying to Saagar Jha)
@_tim______ Like, Apple is selling that they are better than their competitors because even though their competitors can put “we don’t look at your data” in their privacy policy they can do it anyways. But Apple can also do the same thing

Tim Mahoney

(replying to Saagar Jha)

@saagar I believe they’re saying they _can’t_ do the same thing, but I might be wrong.

Saagar Jha

(replying to Tim Mahoney)
@_tim______ Their advertising says this. Their blog post does not corroborate it

Saagar Jha

(replying to Saagar Jha)
@_tim______ Who is checking for that? How often? There is a note about a third party audit but we don’t know anything about that yet. And I’m still not sure how much trust that would confer

Tim Mahoney

(replying to Saagar Jha)

@saagar That’s a good question. You’d hope that this whole system was provably secure even if someone has access to all the hardware, software, everything. I guess we’ll see as more information comes out.

Saagar Jha

(replying to Tim Mahoney)
@_tim______ See I don’t think we actually know how to do this yet at a theoretical level
2 replies →

Tim Mahoney

(replying to Saagar Jha)

@saagar from what I gather, part of the hard problem here is running the code in an environment that is protected from peeking at the data even while the execution is in progress. Basically like Intel SGX, but not compromised.

The write up mentions the Secure Enclave. Maybe Apple figured it out?

Mark Pauley

(replying to Tim Mahoney)

@_tim______ @saagar I am assuming that if you cannot cryptographically guarantee the actual topology of the hardware it’s running on then you can never be 100% secure if Alice can’t even trust Bob. Maybe with homomorphic encryption?

Light

(replying to Mark Pauley)

@unsaturated @_tim______ @saagar
>Maybe with homomorphic encryption?
That's what I assumed was being talked about at first.

Mark Pauley

(replying to Light)

@light @_tim______ @saagar I’ve heard HE basically doesn’t work :tiredcat:


Dominic Hopton

(replying to Saagar Jha)

@saagar @_tim______ in many ways this whole space is riddled with ‘no liquids’ security theater. You only need *one* person to get through the ‘perimeter’ with a ‘magic dongle’ that reads stuff off the side of the rack. Or *one* person to open a gap. If you don’t get it through the first 10,000 times, it’s OK. But that 10,001st time…

Tim Mahoney

(replying to Tim Mahoney)

@saagar if any of my code or PR comments are involved, do you think the lawyers will enjoy my jokes?


demofox

(replying to Saagar Jha)

@saagar fully homomorphic encryption is pretty nice for this. Maybe you are referring to implementation details though, like just a bad setup of good technology?

Saagar Jha

(replying to demofox)
@demofox That’s not a thing. Nobody has done a performant setup of homomorphic encryption, much less run a large machine learning model in it

demofox

(replying to Saagar Jha)

@saagar yeah that's a fair take! People do use FHE, but only in really extreme situations, as far as I know.

Saagar Jha

(replying to demofox)
@demofox If they rolled this out I would be suitably impressed

demofox

(replying to Saagar Jha)

@saagar same here. It'd also have nice use for peer to peer deterministic simulation video games. Like RTSs and some sports games.
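[For readers following the homomorphic-encryption tangent: a toy sketch of the “compute on encrypted data” property being debated, using textbook Paillier, which is additively homomorphic only, not full FHE. Parameters are deliberately tiny and insecure, and — as the thread notes — this is not a technique PCC actually uses.]

```python
# Toy Paillier cryptosystem: ciphertexts can be ADDED without ever
# decrypting them. Illustration only -- tiny primes, no padding, and
# nowhere near the fully homomorphic schemes needed to run an ML model.
import math, random

p, q = 293, 433                      # toy primes (never this small in practice)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)         # Carmichael function of n = p*q
mu = pow(lam, -1, n)                 # valid because we fix g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # c = (1 + n)^m * r^n  mod n^2
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = encrypt(17), encrypt(25)
# Multiplying ciphertexts adds the plaintexts -- no key involved:
assert decrypt((a * b) % n2) == 17 + 25
```

The catch the thread alludes to: addition alone is cheap, but evaluating the multiplications and non-linearities of a large model under FHE is orders of magnitude slower than plaintext inference, which is why nobody has shipped it at this scale.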


Paul Cantrell

(replying to Saagar Jha)

@saagar @jenniferplusplus
It seems in my ignorance here that there’s just a fundamental disagreement about the threat model: if you think that the Apple corporation in its entirety is non-malicious by definition — and if you’re an Apple exec, isn’t it? — the blog post makes a bit more sense. “You don’t have to trust us because you can trust us!”

If on the other hand you’re worried about the call coming from inside the house….

Saagar Jha

(replying to Paul Cantrell)
@inthehands @jenniferplusplus I think all this work was done with the intent to make cloud computing more secure against attackers and some marketing idiot ran with it as “wouldn’t it be great if we could flex on the privacy of other companies’ cloud computing efforts”

Saagar Jha

(replying to Saagar Jha)
@inthehands @jenniferplusplus I would bet $5 some exec specifically had a goal of “I want to go up on stage and talk about how this is basically as private as what happens on iPhone”

J4YC33 ❌

(replying to Saagar Jha)

@saagar CDNs do a similar thing to this. With SSL.

Saagar Jha

(replying to J4YC33 ❌)
@j4yc33 Uh does this involve them having private keys

Saagar Jha

(replying to Saagar Jha)
Apple seems to just categorically fail at threat models that involve themselves. I guess for iPhone you just suck it up and use it anyway but for this the whole point is that it’s supposed to be as secure as on-device computation so this is kind of important

Saagar Jha

(replying to Saagar Jha)
Even shelving insider threat, there are a lot of words for “we did TPM”. Which as everyone knows (at least, once @osy86 has done their job) is designed around a long chain of things that verify each other–and is only as secure as the weakest link.
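[A minimal sketch of the “long chain of things that verify each other” being described here: TPM-style measured boot, where each stage’s hash is folded into a running register before it runs. Stage names are made up for illustration; this is not Apple’s actual implementation.]

```python
# Sketch of measured boot: a register (PCR) is "extended" with the hash
# of each boot stage. The final digest attests WHICH code booted -- it
# says nothing about what that code later does at runtime.
import hashlib

def extend(pcr, component):
    # PCR extend: new = H(old || H(component)); order-dependent, one-way
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

pcr = bytes(32)                      # register starts zeroed at reset
for stage in [b"boot ROM", b"iBoot", b"kernel", b"inference stack"]:
    pcr = extend(pcr, stage)

# Swap any link and the final digest changes, so a verifier comparing
# against known-good values detects modified *code*:
tampered = bytes(32)
for stage in [b"boot ROM", b"iBoot", b"evil kernel", b"inference stack"]:
    tampered = extend(tampered, stage)
assert tampered != pcr

# But an attacker who exploits the measured software *after* it boots
# never touches the register: the attestation still checks out.
```

This is the “weakest link” point: the chain is only as strong as the least-secure measured component, and a runtime exploit of a correctly measured stage is invisible to the measurements.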

Saagar Jha

(replying to Saagar Jha)
@osy86 To be 100% clear: you know how NSO or Cellebrite keep hacking iPhones? This thing is made so that if you do that to PCC, you get to see what is going on inside of it. And because of how TPMs work it will likely send back measurements to your phone that attest cleanly
1 reply →

osy

(replying to Saagar Jha)

@saagar if Apple's "TPM" is the Secure Enclave Boot Monitor then your Cellebrite malware cannot corrupt the measurement without bypassing SCIP support.apple.com/guide/securi

1 reply →

Saagar Jha

(replying to osy)
@osy86 Yes but if you exploit the measured software it’s not attesting what you care about

osy

(replying to Saagar Jha)

@saagar Yeah but see any such exploit would be found by external researchers and reported to Apple obviously

spv

(replying to osy)

@osy86 @saagar obviously, would never be sold in a shady deal to any government agencies or private companies the big guys use to escape blame (fine a few corps, execs face nothing, gov faces even more nothing)

oh wait

osy

(replying to osy)

@saagar in TPM parlance, Apple (only) implements a D-RTM, which does hardware enforcement to ensure measured code is what's executed

Saagar Jha

(replying to Saagar Jha)
@osy86 The “solution”, as far as I can tell, is that Apple thinks they would catch attempts to hack their servers. Oh yeah also hacking the server is hard because they used Swift and deleted the SSH binary. Not like they ship an OS like that already to a billion people
2 replies →

osy

(replying to Saagar Jha)

@saagar I think their "solution" is "look if there's bug in our code, it would be found by the community" which assumes enough security researchers would care to "jailbreak" PCC servers

Saagar Jha

(replying to osy)
@osy86 Well that is part of it but also read the bits where they are like “oh you cannot do a targeted attack without hacking all our servers and we will probably catch you if you do that”

Saagar Jha

(replying to Saagar Jha)
@osy86 It is left as an exercise for the reader if you don’t want to do a targeted attack (?) or you do it so that they don’t catch you (???)

Dominic Hopton

(replying to Saagar Jha)

@saagar @osy86 Microsoft’s Azure Department would like to get in contact with them to sell them security!

Saagar Jha

(replying to Saagar Jha)
@osy86 Also other people have been grumbling about this but I’ll come out and say it: gtfo with your “auditability”. You don’t care about auditability. You care about your intellectual property. This blog post is hilariously nonsensical

Saagar Jha

(replying to Saagar Jha)
@osy86 You can’t say “researchers can inspect both hardware and software [for iPhone]” and then later go “this is the first time we are providing plaintext iBoot”. You made it difficult for researchers to do their job for *years*. Now suddenly they are providing a vital service?

Saagar Jha

(replying to Saagar Jha)
@osy86 Like, you decide to hand researchers a decrypted version (of what is still a binary blob, to be clear) the moment you needed to advertise that your cloud was safe and secure? What do you want, a medal?
3 replies →

osy

(replying to Saagar Jha)

@saagar I know it's not your intention but I find it funny that all the messages look like they're addressed directly to me. Yes sir I will consult with Mr Cook and get back to you.

Saagar Jha

(replying to osy)
@osy86 Appreciate it

Thijs Alkemade

(replying to Saagar Jha)

@saagar They don’t have the budget for a code audit, so they crowdsource it. And they can just close all reports that require Apple to act maliciously as “Informative”.


spv

(replying to Saagar Jha)

@saagar @osy86 yes, i actually do want a medal, thank you.