A stab at the sealer in E
Mark S. Miller
Mon, 08 Nov 1999 10:07:25 -0800
[Chip & Doug, be sure to read this one. --MarkM]
At 11:23 PM 11/7/99, email@example.com wrote:
>In terms of your auditors, would it be that the PassByCopy BrandMaker
>would be accepted as confined (or some similar security property)
>while the PassByProxy version would not?
Yes! The PassByProxy BrandMaker of VatS would pass the confinement audit of
VatS's "confined" auditor, but the proxy (remote reference) to it would fail
to pass any other vat's "confined" auditor. Similarly for the sealers,
unsealers, and envelopes it makes. This is proper, as the VatS BrandMaker
can be used by objects on VatS with no loss of security. For the
CryptoBrandMaker, both it and all the objects it makes would pass the local
"confined" auditor on every machine onto which they are copied. (Actually,
this isn't quite true of "confined", for a peculiar reason that would only
distract from the current discussion. The general point is correct.)
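For concreteness, here is roughly what a single-vat sealer/unsealer pair
gives you, sketched in Python rather than E. All names here are my own
invention, and a closure-captured private table stands in for the brand
mechanism; this is a sketch of the shape of the thing, not of any real
implementation:

```python
import itertools

def make_brand(nickname):
    # The private table below is reachable only through the two
    # closures we hand out; it plays the role of a vat-local brand.
    tickets = itertools.count()
    contents = {}

    class Envelope:
        # Opaque to everyone except this brand's unsealer: the
        # ticket is just an index into the private table.
        def __init__(self, ticket):
            self._ticket = ticket

    def seal(obj):
        ticket = next(tickets)
        contents[ticket] = obj
        return Envelope(ticket)

    def unseal(envelope):
        # A different brand's envelopes are instances of a
        # different Envelope class, so they are rejected here.
        if not isinstance(envelope, Envelope):
            raise TypeError("not an envelope of brand " + nickname)
        return contents[envelope._ticket]

    return seal, unseal
```

Each call to make_brand yields an unrelated Envelope class, so two brands'
envelopes are mutually opaque. Note that, like the PassByProxy BrandMaker,
everything here lives in one address space.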
>You might have two flavors
>of BrandMaker in the system, the crypto one which is slow and bulky but
>passes the auditor for cases where that is needed, and the PassByProxy
>one you have now, which is very fast but does not pass the auditor? Then,
>you want to make sure that they are semantically equivalent other than
>this auditor test?
Yes, or close enough to equivalent that we can be confident that, if we take
code written reasonably for the PassByProxy BrandMaker and instead provide it
with the CryptoBrandMaker, we won't break it. Unlike the
physical vs virtual memory case, we don't require the difference to be
undetectable by code that might try to do so. Let's call the first
"cooperative equivalence" and the second "adversarial equivalence".
When we need equivalence for security purposes we generally require
adversarial equivalence. Here, the need for equivalence is just the normal
software engineering need for substitutability -- or object-oriented
polymorphism. Hence cooperative equivalence will do. (Actually, the
CryptoBrandMaker is cooperatively upwards compatible from (conventionally, a
subtype of) the PassByProxy BrandMaker, since it provides stronger software
engineering guarantees: It will continue to work even if we lose the
connection to the machine of origin.)
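A toy illustration of both points, in Python rather than E, with all names
invented here: code written to the common unseal protocol works identically
with either flavor (cooperative equivalence), while the crypto flavor alone
keeps working after we lose the vat of origin (the stronger guarantee):

```python
class ProxyUnsealer:
    """Models a remote unsealer: needs the vat of origin reachable."""
    def __init__(self):
        self.connected = True

    def unseal(self, envelope):
        if not self.connected:
            raise ConnectionError("vat of origin unreachable")
        return envelope["contents"]

class CryptoUnsealer:
    """Models a copied unsealer: unsealing is a purely local
    computation (the actual decryption is elided in this sketch)."""
    def unseal(self, envelope):
        return envelope["contents"]

def open_gift(unsealer, envelope):
    # Cooperative client code: written to the shared protocol,
    # indifferent to which flavor it is handed.
    return unsealer.unseal(envelope)

env = {"contents": "gift"}
proxy, crypto = ProxyUnsealer(), CryptoUnsealer()
assert open_gift(proxy, env) == open_gift(crypto, env) == "gift"

proxy.connected = False              # partition from the vat of origin
assert open_gift(crypto, env) == "gift"   # crypto flavor still works
```

Adversarial code could of course still tell the two apart (by severing the
connection and watching which one fails), which is exactly why this is only
cooperative, not adversarial, equivalence.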
>One problem I still see is that the PassByCopy version could leak
>information about how big the sealed data is. Imagine that instead of
>passing the sealed envelope directly to Bob, Alice first passes it to
>Ted who gives it to Bob, and Ted isn't supposed to know what is inside.
>It is supposed to be opaque to Ted. But the PassByCopy version lets Ted
>know how big the encrypted data is, potentially, while the PassByProxy
>version would not, I think. (Maybe this would only happen if the data
>being sealed was itself PassByCopy?)
You are absolutely correct, and I confess I had not thought about this
issue. (Thereby revealing that I'm not really a cryptographer.) How
important is it? Mustn't this be true of any public key system? Shannon
says the ciphertext obviously cannot be smaller than the compressed
plaintext. It would seem the only way to avoid revealing the size of the
plaintext is to pad to the largest possible ciphertext. Of course, there is
no largest. I'm confused.
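For what it's worth, the usual engineering answer is that you cannot hide
the size entirely (there being no largest ciphertext), but you can coarsen
it: pad the plaintext up to a block quantum before sealing, so an
intermediary like Ted learns only which size bucket the contents fall in.
A toy Python sketch of the arithmetic, with BLOCK an arbitrary parameter I
chose for illustration:

```python
BLOCK = 1024  # padding quantum; a parameter of the scheme

def naive_size(plaintext: bytes) -> int:
    # Ciphertext length tracks plaintext length, so a middleman
    # learns the size exactly (modulo small fixed overhead).
    return len(plaintext)

def padded_size(plaintext: bytes) -> int:
    # Pad up to the next multiple of BLOCK (the +1 leaves room
    # for a length marker): the middleman now learns the size
    # only to the nearest kilobyte, not the exact length.
    blocks = (len(plaintext) // BLOCK) + 1
    return blocks * BLOCK

assert naive_size(b"attack at dawn") == 14
assert padded_size(b"attack at dawn") == 1024
assert padded_size(bytes(4000)) == padded_size(bytes(4090))  # same bucket
```

The leak is bounded, not eliminated: Ted still learns the block count, and
choosing BLOCK trades bandwidth against how much he learns.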
>Is it anticipated that user-written objects would ever be PassByCopy?
Yes! I expect to write CryptoBrandMaker as normal unprivileged E code.
There are many other examples. See the old
http://www.erights.org/elang/same-ref.html , especially the section "Selfish
and Selfless Objects". What I was calling "selfless" then I am calling
"PassByCopy" now. (Obviously I need to update this document.)
>The security properties of such an object would be very different from
>those under the PassByProxy model, because the object would not be able
>to keep secrets from whatever Vat it runs on. It would be necessary to
>be very careful in writing such code, and the security analysis would
>be completely different than for PassByProxy code. The mint code,
>for example, would obviously be completely unsafe if the purse were
>PassByCopy.
Exactly correct! We did such security analyses at EC (now
"Communities.com") as part of the decentralized secure system of social
virtual realities we constructed, "EC Habitats". The analyses were not as
formal as they should have been, but we put a lot of time and energy into 'em. It
was always important to keep straight whether you were dealing with proxied
state or copied state, but we found we were usually able to keep it
straight. This kind of analysis is definitely a cost of our model, but I
believe the price is well worth paying.
The abstraction we used at EC to organize PassByCopy vs PassByProxy issues,
and all their attendant security concerns, is the Unum. E no longer has
Una; I expect it will eventually have them again, but I want to build back
up to it slowly and carefully. In theory, it was a great, highly reusable
pattern incorporating PassByCopy elements and PassByProxy elements,
orchestrated to make good use of the strengths of both. As a piece of engineering,
it was one of those disasters to learn from. I expect to spend the next 10
years unravelling the lessons I learned in those disastrous 2, and build
back up to Una slowly. I'm still convinced that the Unum is a good, perhaps
great, idea.
Sorry for not explaining Una, beyond giving you that historical tidbit for
context. It would take a much longer message to explain Una (which should
really come from Chip anyway), and most of that explanation isn't needed in
the current discussion.
In any case, Una are built purely out of PassByProxy, PassByCopy, and
PassByConstruction, which I haven't yet explained and need to. In a
reductionist sense, Una have no security properties other than the security
properties of distributed capabilities with these three passing modes.
Taking things slowly and carefully this time around, it is proper that we
postpone reinventing Una until we understand the security of these three
passing modes and how they interact. Meanwhile, we can build vast numbers
of useful secure distributed apps and smart contracts before we reinvent Una.
>On the other hand it would be quite useful if such objects could exist,
>whereby you could bring in some code from a foreign machine, audit it
>for safety, and then run it locally. You would feel safe in giving your
>secrets to this piece of code because you'd know that there was nothing
>harmful it could do with them.
I hereby pronounce you to have "gotten" confinement! Very good.