Thoughts on droplets
Wed, 3 Nov 1999 10:51:39 -0500

> Presumably the "proper hardware" must also be tamperproof (or some vital
> parts of the OS must run on tamperproof hardware).

A couple of observations on this that I think I can safely make.

There is always a point at which one draws a line and says: from here down
I trust.  Where that line sits depends on the needs of your application.
For many applications, it suffices to establish with confidence that you
are running on a hardware implementation of a certain instruction set (i.e.
a Pentium or an AMD chip as distinct from an emulator), and that the BIOS
was an acceptable BIOS.  For others, one might decide it was sufficient
that the user had, say, EROS installed on their disk and not worry about
data forensics against the drive.

I do not believe that trusting a completely wide-open platform is a good
idea.  Tyler clearly feels (and has stated) that a number of reasonable
eCommerce applications can run with satisfactory guarantees under existing
operating systems.  For two reasons I do not believe this:

1. In reality, applications are installed on devices by users.  Turning off
the network services is therefore not adequate. Telling users not to
install applications is in most environments unrealistic.

2. The current virus development world is motivated almost purely by ego.
Statistically speaking, there is no financial incentive in creating viruses
today.  As commerce moves online, this changes drastically.  When I
consider how bad the current virus situation is, and then consider the
impact that changes in incentive structure should be expected to have, I
shudder.  While I understand that nobody can *make* the end user behave
intelligently, I nonetheless find the idea that we should rely on the end
user to see to their own security reprehensible. It is morally comparable
to placing loaded guns in the hands of small children and saying that it is
the child's responsibility not to pull the trigger with the gun pointed at
their head. The vast majority of end users are simply not competent to
achieve security on their systems.

The main problem is that the incentive structure is wrong -- there are
STRONG incentives for the user to install new software in ignorance.
Stipulating that my view is purely subjective, I am much happier with a
position that says: "... provided the user doesn't engage in data
forensics"  or perhaps "provided the user doesn't reflash the BIOS prom."

It's not that I want the security to be perfect; security is a problem in
economic tradeoffs.  It's rather that I want the line to be drawn in a
place that seems consistent with the incentives experienced by the
"responsible" parties.

Jonathan S. Shapiro, Ph. D.
Research Staff Member
IBM T.J. Watson Research Center
Phone: +1 914 784 7085  (Tieline: 863)
Fax: +1 914 784 7595

Ben Laurie <> on 11/03/99 07:13:59 AM

To:   Jonathan S Shapiro/Watson/IBM@IBMUS
Subject:  Re: Thoughts on droplets

Jonathan S Shapiro wrote:
> > How do we ascertain that it is, in fact, running on tamper-proof
> > hardware?
> You engage in a challenge/response protocol with the tamperproof card.
> How the card verifies that a proper OS is running on proper hardware is
> something I cannot comment on at this time.
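The quoted exchange can be sketched roughly as follows. This is purely an
illustration under the assumption of a secret shared between the card and
the verifier; the names, `CARD_SECRET`, and the MAC construction are mine,
not Shapiro's (he explicitly declines to say how the real verification
works, and a real card would more likely hold a private signing key that
never leaves it):

```python
import hashlib
import hmac
import os

# Hypothetical shared secret embedded in the tamperproof card at
# manufacture (assumption for illustration only).
CARD_SECRET = b"example-secret-embedded-at-manufacture"

def card_respond(challenge: bytes) -> bytes:
    # What the card computes: a MAC over the verifier's fresh nonce,
    # proving possession of the embedded secret.
    return hmac.new(CARD_SECRET, challenge, hashlib.sha256).digest()

def verifier_check(expected_secret: bytes) -> bool:
    # The verifier's half of the exchange.
    nonce = os.urandom(16)            # fresh challenge, defeats replay
    response = card_respond(nonce)    # in reality: sent over the wire
    expected = hmac.new(expected_secret, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)
```

The fresh nonce is the essential part: without it, a recorded response
from a genuine card could simply be replayed by an impostor.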

Presumably the "proper hardware" must also be tamperproof (or some vital
parts of the OS must run on tamperproof hardware).

I see two interesting issues here:

1) There ain't no such thing as tamperproof hardware (so far, but I
don't see how there will be in the nearish future, either).

2) There's an interesting bootstrapping problem: presumably the
tamperproof hardware generates the key after manufacture (else it can be
stolen). Someone then has to take the corresponding public key and sign
and circulate it. Using another tamperproof key (and a pile of
tamperproof hardware). I see two problems. The first is the infinite
regression involved, but that can be broken by having one (or a few)
"high-confidence" keys (i.e. keys that _may_ have been tampered with but
we're pretty sure haven't). That, though, leads to the second problem: we
have to trust a smallish number of people who a) have us all over a barrel
(and hence can't be trusted), and b) probably work for the NSA (and hence
can't be trusted).
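The chain-breaking move above -- a small set of high-confidence keys that
everything else hangs off -- can be sketched like this. The `toy_sign`
construction, the key names, and the `Cert` shape are all mine, purely
for illustration (a real deployment would use public-key signatures, so
that checking a certificate never requires the signer's secret):

```python
import hashlib
from typing import List, NamedTuple

def toy_sign(signer_secret: bytes, data: bytes) -> bytes:
    # Toy stand-in for a signature: hash of secret plus data.
    # Illustration only; not a real signature scheme.
    return hashlib.sha256(signer_secret + data).digest()

class Cert(NamedTuple):
    subject: str       # key being vouched for
    signer: str        # key doing the vouching
    signature: bytes   # toy_sign(secrets[signer], subject)

def chain_is_trusted(chain: List[Cert], secrets: dict,
                     high_confidence_roots: set) -> bool:
    # Walk the chain; the infinite regression stops when we reach a
    # key we have simply decided to accept.
    for i, cert in enumerate(chain):
        if toy_sign(secrets[cert.signer],
                    cert.subject.encode()) != cert.signature:
            return False               # forged or corrupted link
        if i + 1 < len(chain) and chain[i + 1].subject != cert.signer:
            return False               # links don't connect
    return chain[-1].signer in high_confidence_roots
```

For example, a card key signed by a factory key signed by a root key is
accepted if (and only if) that root is in the high-confidence set --
which is exactly where the "smallish number of people" problem lives.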




"My grandfather once told me that there are two kinds of people: those
who work and those who take the credit. He told me to try to be in the
first group; there was less competition there."
     - Indira Gandhi