[cap-talk] Why tokens have short lifetimes in OAuth-WRAP
kenton at google.com
Sun Mar 21 17:46:57 PDT 2010
On Sat, Mar 20, 2010 at 8:42 PM, David-Sarah Hopwood <
david-sarah at jacaranda.org> wrote:
> The server can temporarily cache the results of a revocation check.
> If the server performs revocation checks only on giving out tokens, and
> it performed its last check at time t ago, then it may give out a token
> with expiration time at most T-t. If it performs revocation checks on
> requests, OTOH, then it may give out a token with arbitrary expiration
> time. In the latter case, it must check for revocation whenever its
> last check was more than time T ago.
> Notice that the server must perform at least as many revocation checks
> in the first approach (using short-term tokens) as in the second (using
> long-term tokens). In addition, it has the extra bandwidth and processing
> overhead of renewing tokens, which isn't necessary at all in the second
> approach.
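The bound David-Sarah describes can be sketched in a few lines. This is a hypothetical illustration, not code from any OAuth-WRAP implementation: assume T is the maximum tolerable revocation latency, and that the issuer caches the time of its last revocation check.

```python
import time

T = 300.0  # maximum tolerable revocation latency, in seconds (illustrative value)

class TokenIssuer:
    """Sketch of the first approach: revocation is checked only when
    tokens are issued, so a token's lifetime must be capped at T minus
    the age of the cached check ("T - t" in the quoted text)."""

    def __init__(self):
        self.last_check = 0.0  # time of the most recent revocation check

    def check_revocation(self):
        # stand-in for the real (possibly remote) revocation lookup
        self.last_check = time.time()

    def issue_token(self):
        age = time.time() - self.last_check  # this is "time t ago"
        if age >= T:
            # cached result too stale to issue anything; refresh it
            self.check_revocation()
            age = 0.0
        # capping the lifetime at T - t keeps total revocation
        # latency (staleness of check + token lifetime) at most T
        return {"expires_at": time.time() + (T - age)}
```

Note that issuing more tokens never requires more than one revocation check per interval T, which is exactly why the second approach (checking on requests, at the same interval) does no fewer checks.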
Again, consider the case where we have 1000 replicas of the server. Each
request for the resource may go to a different replica. In order for your
caching approach to work, all these replicas must share a cache, which means
that reading or writing the cache involves additional network hops. Not
that I think this is a compelling argument against long-lived keys, but I do
think there are complications to consider.
I think, though, that the real issue here may be more about failure modes.
In a system with volatile keys, it's easier to see that the "rest state" of
the system -- that is, the state it tends towards when things fail -- is for
grants to be revoked, rather than for keys to remain valid when they should
have been revoked. With long-lived keys, you have to do extra operations to
revoke a key, and if those operations fail, the key remains valid, possibly
allowing security breaches. With volatile keys, you have to do extra
operations to *prevent* a key from being revoked; otherwise it gets revoked.
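The fail-closed property can be made concrete with a lease-style sketch. Everything here is illustrative (the class, the 60-second lease), not part of OAuth-WRAP: validity persists only while something actively renews it, so any failure lets the grant lapse into the revoked rest state.

```python
import time

class VolatileGrant:
    """Sketch of a fail-closed grant: it stays valid only while someone
    actively renews it. Forgetting to renew -- or any failure in the
    renewal path -- causes it to expire, i.e. the system's rest state
    is 'revoked'. The 60-second lease is an arbitrary example value."""

    LEASE = 60.0  # seconds each renewal keeps the grant alive

    def __init__(self):
        self.valid_until = time.time() + self.LEASE

    def renew(self):
        # the *extra* operation: if this stops happening, revocation
        # occurs automatically, with no explicit revoke step needed
        self.valid_until = time.time() + self.LEASE

    def is_valid(self):
        return time.time() < self.valid_until
```

Contrast this with a long-lived key, where the extra operation is the revoke itself, and a failure in that path leaves the key valid.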