Re: X509 digital certificate for offline solution
From: Valery Pryamikov (valery_at_harper.no)
Date: Tue, 16 Aug 2005 15:03:40 +0200
> circa 1999, one of the major PKI certification authorities approached a
> large financial operation and convinced them that they needed to deploy
> a PKI infrastructure enabling their customers to do online, internet
> account transactions. This was a financial operation that had
> significantly in excess of 10 million accounts.
> the scenario went:
> 1) the financial institution would register a customer public key for
> every account
> 2) the financial institution would transmit the resulting updated
> account database to the certification authority
> 3) the certification authority would munge and re-arrange the bits in
> each account record ... producing one digital certificate for each
> account record.
> 4) after a couple hrs, the certification authority would return all the
> recently produced digital certificates to the financial operation ...
> which would then store them in the appropriate account record and
> convey a copy to the appropriate customer
> 5) the customers would then generate digitally signed account
> transactions, package the account transaction, the digital signature
> and their copy of the digital certificate and transmit it over the
> internet to the financial operation.
> 6) the financial operation would pull the account number from the
> transaction, retrieve the corresponding account record, verify the
> digital signature using the public key in the account data base ... and
> NEVER have to make any reference to the digital certificate at all
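The six steps above can be sketched as a toy model (using Ed25519 from the `cryptography` package; the account layout, field names, and transaction format are made up for illustration):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# step 1: the customer's public key is registered in the account record
customer_key = ed25519.Ed25519PrivateKey.generate()
account_db = {"acct-42": {"balance": 1000,
                          "public_key": customer_key.public_key()}}

# step 5: the customer signs a transaction and packages it together with
# a copy of the digital certificate (which the back end will never read)
txn = b"acct-42:debit:250"
package = {"txn": txn,
           "sig": customer_key.sign(txn),
           "certificate": b"...redundant, stale, never referenced..."}

# step 6: pull the account number, fetch the on-file public key, verify --
# the attached certificate plays no part at all
def verify_txn(package, db):
    acct = package["txn"].split(b":")[0].decode()
    public_key = db[acct]["public_key"]
    try:
        public_key.verify(package["sig"], package["txn"])
        return True
    except InvalidSignature:
        return False

print(verify_txn(package, account_db))   # -> True
```

Note that the relying party's real-time account record supplies everything verification needs; the certificate field is dead weight in the payload.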
> the financial operation had spent nearly $50million on integrating a
> PKI infrastructure when it dawned on somebody to do the complete
> financials. they had already agreed that the certification authority would get
> $100/annum/account for the production of (redundant and superfluous)
> digital certificates that need NEVER actually be used.
> doing the complete financials resulted in somebody realizing that the
> financial operation would be paying the certification authority
> $100m/annum per million accounts (or $1b/annum per 10 million accounts)
> for redundant and superfluous digital certificates that would NEVER
> actually be used ... aka certificateless operation (other than the
> internet payload for continuously transmitting the
> digital certificates hither and yawn)
> the financial operation eventually canceled the project and took the
> $50m hit.
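The arithmetic, for anyone checking (figures exactly as stated in the post):

```python
# $100 per account per annum, paid for certificates that are never used
fee_per_account = 100
annual_cost = {accounts: fee_per_account * accounts
               for accounts in (1_000_000, 10_000_000)}
for accounts, cost in annual_cost.items():
    print(f"{accounts:,} accounts -> ${cost:,}/annum")
# 1,000,000 accounts -> $100,000,000/annum
# 10,000,000 accounts -> $1,000,000,000/annum
```

The recurring fee dwarfs even the $50m integration write-off within the first year.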
> this was actually a relying-party-only certificate scenario
> where the operational account contains all the information about the
> entity and the entity's public key (as well as a copy of the entity's
> public key and a stale, static copy of a subset of the entity's
> operational information in the form of a digital certificate).
> this is offline, from the standpoint of the relying-party not needing
> to contact the certification authority when processing a digitally
> signed transaction ... in part, because the relying party actually has
> all the real-time operational information as part of executing the
> transaction (and NEVER actually needs to reference the redundant and
> superfluous, stale, static digital certificate).
> however, the certification authority was originally expecting to be
> paid $100m per million accounts (well in excess of a billion dollars) per
> annum for the redundant and superfluous, stale, static (and need NEVER
> be referenced) digital certificates.
> now, a number of operations have used tamper-resistant hardware tokens
> (USB dongles) as a repository for protecting the confidentiality of
> private keys. This becomes truly a "something you have" operation ...
> since the hardware tokens perform operations using the embedded private
> key ... but the private key never exists outside the confines of the
> hardware token. human operators and other agents can still compromise any system usage
> involving the private keys ... which is an overall system security
> integrity issue. however, the private keys are never divulged ...
> eliminating the system security confidentiality issue with regard to
> the private keys ... and crooks can't obtain the private keys and setup
> a counterfeit operation impersonating the original system (possibly
> unknown to the original operation).
> this is my frequent refrain that most operations treat public key
> operation as a "something you have" authentication ... aka the party
> has access to and use of the corresponding private key. When a purely
> software implementation is used ... there typically are attempts to
> closely emulate real hardware token operation ... however software
> emulation of hardware tokens has several more threats, exploits and
> vulnerabilities compared to real hardware tokens.
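The difference can be illustrated with a toy model (a hypothetical interface using Ed25519 from the `cryptography` package; no real token API is implied):

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

class HardwareToken:
    """Security perimeter = the physical token: key bytes never cross it."""
    def __init__(self):
        self._key = ed25519.Ed25519PrivateKey.generate()  # lives inside
    def sign(self, data: bytes) -> bytes:
        # the only exported operation -- "something you have" authentication
        return self._key.sign(data)
    # deliberately no method that returns the private key itself

class SoftwareToken(HardwareToken):
    """Security perimeter = the whole host: key bytes are reachable."""
    def export_private_key(self) -> bytes:
        # a compromised host can copy this -- the extra confidentiality threat
        return self._key.private_bytes(serialization.Encoding.Raw,
                                       serialization.PrivateFormat.Raw,
                                       serialization.NoEncryption())

hw, sw = HardwareToken(), SoftwareToken()
sig = hw.sign(b"txn")              # use of the key can be compromised ...
stolen = sw.export_private_key()   # ... but only the software key can leak
```

A compromised host can misuse either token while it is attached (the integrity issue), but only the software emulation lets crooks walk away with the key and set up a counterfeit operation (the added confidentiality issue).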
> one way of looking at this issue is where the security perimeter
> lies. the security perimeter for a hardware token ... tends to be the
> physical space of the token. the security perimeter for a software
> emulated hardware token may be all the components of the computer where
> that software is running.
> for financial environments ... like PIN-debit ... there are external
> tamper-resistant hardware boxes that do all PIN related processing for
> the financial institution. PINs are entered in the clear at POS
> terminals and ATMs ... but then immediately encrypted. From then on, they
> never appear in the clear. the backend gets the transaction and sends
> it to the box ... and gets back some answers ... but standard operation
> never sees the PIN in the clear.
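A stdlib-only toy model of that flow (the key management and PIN-block formats of a real HSM deployment, e.g. DUKPT, are far more involved; names and shapes here are made up):

```python
import hashlib
import hmac
import os

class PinHSM:
    """Tamper-resistant box: the only place reference PINs exist."""
    def __init__(self, terminal_key: bytes):
        self._key = terminal_key
        self._pins = {"acct-42": b"1234"}     # never leaves the box
    def verify(self, acct: str, nonce: bytes, pin_block: bytes) -> bool:
        expected = hmac.new(self._key, self._pins.get(acct, b"") + nonce,
                            hashlib.sha256).digest()
        return hmac.compare_digest(expected, pin_block)

terminal_key = os.urandom(32)                 # shared terminal<->HSM secret
hsm = PinHSM(terminal_key)

# at the POS terminal/ATM: the PIN is keyed in and immediately transformed
nonce = os.urandom(16)
pin_block = hmac.new(terminal_key, b"1234" + nonce, hashlib.sha256).digest()

# at the back end: opaque bytes in, yes/no out -- no clear-text PIN here
def authorize_txn(acct: str, nonce: bytes, pin_block: bytes) -> bool:
    return hsm.verify(acct, nonce, pin_block)

print(authorize_txn("acct-42", nonce, pin_block))   # -> True
```

An insider on the back end sees only opaque blocks and yes/no answers, which is exactly why harvesting PINs there is not the attack you find in practice.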
> the (security) integrity of the backend systems might be compromised by
> insiders ... but you don't find insiders harvesting PINs from the
> backend systems (i.e. security confidentiality) and using them in
> impersonation attacks with counterfeit transactions.
> part of this is that security integrity compromises tend to be a lot
> more difficult than security confidentiality compromises (copying the
> data). security integrity compromises also tend to leave a lot more
> traces back to the people responsible ... compared to security
> confidentiality compromises.
> one of the claims that we've frequently made with respect to aads chip
> for public key operation, was being free from having to worry about the
> ins & outs of PKIs and digital certificates .... we were able to
> concentrate on the fundamental threats and vulnerabilities of the
> actual operation of public key processes. For instance, a very
> fundamental threat and vulnerability is the integrity and
> confidentiality of the private key. If i were looking at paying
> $100/annum on stuff associated with public key operation ... I might
> consider it much better spent on hardware tokens than on digital
> certificates. in fact, slightly related are the security proportional to
> risk scenarios ... i.e. grading the integrity levels of hardware tokens
> for use with operations involving different levels of risk.
> that is, we've taken it as a given that the integrity of the
> originating/remote environment is taken into account when evaluating a
> transaction for authorization. this includes the risk level associated
> with whether or not a real hardware token is being used and if a real
> hardware token is being used ... the evaluated, integrity of that token
> (which might change over time as technology changes). For a large
> percentage of the business processes in the world, we assert that the
> integrity level of the remote end is of more importance than a lot of
> personal information about the remote entity (which the vast majority
> of operations already have on file ... so it is of little interest to
> duplicate such information in digital certificates).
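A minimal sketch of such risk-graded authorization (the grades, names, and limits are entirely made up):

```python
# made-up assurance grades for the originating environment
TOKEN_ASSURANCE = {"software-key": 1, "hardware-basic": 3, "hardware-eval": 5}

# made-up per-grade transaction limits: security proportional to risk
LIMIT_BY_ASSURANCE = {0: 0, 1: 100, 3: 10_000, 5: 1_000_000}

def authorize_by_risk(amount: int, token_kind: str) -> bool:
    """Weigh the integrity of the remote end against the transaction risk."""
    assurance = TOKEN_ASSURANCE.get(token_kind, 0)
    return amount <= LIMIT_BY_ASSURANCE[assurance]

print(authorize_by_risk(50, "software-key"))        # low-risk txn, weak end: ok
print(authorize_by_risk(50_000, "hardware-basic"))  # too risky for this grade
```

The grades would naturally be re-evaluated over time as token technology (and attacks on it) change.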
> so another simple test ....
> i would assert that the integrity level of the originating environment
> (software token or hardware token and the assurance level of such
> token) is one of the primary pieces of information that would be of
> interest to a relying-party ... right up there with what is the public
> key. so a real business oriented digital certificate would not only
> give the public key ... but also provide the integrity level of the
> environment protecting the private key.
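As a sketch of what such a field might look like (a made-up, minimal certificate model, not real X.509/ASN.1, with a hypothetical extension identifier):

```python
from dataclasses import dataclass, field

# hypothetical extension identifier -- not a registered OID
KEY_PROTECTION_OID = "1.3.6.1.4.1.99999.1"

@dataclass
class Certificate:
    subject: str
    public_key: bytes
    extensions: dict = field(default_factory=dict)

def key_protection_level(cert: Certificate) -> str:
    """What a relying party would read: how is the private key protected?"""
    return cert.extensions.get(KEY_PROTECTION_OID, "unknown")

cert = Certificate("acct-42", b"...public key bytes...",
                   {KEY_PROTECTION_OID: "hardware-token/evaluated-EAL4"})
print(key_protection_level(cert))   # -> hardware-token/evaluated-EAL4
```

A relying party could then weigh that value (or its absence) when deciding how much to trust transactions signed under that key.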
> when i examined x.509 fields several years ago ... i couldn't find one
> that provided the integrity level of the private key protection ...
> although some have a simple flag that can be used to indicate whether it
> is a software private key or a hardware token private key. how many
> certification authorities have you heard of that have a process of
> checking whether they are certifying a software private key or a
> hardware token private key?
> the literature has lots of stuff about the integrity level of
> public/private keys based on the number of bits in the key ... but I
> haven't seen anything on the integrity level of private key protection
> ... and/or writeups on applications that even make decisions based on
> whether they find they are dealing with a software private key or a
> hardware private key.
> another indication was a couple years ago, i was giving a talk on the
> importance of the private key protection integrity level ... and
> somebody in the audience (from some gov. agency) said that if i would
> provide the full definition they would see that it was added to x.509v3
This is a really interesting case study that clearly demonstrates the problem of
misplaced trust and the dirty play of that "one of the major certification
authorities". By all means, the financial institution could have set up their own CA
for that project... and of course blind signatures could have suited
relying-party-only scenarios better... but I don't know about their options to
license blind signatures at that time (great that the patent has expired now).