Re: X509 digital certificate for offline solution
Date: 14 Aug 2005 14:26:25 -0700

Valery Pryamikov wrote:
> It's a bit embarrassing for me to admit that until now I didn't even check
> the original question ;-). But I don't think it was a question about business
> process applicability, but rather a sign of complete misconception. My
> understanding of the original question is that the op was asking about a way of
> protecting a piece of information that is used by some service (daemon) from
> everyone else using this computer, including administrator/root (because if
> it was only about protecting against unprivileged users of this computer --
> simple access control would be more than enough). Of course PKI is
> completely irrelevant here!... but any other encryption related technology
> is irrelevant here as well... Since the service/daemon requires the protected
> information in clear text, the decryption key must be
> accessible to that service on that computer, but that automatically makes
> this secret key accessible to the administrator/root of this computer as
> well. The op's problem, as it is, is closer to DRM than to anything else
> (i.e. store the secret key and cipher text in one place and hope that nobody
> will be able to put them together).

remember in my context, i described asymmetric cryptography as
technology ... and public keys, digital signatures and PKIs as all
business processes

... so a case study from another PKI scenario where the relying-party
is offline and/or doesn't have realtime direct contact with the
certification authority (which somewhat turns out to actually be the
original design point for PKIs ... the offline situation where the
relying-party doesn't have realtime, online and/or local resources for
resolving information regarding first time communication with a
complete stranger).

circa 1999, one of the major PKI certification authorities approached a
large financial operation and convinced them that they needed to deploy
a PKI infrastructure enabling their customers to do online, internet
account transactions. This was a financial operation that had
significantly in excess of 10 million accounts.

the scenario went:

1) the financial institution would register a customer public key for
every account

2) the financial institution would transmit the resulting updated
account database to the certification authority

3) the certification authority would munge and re-arrange the bits in
each account record ... producing one digital certificate for each
account record.

4) after a couple hrs, the certification authority would return all the
recently produced digital certificates to the financial operation ...
which would then store them in the appropriate account record and
convey a copy to the appropriate customer

5) the customers would then generate digitally signed account
transactions, package the account transaction, the digital signature
and their copy of the digital certificate and transmit it over the
internet to the financial operation.

6) the financial operation would pull the account number from the
transaction, retrieve the corresponding account record, verify the
digital signature using the public key in the account data base ... and
NEVER have to make any reference to the digital certificate at all
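
step 6 can be sketched as follows -- the relying-party verifies
against the public key already on file in the account record and the
attached certificate is never read. textbook RSA with tiny, insecure
demo parameters (p=61, q=53) stands in for a real signature scheme;
the account numbers and field names are made up for illustration:

```python
# Toy sketch: relying party verifies a signed transaction using the
# public key on file -- the attached certificate is never touched.
# Textbook RSA with insecure demo parameters; illustration only.
import hashlib

N, E, D = 3233, 17, 2753          # demo modulus, public exp, private exp

def digest(msg: bytes) -> int:
    # hash the transaction and reduce into the (tiny) RSA modulus
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sign(msg: bytes, priv: int) -> int:
    return pow(digest(msg), priv, N)

def verify(msg: bytes, sig: int, pub: int) -> bool:
    return pow(sig, pub, N) == digest(msg)

# account database: the institution already holds each customer's key
accounts = {"12345": {"owner": "alice", "pubkey": E, "balance": 100}}

# customer packages transaction, signature, and (redundant) certificate
txn = b"acct=12345;pay=10;to=67890"
package = {"txn": txn, "sig": sign(txn, D), "cert": "<stale certificate>"}

# relying party: pull the account record and verify -- cert ignored
acct = accounts["12345"]
assert verify(package["txn"], package["sig"], acct["pubkey"])
print("signature verified from account record; certificate never read")
```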

the financial operation had spent nearly $50million on integrating a
PKI infrastructure when it dawned on somebody to do the complete
financials.

they had already agreed that the certification authority would get
$100/annum/account for the production of (redundant and superfluous)
digital certificates that need NEVER actually be used.

doing the complete financials resulted in somebody realizing that the
financial operation would be paying the certification authority
$100m/annum per million accounts (or $1b/annum per 10 million accounts)
for redundant and superfluous digital certificates that would NEVER
actually be used ... aka certificateless operation (other than the
internet payload for continuously transmitting the
digital certificates hither and yon)

the financial operation eventually canceled the project and took the
$50m hit.

this was actually a relying-party-only certificate scenario

where the operational account contains all the information about the
entity and the entity's public key (as well as a copy of the entity's
public key and a stale, static copy of a subset of the entity's
operational information in the form of a digital certificate).

this is offline, from the standpoint of the relying-party not needing
to contact the certification authority when processing a digitally
signed transaction ... in part, because the relying party actually has
all the real-time operational information as part of executing the
transaction (and NEVER actually needs to reference the redundant and
superfluous, stale, static digital certificate).

however, the certification authority was originally expecting to be
paid $100m/million-accounts (well in excess of a billion dollars) per
annum for the redundant and superfluous, stale, static (and need NEVER
be referenced) digital certificates.

now, a number of operations have used tamper-resistant hardware tokens
(USB dongles) as a repository for protecting the confidentiality of
private keys. This becomes truly a "something you have" operation ...
since the hardware token performs operations using the embedded private
key ... but the private key never exists outside of the confines of the
token.

human operators and other agents can still compromise any system usage
involving the private keys ... which is an overall system security
integrity issue. however, the private keys are never divulged ...
eliminating the system security confidentiality issue with regard to
the private keys ... and crooks can't obtain the private keys and setup
a counterfeit operation impersonating the original system (possibly
unknown to the original operation).

this is my frequent refrain that most operations treat public key
operation as a "something you have" authentication ... aka the party
has access and use of the corresponding private key. When a purely
software implementation is used ... there typically are attempts to
closely emulate real hardware token operation ... however software
emulation of hardware tokens has several more threats, exploits and
vulnerabilities compared to real hardware tokens.

one way of looking at this issue is where the security perimeter
lies. the security perimeter for a hardware token ... tends to be the
physical space of the token. the security perimeter for a software
emulated hardware token may be all the components of the computer where
that software is running.

for financial environments ... like PIN-debit ... there are external
tamper-resistant hardware boxes that do all PIN related processing for
the financial institution. PINs are entered in the clear at POS
terminals and ATMs ... but then immediately encrypted. From then on, they
never appear in the clear. the backend gets the transaction and sends
it to the box ... and gets back some answers ... but standard operation
never sees the PIN in the clear.
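
the flow can be sketched as follows -- the key names, the XOR-keystream
"encryption," and the HSM interface are all made up for illustration
(a real PIN-entry device and HSM use standardized PIN-block encryption,
not this):

```python
# Toy sketch of the PIN-debit flow: the PIN is encrypted at the
# terminal under a key shared only with the tamper-resistant box,
# the backend merely forwards the opaque blob, and only a yes/no
# verdict ever crosses the box boundary.  Not secure; illustration only.
import hashlib, os

def toy_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR against a SHA-256-derived keystream; same op decrypts
    stream = hashlib.sha256(key + nonce).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

class ToyHSM:
    """Holds the terminal key and reference PIN internally; only a
    boolean verdict leaves the box."""
    def __init__(self, key: bytes, reference_pin: bytes):
        self._key, self._pin = key, reference_pin
    def verify(self, nonce: bytes, blob: bytes) -> bool:
        return toy_encrypt(self._key, nonce, blob) == self._pin

TERMINAL_KEY = os.urandom(16)
hsm = ToyHSM(TERMINAL_KEY, b"1234")

# at the POS terminal: PIN encrypted immediately after entry
nonce = os.urandom(8)
blob = toy_encrypt(TERMINAL_KEY, nonce, b"1234")

# at the backend: sees only the opaque blob, then the HSM's answer
assert hsm.verify(nonce, blob)
print("PIN verified; backend never saw it in the clear")
```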

the (security) integrity of the backend systems might be compromised by
insiders ... but you don't find insiders harvesting PINs from the
backend systems (i.e. security confidentiality) and using them in
impersonation attacks with counterfeit transactions.

part of this is that security integrity compromises tend to be a lot
more difficult than security confidentiality compromises (copying the
data). security integrity compromises also tend to leave a lot more
traces back to the people responsible ... compared to security
confidentiality compromises.

one of the claims that we've frequently made with respect to aads chip
for public key operation, was being free from having to worry about the
ins & outs of PKIs and digital certificates .... we were able to
concentrate on the fundamental threats and vulnerabilities of the
actual operation of public key processes. For instance, a very
fundamental threat and vulnerability is the integrity and
confidentiality of the private key. If i was looking at wanting to pay
$100/annum on stuff associated with public key operation ... I might
consider it much better spent on hardware tokens than on digital
certificates.
in fact, the security proportional to risk scenarios are slightly
related here (i.e. grading the integrity levels of hardware tokens for
use with operations involving different levels of risk).

i.e. we've taken it as a given that the integrity of the
originating/remote environment is taken into account when evaluating a
transaction for authorization. this includes the risk level associated
with whether or not a real hardware token is being used and if a real
hardware token is being used ... the evaluated, integrity of that token
(which might change over time as technology changes). For a large
percentage of the business processes in the world, we assert that the
integrity level of the remote end is of more importance than a lot of
personal information about the remote entity (which the vast majority
of operations already have on file ... so it is of little interest to
duplicate such information in digital certificates).
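
the above can be sketched as an authorization check that weighs the
evaluated integrity of the originating environment against the value
at risk -- the token grades and the dollar limits here are made-up
illustrative numbers, not from any standard:

```python
# Sketch of "security proportional to risk": approve a transaction
# only if its value fits the risk level associated with how the
# originating private key is protected.  Grades/limits are invented.
TOKEN_LIMITS = {
    "software_key":   100,      # software-emulated token: low limit
    "basic_hw_token": 5_000,    # commodity hardware token
    "high_assurance": 250_000,  # independently evaluated hardware
}

def authorize(amount: float, token_grade: str) -> bool:
    # unknown or ungraded environments get a zero limit
    return amount <= TOKEN_LIMITS.get(token_grade, 0)

assert authorize(50, "software_key")
assert not authorize(50_000, "basic_hw_token")
```

the evaluated integrity of a given token model could be re-graded over
time as technology (and attacks) change, without touching the
transaction logic.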

so another simple test ....

i would assert that the integrity level of the originating environment
(software token or hardware token and the assurance level of such
token) is one of the primary pieces of information that would be of
interest to a relying-party ... right up there with what is the public
key. so a real business oriented digital certificate would not only
give the public key ... but also provide the integrity level of the
environment protecting the private key.

when i examined x.509 fields several years ago ... i couldn't find one
that provided the integrity level of the private key protection
although some have a simple flag that can be used to indicate whether it
is a software private key or a hardware token private key. how many
certification authorities have you heard of that have a process of
checking whether they are certifying a software private key or a
hardware token private key?

the literature has lots of stuff about the integrity level of
public/private keys based on the number of bits in the key ... but I
haven't seen anything on the integrity level of private key protection
... and/or writeups on applications that even make decisions based on
whether they find they are dealing with a software private key or a
hardware private key.

another indication: a couple years ago, i was giving a talk on the
importance of the private key protection integrity level ... and
somebody in the audience (from some gov. agency) said that if i would
provide the full definition, they would see that it was added to x.509v3.
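
a data-model sketch of what such a field might look like -- everything
here (the enum, the OID, the field name, the dict layout) is
hypothetical and invented for illustration; nothing like it is defined
in x.509v3:

```python
# Hypothetical "private key protection integrity level" field,
# modeled as an extension on a certificate represented as a plain
# dict.  The OID is from a made-up private arc; illustration only.
from enum import Enum

class KeyProtection(Enum):
    SOFTWARE_KEY = 1        # key stored in software / on disk
    HARDWARE_BASIC = 2      # commodity hardware token
    HARDWARE_EVALUATED = 3  # independently evaluated hardware

certificate = {
    "subject_public_key": "<public key bits>",
    "extensions": {
        "1.3.6.1.4.1.99999.1": {           # hypothetical OID
            "name": "privateKeyProtection",
            "value": KeyProtection.HARDWARE_BASIC.name,
        },
    },
}

# a relying party could read the protection level right next to the key
ext = certificate["extensions"]["1.3.6.1.4.1.99999.1"]
print("key protection level:", ext["value"])
```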