GB2424494A - Methods, devices and data structures for trusted data - Google Patents


Info

Publication number
GB2424494A
GB2424494A (application GB0505746A)
Authority
GB
United Kingdom
Prior art keywords
software
platform
trusted device
trusted
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0505746A
Other versions
GB0505746D0 (en)
Inventor
Graeme John Proudler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Priority to GB0505746A priority Critical patent/GB2424494A/en
Publication of GB0505746D0 publication Critical patent/GB0505746D0/en
Priority to CN200680009269.XA priority patent/CN101147154B/en
Priority to PCT/GB2006/050063 priority patent/WO2006100522A1/en
Priority to JP2008502491A priority patent/JP4732508B2/en
Priority to US11/908,920 priority patent/US8539587B2/en
Priority to CN 200910137034 priority patent/CN101551841B/en
Priority to EP09178175.7A priority patent/EP2194476B1/en
Priority to EP06710180A priority patent/EP1866825A1/en
Publication of GB2424494A publication Critical patent/GB2424494A/en
Priority to US13/779,400 priority patent/US9111119B2/en
Withdrawn legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Storage Device Security (AREA)

Abstract

A computer platform contains a trusted device protected from unauthorised modification. The trusted device is adapted to measure and vouch for software on the computer platform. The trusted device is also adapted to determine, when first software on the computer platform has been replaced by second software, whether the second software is functionally consistent with the first software and is as trustworthy as the first software. Associated methods of upgrading software and of monitoring platform states are described. A second embodiment relates to determining a software state used by a computing platform comprising: requesting an indication of the software state; and receiving one or more indications attested by a software provider and verified by the computing platform. Privacy may also be preserved in providing the indications of software type.

Description

METHODS, DEVICES AND DATA STRUCTURES FOR TRUSTED DATA
Field of Invention
The invention relates to data which is trusted, in the sense that at least one trusted entity is prepared to vouch for the data. It is particularly relevant to data comprising software (such as data structures or executable instructions) and in embodiments to the upgrading or replacement of software on a computing device.
Background of Invention
A significant consideration in interaction between computing entities is trust - whether a foreign computing entity will behave in a reliable and predictable manner, or will be (or already is) subject to subversion. Trusted systems which contain a component at least logically protected from subversion have been developed by the companies forming the Trusted Computing Group (TCG) - this body develops specifications in this area, such as are discussed in, for example, "Trusted Computing Platforms - TCPA Technology in Context", edited by Siani Pearson, 2003, Prentice Hall PTR. The implicitly trusted components of a trusted system enable measurements of a trusted system and are then able to provide these in the form of integrity metrics to appropriate entities wishing to interact with the trusted system.
The receiving entities are then able to determine from the consistency of the measured integrity metrics with known or expected values that the trusted system is operating as expected.
Integrity metrics will typically include measurements of the software used by the trusted system. These measurements may, typically in combination, be used to indicate states, or trusted states, of the trusted system. In Trusted Computing Group specifications, mechanisms are taught for "sealing" data to a particular platform state - this has the result of encrypting the sealed data into an inscrutable "opaque blob" containing a value derived at least in part from measurements of software on the platform. The measurements comprise digests of the software, because digest values will change on any modification to the software. This sealed data may only be recovered if the trusted component measures the current platform state and finds it to be represented by the same value as in the opaque blob.
It will be appreciated that any change in software will cause a number of problems, both with this specific process and more generally where measurement of software is taken as representative of the state of a computer system - however small the change to the software, effective forms of measurement (such as digests) will give different values. In the example of "sealing" above, this means that changes to software - which may be entirely desirable, for example to improve functionality or to remove bugs and weaknesses - have the disadvantage of preventing continued access to sealed data. This is only one exemplary problem, however - there is a general difficulty in having the same trust in new or replacement software as was had in original software, this general difficulty having attendant practical difficulties in maintaining functionality based on that trust.
It is desirable to find a way to update or change software that is trusted to provide continuity in trust as well as in software functionality.
Summary of Invention
In one aspect, the invention provides a computer platform containing a trusted device protected from unauthorised modification and adapted to measure and vouch for software on the computer platform, wherein the trusted device is adapted to determine, when first software on the computer platform has been replaced by second software, that the second software is functionally consistent with the first software and is as trustworthy as the first software.
Brief Description of the Drawings
Preferred embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, of which: Figure 1 is an illustration of an exemplary computer platform; Figure 2 indicates functional elements present on the motherboard of a trusted computer platform; Figure 3 indicates the functional elements of a trusted device of the trusted computer platform of Figure 2; Figure 4 illustrates the process of extending values into a platform configuration register of the trusted computer platform of Figure 2; Figure 5 illustrates a statement to vouch for new or replacement software in accordance with embodiments of the invention; and Figure 6 shows a linked list of statements of the type shown in Figure 5.
Detailed Description of Embodiments of the Invention
Before describing embodiments of the present invention, a trusted computing platform of a type generally suitable for carrying out embodiments of the present invention will be described with reference to Figures 1 to 4. This description of a trusted computing platform describes certain basic elements of its construction and operation. A "user", in this context, may be a remote user such as a remote computing entity. A trusted computing platform is further described in the applicant's International Patent Application No. PCT/CB00100528 entitled "Trusted Computing Platform" and filed on 15 February 2000, the contents of which are incorporated by reference herein. The skilled person will appreciate that the present invention does not rely for its operation on use of a trusted computing platform precisely as described below: embodiments of the present invention are described with respect to such a trusted computing platform, but the skilled person will appreciate that aspects of the present invention may be employed with different types of computer platform which need not employ all aspects of Trusted Computing Group trusted computing platform functionality.
A trusted computing platform of the kind described here is a computing platform into which is incorporated a trusted device whose function is to bind the identity of the platform to reliably measured data that provides one or more integrity metrics of the platform. The identity and the integrity metric are compared with expected values provided by a trusted party (TP) that is prepared to vouch for the trustworthiness of the platform. If there is a match, the implication is that at least part of the platform is operating correctly, depending on the scope of the integrity metric.
A user verifies the correct operation of the platform before exchanging other data with the platform. A user does this by requesting the trusted device to provide its identity and one or more integrity metrics. (Optionally the trusted device will refuse to provide evidence of identity if it itself was unable to verify correct operation of the platform.) The user receives the proof of identity and the identity metric or metrics, and compares them against values which it believes to be true. Those proper values are provided by the TP or another entity that is trusted by the user. If data reported by the trusted device is the same as that provided by the TP, the user trusts the platform.
This is because the user trusts the entity. The entity trusts the platform because it has previously validated the identity and determined the proper integrity metric of the platform.
Once a user has established trusted operation of the platform, he exchanges other data with the platform. For a local user, the exchange might be by interacting with some software application running on the platform. For a remote user, the exchange might involve a secure transaction. In either case, the data exchanged is 'signed' by the trusted device. The user can then have greater confidence that data is being exchanged with a platform whose behaviour can be trusted. Data exchanged may be information relating to some or all of the software running on the computer platform.
Existing Trusted Computing Group trusted computer platforms are adapted to provide digests of software on the platform - these can be compared with publicly available lists of known digests for known software. This does however provide an identification of specific software running on the trusted computing platform - this may be undesirable for the owner of the trusted computing platform on privacy grounds. As will be described below, aspects of the present invention may be used to improve this aspect of the privacy position of the trusted computing platform owner.
The trusted device uses cryptographic processes but does not necessarily provide an external interface to those cryptographic processes. The trusted device should be logically protected from other entities including other parts of the platform of which it is itself a part. Also, a most desirable implementation would be to make the trusted device tamperproof, to protect secrets by making them inaccessible to other platform functions and provide an environment that is substantially immune to unauthorised modification (i.e. both physically and logically protected). Since tamper-proofing is impossible, the best approximation is a trusted device that is tamper-resistant, or tamper-detecting. The trusted device, therefore, preferably consists of one physical component that is tamper-resistant. Techniques relevant to tamper-resistance are well known to those skilled in the art of security. These techniques include methods for resisting tampering (such as appropriate encapsulation of the trusted device), methods for detecting tampering (such as detection of out of specification voltages, X-rays, or loss of physical integrity in the trusted device casing), and methods for eliminating data when tampering is detected.
A trusted platform 10 is illustrated in the diagram in Figure 1. The computer platform is entirely conventional in appearance - it has associated the standard features of a keyboard 14, mouse 16 and visual display unit (VDU) 18, which provide the 'physical user interface' of the platform.
As illustrated in Figure 2, the motherboard 20 of the trusted computing platform 10 includes (among other standard components) a main processor 21, main memory 22, a trusted device 24, a data bus 26 and respective control lines 27 and address lines 28, BIOS memory 29 containing the BIOS program for the platform 10 and an Input/Output (IO) device 23, which controls interaction between the components of the motherboard and the keyboard 14, the mouse 16 and the VDU 18. The main memory 22 is typically random access memory (RAM). In operation, the platform 10 loads the operating system, for example Windows XP (TM), into RAM from hard disk (not shown). Additionally, in operation, the platform 10 loads the processes or applications that may be executed by the platform 10 into RAM from hard disk (not shown).
Typically, in a personal computer the BIOS program is located in a special reserved memory area, the upper 64K of the first megabyte of the system memory (addresses F000h to FFFFh), and the main processor is arranged to look at this memory location first, in accordance with an industry wide standard. A significant difference between the platform and a conventional platform is that, after reset, the main processor is initially controlled by the trusted device, which then hands control over to the platform-specific BIOS program, which in turn initialises all input/output devices as normal. After the BIOS program has executed, control is handed over as normal by the BIOS program to an operating system program, such as Windows XP (TM), which is typically loaded into main memory 22 from a hard disk drive (not shown).
The main processor is initially controlled by the trusted device because it is necessary to place trust in the first measurement to be carried out on the trusted computing platform. The measuring agent for this first measurement is termed the root of trust of measurement (RTM) and is typically trusted at least in part because its provenance is trusted. In one practically useful implementation the RTM is the platform while the main processor is under control of the trusted device. As is briefly described below, one role of the RTM is to measure other measuring agents before these measuring agents are used and their measurements relied upon. The RTM is the basis for a chain of trust. Note that the RTM and subsequent measurement agents do not need to verify subsequent measurement agents, merely to measure and record them before they execute. This is called an "authenticated boot process". Valid measurement agents may be recognised by comparing a digest of a measurement agent against a list of digests of valid measurement agents. Unlisted measurement agents will not be recognised, and measurements made by them and subsequent measurement agents are suspect.
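The measure-and-record pattern of the authenticated boot process can be sketched as follows. This is a minimal Python illustration only: the use of SHA-1, the whitelist contents and all names are assumptions for illustration, not details taken from the patent.

```python
import hashlib

# Hypothetical list of digests of valid measurement agents.
KNOWN_AGENT_DIGESTS = {
    "0beec7b5ea3f0fdbc95d0dd47f3c5bc275da8a33",  # SHA-1 of b"foo", as a stand-in
}

# Conventional log of measurements; for trust purposes the PCR values,
# not this log, are relied upon.
measurement_log = []

def measure_and_record(name: str, agent_code: bytes) -> str:
    """Measure an agent and record the measurement before it executes.

    Note that the agent is not verified here - the authenticated boot
    process only requires measuring and recording before execution.
    """
    digest = hashlib.sha1(agent_code).hexdigest()
    measurement_log.append((name, digest))
    return digest

def is_recognised(digest: str) -> bool:
    """A valid measurement agent is recognised by comparing its digest
    against the list of digests of valid measurement agents."""
    return digest in KNOWN_AGENT_DIGESTS
```

An agent whose digest is not in the list is simply unrecognised; its measurements, and those of any subsequent agents, are treated as suspect.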
The trusted device 24 comprises a number of blocks, as illustrated in Figure 3. After system reset, the trusted device 24 performs an authenticated boot process to ensure that the operating state of the platform 10 is recorded in a secure manner. During the authenticated boot process, the trusted device 24 acquires an integrity metric of the computing platform 10. The trusted device 24 can also perform secure data transfer and, for example, authentication between it and a smart card via encryption/decryption and signature/verification. The trusted device 24 can also securely enforce various security control policies, such as locking of the user interface. In a particularly preferred arrangement, the display driver for the computing platform is located within the trusted device 24 with the result that a local user can trust the display of data provided by the trusted device 24 to the display - this is further described in the applicant's International Patent Application No. PCT/CBOO/02005, entitled "System for Providing a Trustworthy User Interface" and filed on 25 May 2000, the contents of which are incorporated by reference herein.
Specifically, the trusted device in this embodiment comprises: a controller 30 programmed to control the overall operation of the trusted device 24, and interact with the other functions on the trusted device 24 and with the other devices on the motherboard 20; a measurement function 31 for acquiring a first integrity metric from the platform 10 either via direct measurement or alternatively indirectly via executable instructions to be executed on the platform's main processor; a cryptographic function 32 for signing, encrypting or decrypting specified data; an authentication function 33 for authenticating a smart card; and interface circuitry 34 having appropriate ports (36, 37 & 38) for connecting the trusted device 24 respectively to the data bus 26, control lines 27 and address lines 28 of the motherboard 20. Each of the blocks in the trusted device 24 has access (typically via the controller 30) to appropriate volatile memory areas 4 and/or non-volatile memory areas 3 of the trusted device 24. Additionally, the trusted device 24 is designed, in a known manner, to be tamper resistant.
For reasons of performance, the trusted device 24 may be implemented as an application specific integrated circuit (ASIC). However, for flexibility, the trusted device 24 is preferably an appropriately programmed micro-controller. Both ASICs and micro-controllers are well known in the art of microelectronics and will not be considered herein in any further detail.
One item of data stored in the non-volatile memory 3 of the trusted device 24 is a certificate 350. The certificate 350 contains at least a public key 351 of the trusted device 24 and an authenticated value 352 of the platform integrity metric measured by a trusted party (TP). The certificate 350 is signed by the TP using the TP's private key prior to it being stored in the trusted device 24. In later communications sessions, a user of the platform 10 can deduce that the public key belongs to a trusted device by verifying the TP's signature on the certificate. Also, a user of the platform 10 can verify the integrity of the platform 10 by comparing the acquired integrity metric with the authentic integrity metric 352. If there is a match, the user can be confident that the platform 10 has not been subverted. Knowledge of the TP's generally-available public key enables simple verification of the certificate 350. The non-volatile memory 3 also contains an identity (ID) label 353. The ID label 353 is a conventional ID label, for example a serial number, that is unique within some context. The ID label 353 is generally used for indexing and labelling of data relevant to the trusted device 24, but is insufficient in itself to prove the identity of the platform 10 under trusted conditions.
The trusted device 24 is equipped with at least one method of reliably measuring or acquiring the integrity metric of the computing platform 10 with which it is associated. In the present embodiment, a first integrity metric is acquired by the measurement function 31 in a process involving the generation of a digest of the BIOS instructions in the BIOS memory. Such an acquired integrity metric, if verified as described above, gives a potential user of the platform 10 a high level of confidence that the platform 10 has not been subverted at a hardware, or BIOS program, level. Other known processes, for example virus checkers, will typically be in place to check that the operating system and application program code has not been subverted.
The measurement function 31 has access to: non-volatile memory 3 for storing a hash program 354 and a private key 355 of the trusted device 24, and volatile memory 4 for storing acquired integrity metrics. A trusted device has limited memory, yet it may be desirable to store information relating to a large number of integrity metric measurements. This is done in trusted computing platforms as described by the Trusted Computing Group by the use of Platform Configuration Registers (PCRs) 8a-8n. The trusted device has a number of PCRs of fixed size (the same size as a digest) - on initialisation of the platform, these are set to a fixed initial value. Integrity metrics are then "extended" into PCRs by a process shown in Figure 4. The PCR 8i value is concatenated 403 with the input 401 which is the value of the integrity metric to be extended into the PCR. The concatenation is then hashed 402 to form a new 160-bit value. This hash is fed back into the PCR to form its new value. In addition to the extension of the integrity metric into the PCR, to provide a clear history of measurements carried out the measurement process may also be recorded in a conventional log file (which may be simply in main memory of the computer platform). For trust purposes, however, it is the PCR value that will be relied on and not the software log.
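The extension process of Figure 4 can be sketched in a few lines of Python. SHA-1 is used here because the text describes a 160-bit value; the function and variable names are illustrative, not taken from the patent.

```python
import hashlib

def extend(pcr_value: bytes, integrity_metric: bytes) -> bytes:
    """New PCR value = hash(old PCR value || integrity metric), per Figure 4."""
    return hashlib.sha1(pcr_value + integrity_metric).digest()

# On platform initialisation the PCR is set to a fixed initial value.
pcr = b"\x00" * 20

# Each integrity metric is itself a digest of the measured software.
metric_a = hashlib.sha1(b"first measured component").digest()
metric_b = hashlib.sha1(b"second measured component").digest()

# Extending two metrics in turn: the PCR remains a single 160-bit value
# however many metrics are extended in, and its final value depends on
# both the metrics and the order in which they were extended.
pcr = extend(extend(pcr, metric_a), metric_b)
```

Because each new value folds in the old one, a PCR acts as a compact, order-sensitive summary of an arbitrarily long sequence of measurements.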
Clearly, there are a number of different ways in which an initial integrity metric may be calculated, depending upon the scope of the trust required. The measurement of the BIOS program's integrity provides a fundamental check on the integrity of a platform's underlying processing environment. The integrity metric should be of such a form that it will enable reasoning about the validity of the boot process - the value of the integrity metric can be used to verify whether the platform booted using the correct BIOS. Optionally, individual functional blocks within the BIOS could have their own digest values, with an ensemble BIOS digest being a digest of these individual digests. This enables a policy to state which parts of BIOS operation are critical for an intended purpose, and which are irrelevant (in which case the individual digests must be stored in such a manner that validity of operation under the policy can be established).
Other integrity checks could involve establishing that various other devices, components or apparatus attached to the platform are present and in correct working order. In one example, the BIOS programs associated with a SCSI controller could be verified to ensure communications with peripheral equipment could be trusted. In another example, the integrity of other devices, for example memory devices or co-processors, on the platform could be verified by enacting fixed challenge/response interactions to ensure consistent results. As indicated above, a large number of integrity metrics may be collected by measuring agents directly or indirectly measured by the RTM, and these integrity metrics extended into the PCRs of the trusted device 24. Some - many - of these integrity metrics will relate to the software state of the trusted platform.
Preferably, the BIOS boot process includes mechanisms to verify the integrity of the boot process itself. Such mechanisms are already known from, for example, Intel's draft "Wired for Management baseline specification v 2.0 - BOOT Integrity Service", and involve calculating digests of software or firmware before loading that software or firmware. Such a computed digest is compared with a value stored in a certificate provided by a trusted entity, whose public key is known to the BIOS. The software/firmware is then loaded only if the computed value matches the expected value from the certificate, and the certificate has been proven valid by use of the trusted entity's public key. Otherwise, an appropriate exception handling routine is invoked. Optionally, after receiving the computed BIOS digest, the trusted device 24 may inspect the proper value of the BIOS digest in the certificate and not pass control to the BIOS if the computed digest does not match the proper value - an appropriate exception handling routine may be invoked.
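The load-only-if-verified check described above can be sketched as follows. The certificate field name and the `certificate_is_valid` callback are placeholders for whatever certificate format and signature verification the trusted entity actually uses; this is an illustrative sketch, not the BOOT Integrity Service interface.

```python
import hashlib

def verified_load(firmware: bytes, certificate: dict, certificate_is_valid) -> bool:
    """Return True only if the firmware may be loaded.

    The certificate must first be proven valid (in practice, by checking the
    trusted entity's signature with its public key, which is known to the
    BIOS), and the computed digest of the firmware must then match the
    expected value carried in the certificate.
    """
    if not certificate_is_valid(certificate):
        return False  # in practice an exception handling routine is invoked
    computed = hashlib.sha1(firmware).hexdigest()
    return computed == certificate["expected_digest"]
```

A modified or substituted firmware image yields a different digest, so the comparison fails and the image is never executed.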
Processes of trusted computing platform manufacture and verification by a third party are briefly described, but are not of fundamental significance to the present invention and are discussed in more detail in "Trusted Computing Platforms - TCPA Technology in Context" identified above.
At the first instance (which may be on manufacture), a TP which vouches for trusted platforms, will inspect the type of the platform to decide whether to vouch for it or not. The TP will sign a certificate related to the trusted device identity and to the results of inspection - this is then written to the trusted device.
At some later point during operation of the platform, for example when it is switched on or reset, the trusted device 24 acquires and stores the integrity metrics of the platform. When a user wishes to communicate with the platform, he uses a challenge/response routine to challenge the trusted device 24 (the operating system of the platform, or an appropriate software application, is arranged to recognise the challenge and pass it to the trusted device 24, typically via a BIOS-type call, in an appropriate fashion). The trusted device 24 receives the challenge and creates an appropriate response based on the measured integrity metric or metrics - this may be provided with the certificate and signed. This provides sufficient information to allow verification by the user.
Values held by the PCRs may be used as an indication of trusted platform state.
Different PCRs may be assigned specific purposes (this is done, for example, in Trusted Computing Group specifications). A trusted device may be requested to provide values for some or all of its PCRs (in practice a digest of these values - by a TPM_Quote command) and sign these values. As indicated above, data (typically keys or passwords) may be sealed (by a TPM_Seal command) against a digest of the values of some or all the PCRs into an opaque blob. This is to ensure that the sealed data can only be used if the platform is in the (trusted) state represented by the PCRs.
The corresponding TPM_Unseal command performs the same digest on the current values of the PCRs. If the new digest is not the same as the digest in the opaque blob, then the user cannot recover the data by the TPM_Unseal command. If any of the measurements from which the PCR values are derived relate to software on the platform which has changed, then the corresponding PCR will have a different value - a conventional trusted platform will therefore not be able to recover the sealed data.
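The seal/unseal binding to platform state can be modelled as below. This sketch deliberately omits the encryption step - a real TPM encrypts the opaque blob under a key protected by the trusted device - and models only the comparison of the state digest at sealing time with the state digest at unsealing time. The composite digest here is a simplification of the TPM's PCR composite structure.

```python
import hashlib

def composite_digest(pcr_values) -> bytes:
    """Digest over the selected PCR values (simplified composite)."""
    return hashlib.sha1(b"".join(pcr_values)).digest()

def seal(data: bytes, pcr_values) -> dict:
    """Bind data to the platform state represented by the given PCR values."""
    return {"state_digest": composite_digest(pcr_values), "data": data}

def unseal(blob: dict, current_pcr_values) -> bytes:
    """Recover the data only if the current state matches the sealed state."""
    if composite_digest(current_pcr_values) != blob["state_digest"]:
        raise PermissionError("current platform state differs from sealed state")
    return blob["data"]
```

Any change to the measured software changes a PCR value, hence the composite digest, hence the unseal fails - which is exactly the continuity problem the invention addresses.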
Aspects of the present invention will now be described with reference to embodiments employing - in some cases modifying - the trusted computing platform structure indicated above. An approach to providing new or updated software of equivalent function and trust properties will first be described, together with a mechanism for allowing a trusted computing platform to indicate its software functionality - and verify its trusted status - without revealing the specific software that it uses. An exemplary approach is described to demonstrate how PCR values before and after replacement of software with functionally and trust equivalent software can be shown to be equivalent, and the use of this approach to solve problems such as that indicated above in sealing data against a software state.
It is noted that in existing trusted computing platform arrangements, entities in fact base their confidence in a trusted computing platform on signed statements about the software that is installed in a platform. The inventor has appreciated that trusted platforms may provide evidence of verification of statements that the software is to be trusted, rather than providing the actual software measurements. This has several advantages. If the trusted device no longer contains values of software measurements, it is physically impossible for the trusted device to report the values of software measurements. If the verification process can include evidence of the trust equivalence of two values of software measurements (and the statement was made by a trusted measurement entity), the trusted device will contain information that can be used (as is described below, in an exemplary arrangement) to re-enable access to sealed plain text data after software is changed in a prescribed manner.
Positive consequences follow from working from statements that vouch for the software in a platform, instead of with the actual software in a platform. If a party that vouched for existing software is prepared to vouch that replacement software is just as acceptable as existing software, use of appropriate statements for this purpose can be used such that the platform can re-enable access to sealed plain text data after such replacement software is installed. In practice the owner of a trusted platform must choose the parties that he wishes to vouch for his platform. The owner could choose any party or set of parties, provided that that party or parties has credibility with those who will interact with the platform. The owner can change parties or sets of parties, provided that those parties are willing to acknowledge each other as trusted peers.
This enables both commercial companies and not-for-profit organisations to vouch for the same trusted platforms.
A party produces a signed statement if it wishes to vouch for a particular program.
The party creates a new statement if the program is not an upgrade or replacement, or creates the next entry in a list of statements if the program is an upgrade or replacement. A statement can describe one or more programs. If a statement describes more than one program, the implication is that all the programs are considered by the signing party to be equally functional and trustworthy for the intended task.
An exemplary form of statement is shown in Figure 5. A statement 501 has the structure [programDigestsN, statementID_N, prevStatementDigestN, nextPubKeyN] and has ancillary structures 532 [pubKeyN] (534) and [signatureValueN] (536). The elements of statement 501 will now be described.
o programDigests 510 is the digest of the program that is vouched for by the statement. This need not be a digest of a single program - it may consist of a structure containing digests of more than one program that is vouched for by the statement. It may even be a digest of a structure containing digests of more than one program that is vouched for by the statement. Clearly, in such an implementation, the actual digests must also be available to the platform.
As is discussed below, there may be privacy advantages to the user of multiple programs being referred to in programDigests.
o statementID 520 is a tag, enabling identification of a description of the statement's purpose. That description could include a description of the program(s), the intended use of the program(s), the effect of the program(s), and other information about the program(s). statementID serves to identify the reason for the existence of this particular piece of data (as opposed to any other piece of signed data).
o If a program is not an upgrade to or replacement of another program, prevStatementDigest 530 is NULL and pubKey is the key that should be used to verify signatureValue. However, if a program is an upgrade to or replacement of an existing program, prevStatementDigest is the digest of that previous statement, and nextPubKey 540 from that previous statement is the key that should be used to verify signatureValue. In other words, nextPubKey in one statement is the pubKey that must be used in the next statement.
It can be seen that nextPubKey and prevStatementDigest between them allow related statements to form a list linked both backwards and forwards; such linkage is illustrated in Figure 6. A list of such statements is linked forwards by means of signature values using the private keys corresponding to pubKey0, nextPubKey0, nextPubKey1 ... nextPubKeyN. The list is linked backwards by means of prevStatementDigest1, prevStatementDigest2 ... prevStatementDigestN. Each member of a list is linked to a program or programs by means of a signature over programDigestsN.
In one approach, a list of statements starts with pubKey0, followed by [programDigests0, statementID_0, NULL, nextPubKey0] and [signatureValue0], which is the result of signing [programDigests0, statementID_0, NULL, nextPubKey0] with the private key corresponding to pubKey0. The list continues with [programDigests1, statementID_1, prevStatementDigest1, nextPubKey1] and [signatureValue1], which is the result of signing [programDigests1, statementID_1, prevStatementDigest1, nextPubKey1] with the private key corresponding to nextPubKey0. The list continues in the same fashion.
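The list construction above can be sketched in Python. This is an illustrative model only, not the patented implementation: the serialisation format, the use of SHA-1 (as in TPM 1.2), and the omission of real signatures are all simplifying assumptions. In a real list each statement would additionally be signed with the private key corresponding to the previous statement's nextPubKey.

```python
import hashlib

def digest(data: bytes) -> bytes:
    # SHA-1 is assumed here, consistent with TPM 1.2-era digests.
    return hashlib.sha1(data).digest()

def serialise(stmt) -> bytes:
    # Illustrative flat serialisation of a statement's fields.
    return (stmt["programDigests"] + stmt["statementID"] +
            stmt["prevStatementDigest"] + stmt["nextPubKey"])

def make_statement(program_digests, statement_id, prev_statement, next_pub_key):
    # prevStatementDigest is NULL (empty) for the first statement in a
    # list, otherwise the digest of the serialised previous statement.
    prev_digest = digest(serialise(prev_statement)) if prev_statement else b""
    return {
        "programDigests": program_digests,
        "statementID": statement_id,
        "prevStatementDigest": prev_digest,
        "nextPubKey": next_pub_key,
    }

def verify_backward_link(stmt, prev_stmt) -> bool:
    # The backward link holds if prevStatementDigest matches the digest
    # of the previous statement.
    return stmt["prevStatementDigest"] == digest(serialise(prev_stmt))

# Build a two-entry list: an original program and its upgrade.
s0 = make_statement(digest(b"program-v1"), b"ID_0", None, b"pubkey-1")
s1 = make_statement(digest(b"program-v2"), b"ID_1", s0, b"pubkey-2")
assert verify_backward_link(s1, s0)
```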
It should be appreciated that the statements in the list have in common equivalence of function and common evidence of trust by the party issuing the statement, but that in other respects, statements can differ. For example, a program associated with statementN is not necessarily the program associated with statementM; thus programDigestsN is not necessarily the same as programDigestsM. This means that the program(s) associated with a statement at the start of a list may be different from (or the same as) the program(s) associated with the statement at any intermediate point in the list, or at the end of the list. Similarly, pubKeyN in a list may or may not be the same as nextPubKeyN. Thus the key used to verify signatureValue0 may or may not be the same key used to verify signatureValueN, whether N is an intermediate statement in a list or is the last statement in a list. Thus a party may change its signing key at intervals (in accordance with recommended security practice) or may hand over trust to another party which has a different signing key.
In a modification to this approach, pubKey and nextPubKey could be digests of keys, or digests of a structure containing one or more keys. Clearly, in such an implementation, the actual public keys must also be available to the platform. In such a case, any private key corresponding to any public key digest in that structure can be used to sign statements, and multiple parties can concurrently vouch for the trustworthiness of a platform.
It should be noted that for a given software type, it is necessary either to use statements of this type consistently or not to use them at all (and instead to use, for example, conventional TCG approaches). If statements are to be used for a software type, the first instance of the software type to be vouched for by the platform needs to have a statement. Switching from a conventional TCG approach to an upgrade with a statement, as described, is not an available approach.
A mechanism which allows a trusted platform to achieve a measure of privacy when asked to identify its software will now be described. This requires the party issuing a statement to actually issue two statements, the one described above plus a similar auxiliary statement that omits the programDigests field. These auxiliary statements might be returned to a challenger who receives signed integrity metrics from a trusted device instead of the main statements described previously. These auxiliary statements can prevent identification of the actual programs installed in the platform.
If the programDigests field in the main statement describes just one program, it certainly identifies the program being used by the platform; there is thus a clear privacy advantage if the auxiliary statement is used in a challenge response.
Even if the programDigests field describes a few programs, it may be considered to reveal too much information about the platform, and the auxiliary statement should be used in a challenge response if privacy is required. Only when the programDigests field describes many programs does use of a main statement in a challenge response seem not to be a compromise to privacy. The public key used to verify the main statement must also be that used to verify the auxiliary statement, and the same statementlD should appear in both statements. These constraints are necessary to provide a verifiable connection between a main statement and an auxiliary statement.
Naturally, the signature value for a main statement will differ from that of an auxiliary
statement.
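The constraints linking a main statement to its auxiliary statement can be expressed as a simple check. The dictionary representation and field names below are illustrative assumptions, and the shared verification key is modelled as a plain field for brevity; actual signature verification is abstracted away.

```python
def auxiliary_matches(main, aux) -> bool:
    # An auxiliary statement must omit programDigests, carry the same
    # statementID, and be verifiable with the same public key as the
    # main statement (modelled here as a "pubKey" field).
    return ("programDigests" not in aux
            and aux["statementID"] == main["statementID"]
            and aux["pubKey"] == main["pubKey"])

main = {"programDigests": b"\x01\x02", "statementID": b"ID_7", "pubKey": b"pk"}
aux = {"statementID": b"ID_7", "pubKey": b"pk"}
assert auxiliary_matches(main, aux)
```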
The verification of new or replacement software associated with statements will now be described, as will the recording of the verification process. The essence of this process is to replace a single extend operation (for an element of software) with one or more extend operations, each of which describes a statement.
For verification to be carried out and recorded, trusted measurement agents must carry out the statement verification processes: a trusted measurement entity must verify programs, verify statements, and verify that lists of statements are fully linked. A measurement entity is trusted either because of attestation about the entity or because of measurements of the entity by a trusted measurement agent.
In order to verify a program, the measurement entity creates a digest of the program and compares that digest with information (from the field programDigests) in a statement. The measurement entity must record an indication of whether this process succeeded. One implementation is to record in the trusted device a verifiedProgram flag that is either TRUE or FALSE. If the program is associated with a linked list, this comparison should be done only using the last statement in the list. (Previous statements in the list merely provide a history of the evolution of the program and attestation for the program).
In order to create a verifiable record of a statement, the measurement entity must make a record in the trusted device of at least whether the signature of the statement was successfully verified. One implementation is to record in a trusted device a verifiedStatement flag that is set to either TRUE or FALSE.
In order to create an auditable record of the verification of a statement, the measurement agent must make a record of the technique used to perform the verification. One implementation is to record in the trusted device the public key (pubKey or nextPubKey) used to verify the signature over a statement. If practicable, the measurement agent also verifies that the public key used to verify the signature over a statement is extant (has not been revoked), but this is probably beyond the capabilities of most measurement agents. Should it be possible to determine this, the measurement entity always sets the verifiedStatement flag to FALSE if the public key is not extant.
If the private key corresponding to a public key is only used to sign a single type of statement, no statement about the intent of the signature is required. Otherwise, information that indicates that the signature belongs to a particular statement must be recorded with the public key. One implementation is to record in the trusted device the statementID.
In order to distinguish a single statement or the start of a list, one implementation is to tag the verification of a statement with the flag startStatement==TRUE if the statement was not verified using nextPubKey of another statement or if the statement's prevStatementDigest is not the digest value of the previous statement, and otherwise to tag the verification of a statement with the flag startStatement==FALSE.
Any member of a linked list must be verified both forwards and backwards. If a linking test passes, the statement is tagged with the flag verifiedList==TRUE. Otherwise the statement is tagged with the flag verifiedList==FALSE.
In order to create a verifiable record of a list of verified statements, the measurement entity must record in the trusted device at least the essential characteristics of the first statement in the list, the essential characteristics of the last statement in the list, and whether all the statements in the list passed their verification tests. The preferred implementation is to make a record in the trusted device of the first statement in the list and the last statement in the list, omitting any record of intermediate statements in the list, while recording separately in the trusted device the results of verification tests
on every statement in the list.
Following this approach, after assessing a statement, the measurement entity may record in the trusted device at least the data structure STATEMENT_VERIFICATION containing at least (1) the public key used to verify the statement and (2) the statementID. For each PCR, the trusted device maintains an upgradesPermitted flag that is set TRUE on PCR initialisation but reset FALSE whenever the platform encounters a statement associated with that PCR with verifiedProgram==FALSE, or verifiedStatement==FALSE, or verifiedList==FALSE.
If upgradesPermitted is FALSE, the information content of the associated PCR is unreliable. If upgradesPermitted is FALSE, the trusted device (TPM) must refuse to perform security operations predicated upon the correctness of that PCR value, such as sealing data to that PCR (e.g. creating TCG's "digestAtCreation" parameter), unsealing data (e.g. checking TCG's "digestAtRelease" parameter), and new TPM commands described below, for example. If upgradesPermitted is FALSE, the TPM may refuse to report (using TPM_quote, for example) that PCR value, or alternatively may report the value of upgradesPermitted along with the PCR value.
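The gating behaviour of upgradesPermitted can be sketched as follows. The device state and the seal operation are hypothetical stand-ins for the TPM commands mentioned above, not part of any specification.

```python
def seal(device, pcr_index, data):
    # A trusted device must refuse security operations predicated on
    # the correctness of a PCR whose upgradesPermitted flag is FALSE.
    if not device["upgradesPermitted"][pcr_index]:
        raise PermissionError("PCR %d content is unreliable" % pcr_index)
    return ("sealed", pcr_index, data)

# PCR 0 is trustworthy; PCR 1 saw a failed statement verification.
device = {"upgradesPermitted": {0: True, 1: False}}
assert seal(device, 0, b"secret") == ("sealed", 0, b"secret")
```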
An algorithm to substitute a conventional TCG integrity metric with a record of a statement or a record of a list of statements is described: o A measurement entity (prior to loading a program) initially follows normal TCG procedure by creating a digest of the program. The measurement entity then determines whether one or more statements are associated with the program.
o If no statements are associated with the program, the measurement entity follows existing TCG procedure by extending the program's digest into the trusted device using TPM_Extend.
o If statements are associated with the program, the measurement entity must parse the statements into single statements and chains of statements. When a single statement is identified, the appropriate flags and STATEMENT_VERIFICATION must be recorded in a trusted device (but note that the appropriate verifiedList is not recorded and need not even be computed). When the start of a chain is identified, the appropriate STATEMENT_VERIFICATION and flags are recorded in the trusted device (but note that the appropriate verifiedProgram flag is not recorded, and need not even be computed). When the end of a chain is identified, the appropriate STATEMENT_VERIFICATION structure and flags are recorded in the trusted device. Some algorithms for use while parsing statements are: (a) always compute verifiedStatement; (b) compute verifiedList if there is a previous and/or following statement; (c) compute verifiedProgram if (there is no following statement) or if (the following statement has startStatement==TRUE); (d) record STATEMENT_VERIFICATION if (there is no previous statement) or if (verifiedProgram is to be recorded); (e) whenever verifiedProgram==FALSE, or verifiedStatement==FALSE, or verifiedList==FALSE, reset the appropriate upgradesPermitted flag.
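A minimal model of this bookkeeping is sketched below. It assumes each statement's individual checks (signature, list linkage, program digest) have already been evaluated and are carried as booleans, and it simplifies rule (c) to fire only at the end of the sequence; the function and field names are illustrative.

```python
def process_statements(statements, trusted_device):
    # Walk a parsed chain of statements, applying rules (a)-(e).
    n = len(statements)
    for i, stmt in enumerate(statements):
        record = {"verifiedStatement": stmt["signature_ok"]}      # (a)
        if n > 1:                                                 # (b)
            record["verifiedList"] = stmt["links_ok"]
        is_last = (i == n - 1)
        if is_last:                                               # (c), simplified
            record["verifiedProgram"] = stmt["program_ok"]
        if i == 0 or is_last:                                     # (d)
            trusted_device["records"].append(record)
        if not all(record.values()):                              # (e)
            trusted_device["upgradesPermitted"] = False

device = {"records": [], "upgradesPermitted": True}
chain = [
    {"signature_ok": True, "links_ok": True, "program_ok": True},
    {"signature_ok": True, "links_ok": True, "program_ok": True},
]
process_statements(chain, device)
assert device["upgradesPermitted"] and len(device["records"]) == 2
```

Note that, as in the text, only the first and last records of a chain are kept, while a verification failure anywhere in the chain resets upgradesPermitted.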
The verification process described above captures the information needed to establish that upgraded or replacement software is trusted. The privacy concerns associated with reporting software state on challenge can thus be met by the use of statements describing many programs or by the use of auxiliary statements. Further steps are required to solve the problem of accessing data in opaque blobs sealed to an earlier platform state with earlier software. As will be indicated below, it is possible to reseal these opaque blobs to the PCR values associated with the new software state.
This requires a number of steps, effectively amounting to proof that the later PCR values are derived legitimately from the earlier PCR values against which the opaque blob is sealed. Three steps of proving equivalence are required: proving the equivalence of measured digests; proving the equivalence of PCR values composed of measured digests; and proving the equivalence of composite PCR values composed of multiple PCR values. Each step builds on the preceding step, and each step will be described below in turn in an exemplary embodiment, proposing new functions to complement the existing Trusted Computing Group system.
One approach to proving the equivalence of measured digests is to guide the trusted device to create signed pairs of data, one element of which is potentially an initial value in a PCR and the other element of which is potentially the value after further extensions into that PCR. The trusted device can recognise such structures as proof that the later value can be derived from the earlier value by extending the earlier value. We can use such pairs of data to create tuples of data with the same properties.
There are many ways to achieve the same objective involving checking by the trusted device of signed statements, but the approach chosen is selected for consistency with Trusted Computing Group methods.
In one implementation, a trusted device requires new capabilities to create a data structure that provides proof to a trusted device that a start digest value can be transformed into an end digest value via an intermediate value by the process of extending two or more values into the start digest value. There are many ways of doing this. The following set of prototype trusted device commands illustrates the concept: 1. A new trusted device capability, TPM_digestPair(action, index, value), creates digestPair data containing (startValue, endValue) that provides proof to a trusted device that a start digest value "startValue" can be transformed into an end digest value "endValue" by the process of extending one or more values into "startValue".
o If "action" == START, the trusted device creates a temporary digest engine, loads (not extends) "value == startValue" into that engine, and returns the "index" of that temporary engine; o if "action" == CONTINUE, the trusted device extends the engine referenced by "index" with "value" and returns the "index" of that engine; o if "action" == END, the trusted device extends the engine referenced by "index" with "value" and returns an opaque blob containing the digestPair structure (startValue, endValue), where endValue is the final state of the engine. This blob can be recognised by the trusted device that created it as an unaltered blob that it created.
Such methods are well known to those skilled in the art, and are used in existing Trusted Computing Group technology.
2. For convenience, it may also be desirable to have an additional new trusted device capability, TPM_concatDigestPair(digestPair1, digestPair2), which creates digestPair data that provides proof to a trusted device that a start digest value can be transformed into an end digest value by the process of extending one or more values into the start value. The trusted device verifies that it used TPM_digestPair to create digestPair1 having the data format (start1, end1) and that it used TPM_digestPair to create digestPair2 having the data format (start2, end2). If end1 == start2, the trusted device returns a blob containing the digestPair structure (start1, end2) which can be recognised by the trusted device that created it as an unaltered blob that it created.
3. A new trusted device capability, TPM_DigestList(digestPair1, digestPair2), creates data (start1, end1, end2) that proves to a trusted device that a start digest value "start1" can be transformed into an end digest value "end2" via an intermediate value "end1". digestPair1 is presumed to have been created by the same trusted device via the command TPM_digestPair(action, index, value) and having the format (start1, end1). digestPair2 is presumed to have been created by the same trusted device via the command TPM_digestPair(action, index, value) and having the format (start2, end2). The trusted device verifies that it created digestPair1 and digestPair2. If end1 == start2, the trusted device returns an opaque blob containing the digestList structure (start1, end1, end2) which can be recognised by the trusted device that created it as an unaltered blob that it created.
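The digest-pair mechanism can be sketched with SHA-1 extend operations (SHA-1 is an assumption consistent with TPM 1.2). Opaque-blob protection is modelled simply by having the device remember which structures it created; the class and method names are illustrative and not part of the proposed command set.

```python
import hashlib

def extend(current: bytes, value: bytes) -> bytes:
    # TCG-style extend: new state = H(current || value)
    return hashlib.sha1(current + value).digest()

class Device:
    """Toy trusted device that remembers the blobs it created."""
    def __init__(self):
        self.created = set()

    def digest_pair(self, start_value, values):
        # Models TPM_digestPair START / CONTINUE / END in one call.
        state = start_value
        for v in values:
            state = extend(state, v)
        pair = (start_value, state)
        self.created.add(pair)        # so the device can recognise it later
        return pair

    def digest_list(self, pair1, pair2):
        # Models TPM_DigestList: both pairs must be this device's own
        # blobs and must chain, i.e. end1 == start2.
        assert pair1 in self.created and pair2 in self.created
        start1, end1 = pair1
        start2, end2 = pair2
        assert end1 == start2
        triple = (start1, end1, end2)
        self.created.add(triple)
        return triple

dev = Device()
reset = b"\x00" * 20
pair_a = dev.digest_pair(reset, [b"metric-1"])       # (reset, old)
pair_b = dev.digest_pair(pair_a[1], [b"metric-2"])   # (old, now)
start, mid, end = dev.digest_list(pair_a, pair_b)
assert start == reset and mid == pair_a[1] and end == pair_b[1]
```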
The next step is to prove the equivalence of PCR values composed of measured digests. The value in a PCR is conventionally derived from individual integrity metrics, which are extended one-by-one into the PCR. As has been indicated above, in embodiments of the invention, the single extend operation of each individual integrity metric is replaced with (potentially) many extend operations. We can use the new trusted device capabilities previously described to prove to a trusted device that a value in a PCR could have been derived from a previous value in that PCR, but it remains to prove to the trusted device that a derived value is actually a legitimate PCR value.
An approach described here is to prove that a value is a legitimate PCR value, and then prove that it can be derived from a PCR reset value via an intermediate value. It can be assumed from the properties of digest algorithms that this provides sufficient confidence that the intermediate value was also a legitimate PCR value. The trusted device is guided to create signed data structures that can be recognised by the trusted device as legitimate data. As before, there are many ways to achieve the same result if the trusted device checks signed statements created outside the trusted device.
In one implementation, a trusted device requires new capabilities (beyond those currently described by the Trusted Computing Group). The basic concept is to prove that a value is a current PCR value, and hence must have been created by legitimate means by trusted measurement entities, and then to prove that the value in the PCR at reset can be transformed by a series of legitimate "measure and extend" operations into that current PCR value via an intermediate value. The following prototype trusted device commands illustrate the concept: o A new trusted device capability, TPM_validPCR(digestValue, PCRindex), creates data isPCR that proves to a trusted device that the value "digestValue" is a legitimate PCR value. The trusted device verifies that "digestValue" is the current value of the PCR indicated by PCRindex. The trusted device then creates an opaque blob containing the isPCR structure (digestValue, PCRindex) which can be recognised by the trusted device that created it as an unaltered blob that it created.
o A new trusted device capability, TPM_history(digestList1, isPCR1), creates data PCRhistory that proves to a trusted device that a start PCR value can be legitimately transformed into an end PCR value. The trusted device verifies that it created "digestList1" and extracts (start1, end1, end2). The trusted device verifies that it created "isPCR1" and extracts (digestValue, PCRindex). The trusted device verifies that "start1" is the reset value of the PCR indicated by "PCRindex" and that "end2" == "digestValue". The trusted device creates a PCRhistory structure with the contents (previousPCR==end1, currentPCR==end2, PCRindex).
In this approach, a management program uses capabilities such as those described above to prove to a trusted device that a current PCR value (PCRnow) is an extension of a previous PCR value (PCRold). Using the above commands as an example, TPM_digestPair(action, index, value) is used to create digestPairA with the contents (PCRresetvalue, PCRold), where PCRresetvalue is the value of the relevant PCR before the first value is extended into the PCR. Then TPM_digestPair(action, index, value) is used to create digestPairB with the contents (PCRold, PCRnow). Then TPM_DigestList(digestPairA, digestPairB) creates digestListC with the contents (PCRresetvalue, PCRold, PCRnow). TPM_validPCR(PCRnow, PCRindex) is used to create the isPCR1 structure (PCRnow, PCRindex). TPM_history(digestListC, isPCR1) is used to create the PCRhistory structure (PCRold, PCRnow, PCRindex). At any time in the future, PCRhistory can be loaded into the trusted device to prove to the trusted device that PCRold was an intermediate value in the creation of the value PCRnow in the PCR indicated by PCRindex.
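The worked example above can be traced in a toy model, again assuming SHA-1 for the extend operation and using plain tuples in place of the device's opaque blobs; all function names below are illustrative stand-ins for the prototype commands.

```python
import hashlib

def extend(current: bytes, value: bytes) -> bytes:
    # TCG-style extend: new state = H(current || value)
    return hashlib.sha1(current + value).digest()

RESET = b"\x00" * 20   # assumed PCR reset value

def valid_pcr(pcrs, digest_value, index):
    # Models TPM_validPCR: digestValue must be the current value of
    # the PCR indicated by index.
    assert pcrs[index] == digest_value
    return ("isPCR", digest_value, index)

def history(digest_list, is_pcr, pcrs):
    # Models TPM_history: start1 must be the PCR reset value and end2
    # the current PCR value; (end1, end2) is then a proven history.
    start1, end1, end2 = digest_list
    _, digest_value, index = is_pcr
    assert start1 == RESET and end2 == digest_value == pcrs[index]
    return {"previousPCR": end1, "currentPCR": end2, "PCRindex": index}

# Simulate a PCR that was extended twice: PCRreset -> PCRold -> PCRnow.
pcr_old = extend(RESET, b"statement-A")
pcr_now = extend(pcr_old, b"statement-B")
pcrs = {7: pcr_now}
h = history((RESET, pcr_old, pcr_now), valid_pcr(pcrs, pcr_now, 7), pcrs)
assert h["previousPCR"] == pcr_old and h["currentPCR"] == pcr_now
```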
The next step is to prove the equivalence of composite PCR values composed of multiple PCR values. The PCR values used in Trusted Computing Group sealed data are derived from sets of PCR values. In one approach, the trusted device is guided to create a data structure that provides proof to a trusted device that a starting set of PCR values can be transformed into an end set of PCR values via an intermediate set of PCR values by the process of extending values into the starting set of values. As before, there are many ways to achieve the same result if the trusted device checks signed statements created outside the trusted device.
In Trusted Computing Group terminology, sets of PCR values are described as TPM_COMPOSITE_HASH values. Trusted Computing Group defines a TPM_COMPOSITE_HASH value as the digest of a TPM_PCR_COMPOSITE structure, which is defined as:

typedef struct tdTPM_PCR_COMPOSITE {
    TPM_PCR_SELECTION select;
    UINT32 valueSize;
    [size_is(valueSize)] TPM_PCRVALUE pcrValue[];
} TPM_PCR_COMPOSITE;

This means that a TPM_PCR_COMPOSITE structure is (in essence) a TPM_PCR_SELECTION structure followed by a four-byte value, followed by a concatenated number of PCR values. A TPM_COMPOSITE_HASH value is the result of serially hashing those structures in a hash algorithm.
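The serial hashing of a TPM_PCR_COMPOSITE structure can be sketched as follows. SHA-1 and big-endian encoding of valueSize are assumptions consistent with TPM 1.2, and the select bytes used in the example are illustrative only.

```python
import hashlib
import struct

def composite_hash(select: bytes, pcr_values: list) -> bytes:
    # Sketch of a TPM_COMPOSITE_HASH: the digest of the serialised
    # TPM_PCR_COMPOSITE structure (select, then the 4-byte valueSize,
    # then the concatenated PCR values), hashed serially.
    h = hashlib.sha1()
    h.update(select)
    value_size = struct.pack(">I", sum(len(v) for v in pcr_values))
    h.update(value_size)
    for v in pcr_values:
        h.update(v)
    return h.digest()

pcr0 = hashlib.sha1(b"pcr0").digest()
pcr1 = hashlib.sha1(b"pcr1").digest()
select = b"\x00\x02\x03"   # illustrative TPM_PCR_SELECTION bytes
c = composite_hash(select, [pcr0, pcr1])
assert len(c) == 20
```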
In one implementation, a new trusted device capability, TPM_compositePair(action, compPairIndex, PCRhistory, select, valueSize), creates data that provides proof to a trusted device that the same starting set of PCR values can be transformed into two end sets of PCR values by the process of extending values into the starting set of values. (In TPM_compositePair, "select" and "valueSize" are the parameters defined by TCG in the typedef declaration reproduced above.) o If "action" == START, the trusted device feeds "select" followed by "valueSize" into two separate digest engines (engine_1 and engine_2), and returns the compPairIndex (a handle) of that pair of engines; o if "action" == CONTINUE, the trusted device verifies the PCRhistory parameter (PCRprevious, PCRcurrent, PCRindex) and extracts the PCR values and PCRindex. The trusted device verifies that PCRindex is greater than any previous PCRindex value, feeds "PCRprevious" into digest engine_1 and feeds "PCRcurrent" into digest engine_2, where the pair of engines is identified by compPairIndex; o if "action" == END, the trusted device verifies that the number of CONTINUE operations was [valueSize-1]. The trusted device returns an opaque blob containing a compositePair structure with contents (contents_of_engine_1, contents_of_engine_2, PCRindex) which can be recognised by the trusted device that created it as an unaltered blob that it created.
Of course, if the same value is to be fed into both engine_1 and engine_2, a single value can be input instead of a PCRhistory parameter.
In one approach to implementation, a management program uses capabilities such as TPM_compositePair(action, compPairIndex, digestList, select, valueSize) to prove to a trusted device that a starting set of PCR values can be transformed into an end set of PCR values via an intermediate set of PCR values by the process of extending values into the starting set of values. At any time in the future, the caller can use a compositePair structure to prove to a trusted device that the two values in the compositePair structure are equivalent as far as trust is concerned. Additional trusted device commands (not described here) can then be used to instruct a trusted device to replace an existing TPM_COMPOSITE_HASH value with another TPM_COMPOSITE_HASH value, provided that both TPM_COMPOSITE_HASH values are extant in a valid compositePair structure.
Preferably the existing Trusted Computing Group method of generating a composite PCR is changed to extending each subsequent PCR value into an intermediate composite PCR value. Then the command TPM_compositePair is not required.
In a complementary approach, a further new trusted device capability creates and signs credentials containing the contents of data blobs produced by these new capabilities. Such credentials could be supplied to third parties along with evidence of current platform state (such as that created by TPM_Quote), as evidence that a platform's new state is as trustworthy as a previous state. Such credentials must include a tag, such as digestPairData, compositePairData, and so on, to indicate the meaning of the credential. Such credentials should be signed using one of a TPM's Attestation Identities, in accordance with normal practice in the Trusted Computing Group, to preserve privacy.
This approach as a whole allows a later software state (as evidenced by associated PCR values) to be associated with an earlier software state. It is therefore possible for sealed blobs to be accessed despite the change of software state, as the new PCR values can be shown to be derivable from the old PCR values and a new TPM_COMPOSITE_HASH value can replace the old TPM_COMPOSITE_HASH value. A suitable approach is simply to update all such "sealed blobs" as one step in the process of upgrading or replacing software on the trusted platform.
If a platform does not have sufficient resources to perform statement verification processes, the trusted device can be provided with capabilities that perform those verification processes. Thus new trusted device capabilities could be used by a measurement entity to verify the signature on a statement, verify a linked list of statements, and so on. The trusted device could even act as a scratch pad to store intermediate results of verification processes, so the results of one verification process can be used by future verification processes without the need for storage outside the trusted device. Similar techniques are already used in Trusted Computing Group technology to compute digests and extend such digests into PCRs.

Claims (33)

  1. A method of providing evidence of a state of a computer platform, comprising: measuring a state of the computer platform, wherein measuring a state comprises a measurement of first software in the computer platform, to provide a first measured state; using the first measured state in evidence of the state of the computer platform; replacing first software with second software in the computer platform; measuring the state of the computer platform with the first software replaced by the second software to provide a second measured state; verifying that the second measured state is as trustworthy as the first measured state; and substituting the second measured state for the first measured state in evidence of the state of the computer platform.
  2. A method as claimed in claim 1, wherein the computer platform comprises a trusted device protected against subversion and the steps of measuring, verifying and substituting are carried out by the trusted device.
  3. A method as claimed in claim 1 or claim 2, wherein said first and second measured states comprise or are derived from digests of the first software and the second software respectively, and wherein the step of verifying comprises determining that the digest of the second software is related to the digest of the first software.
  4. A method as claimed in claim 3, wherein the relation between the digest of the first software and the digest of the second software is provided by a statement attested by a trusted software provider, and the verifying step comprises verifying the statement.
  5. A method as claimed in any of claims 2 to 4, wherein the trusted device comprises one or more platform configuration registers into which measurements are placed by concatenating a current platform configuration register value with measurement data, hashing the result and replacing the current platform configuration register value with the hashed result, wherein the first measured state is derived from platform configuration register values for the first software and the second measured state is derived from platform configuration register values after the first software has been replaced by the second software, and the verifying step comprises determining that the platform configuration register values for the first software are related to the platform configuration register values after the first software has been replaced by the second software.
  6. A method as claimed in claim 5, wherein the first and second measured states each comprise a value derived from a plurality of platform configuration registers, and the verifying step comprises determining that the first measured state value is related to the second measured state value.
  7. A method as claimed in claim 6, wherein the value is derived from the plurality of platform configuration registers by concatenating and hashing the values in those platform configuration registers.
  8. A method as claimed in any preceding claim, wherein the evidence comprises data sealed against a value derived from a measured state such that the data may only be accessed when a current value of the measured state of the computer platform corresponds to the measured value.
  9. 9. A computer platform containing a trusted device protected against subversion and adapted to measure and vouch for software on the computer platform, wherein the trusted device is adapted to determine when first software on the computer platform has been replaced by second software that the second software is functionally consistent with the first software and is as trustworthy as the first software.
  10. 10. A computer platform as claimed in claim 9 wherein the trusted device determines functional consistency and trust from a statement attested by a trusted software provider by verifying the statement.
  11. 11. A computer platform as claimed in claim 9 or claim 10, wherein the trusted device is adapted to measure a state of the computing platform from measurements including a measurement relating to the first software or the second software.
  12. 12. A computer platform as claimed in claim 11, wherein the trusted device comprises one or more platform configuration registers into which measurements are placed by concatenating a current platform configuration register value with measurement data, hashing the result and replacing the current platform configuration register value with the hashed result, wherein the trusted device is adapted to determine when a first platform state represented by platform configuration register values determined by measurements including a measurement relating to the first software is equivalent to a second platform state represented by platform configuration register values determined by measurements including a measurement relating to the second software.
  13. 13. A computer platform as claimed in claim 11 or claim 12, wherein the trusted device is adapted to provide evidence of platform states.
  14. 14. A computer platform as claimed in any of claims 11 to 13, wherein the trusted device is adapted to seal data against a value derived from a measured platform state such that the data may only be accessed when a current value of the platform state corresponds to the measured value.
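Claims 7 and 12 describe two standard trusted-device operations: extending a platform configuration register (PCR) by concatenate-and-hash, and deriving a single composite value from a plurality of PCRs. The following is a minimal illustrative sketch only; the choice of SHA-1 and 20-byte registers follows TPM 1.2 convention and is an assumption, since the claims do not mandate a particular hash algorithm:

```python
import hashlib

def pcr_extend(pcr_value: bytes, measurement: bytes) -> bytes:
    # Claim 12: concatenate the current PCR value with the measurement
    # data, hash the result, and use the digest as the new PCR value.
    return hashlib.sha1(pcr_value + measurement).digest()

def composite_value(pcr_values: list) -> bytes:
    # Claim 7: derive a single value from a plurality of PCRs by
    # concatenating and hashing their contents.
    return hashlib.sha1(b"".join(pcr_values)).digest()

# A PCR starts from a known initial value (all zeroes in TPM 1.2) and
# accumulates a record of every measurement extended into it.
pcr = bytes(20)
pcr = pcr_extend(pcr, hashlib.sha1(b"first software image").digest())
```

Because each new register value depends on the previous one, the final PCR contents commit to the whole ordered sequence of measurements, which is what allows two platform states (claim 12) to be compared by comparing the values such sequences produce.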
  15. A method of providing an upgrade or replacement for identified software, comprising: determining for the upgrade or replacement a statement indicating the nature of the software and a proof that the upgrade or replacement is as trustworthy as the identified software; and providing the upgrade or replacement with the statement.
  16. A method as claimed in claim 15, wherein the proof comprises an attestation from a trusted provider of the upgrade or replacement.
  17. A method as claimed in claim 15 or claim 16, wherein the proof comprises identifying the statement as part of a linked list of statements relating to the identified software, and verifying that the linked list is a valid linked list.
  18. A method as claimed in any of claims 15 to 17, further comprising providing a second statement, wherein the statement contains one or more digests, or data derived from digests, of the upgrade or replacement software, whereas the second statement has the same indication of the nature of the software and proof that the upgrade or replacement is as trustworthy as the identified software as the statement but excludes the one or more digests.
  19. A data structure, comprising: an identification of a software type; a proof that two or more instances of that software type are as trustworthy as each other.
  20. A data structure as claimed in claim 19, wherein the proof comprises an attestation from a trusted provider of the upgrade or replacement.
  21. A data structure as claimed in claim 19 or claim 20, wherein the proof comprises an identification of the statement as part of a linked list of statements relating to the identified software.
  22. A data structure as claimed in any of claims 19 to 21, wherein the data structure further comprises one or more digests, or data derived from digests, of instances of the software type.
  23. A data structure as claimed in any of claims 19 to 21, wherein the data structure contains no digests, or data derived from digests, of instances of the software type.
  24. A pair of data structures comprising a data structure as claimed in claim 22 and a data structure as claimed in claim 23, wherein each data structure has the same identification of a software type; and proof that two or more instances of that software type are as trustworthy as each other.
  25. A method of providing an integrity metric on a computing platform containing a trusted device protected from subversion, comprising: measuring at least a part of a data structure as claimed in any of claims 19 to 22; and recording the measurement in the trusted device.
  26. A method as claimed in claim 25, wherein the data structure forms part of a linked list of data structures, wherein the measuring step comprises measuring at least a part of the data structure for the first and last data structures in the linked list.
  27. A method as claimed in claim 25 or claim 26, further comprising separately recording whether verification that the data structure is a valid proof that two or more instances of that software type are as trustworthy as each other was successful.
  28. A method of determining a software state used by a computing platform, comprising: requesting an indication of software state from the computing platform; and receiving one or more indications of software type attested by a software provider and verified by the computing platform.
  29. A method as claimed in claim 28, wherein for an indication of software type, privacy is preserved by receiving a plurality of digests, or derivatives of digests, of instances of that software type.
  30. A method as claimed in claim 28, wherein for an indication of software type, privacy is preserved by receiving no digests of, or other material directly derived from, instances of that software type.
  31. A method for a computing platform to report a software state, comprising: receiving a request for an indication of software state of the computing platform; and providing one or more indications of software type attested by a software provider and verified by the computing platform.
  32. A method as claimed in claim 31, wherein for an indication of software type, privacy is preserved by providing a plurality of digests, or derivatives of digests, of instances of that software type.
  33. A method as claimed in claim 31, wherein for an indication of software type, privacy is preserved by providing no digests of, or other material directly derived from, instances of that software type.
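Claims 15 to 27 concern statements that can form a linked list proving that successive versions of identified software are equally trustworthy. The sketch below is a hypothetical rendering of such a data structure and its verification: the field names, the SHA-256 linking, and the omission of real signature checking on the provider's attestation are all illustrative assumptions, not the format defined by this application.

```python
import hashlib
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Statement:
    software_type: str          # identification of a software type (claim 19)
    digest: Optional[bytes]     # digest of one software instance, or None (claims 22 and 23)
    prev_hash: Optional[bytes]  # hash of the preceding statement, forming the linked list (claim 21)

    def statement_hash(self) -> bytes:
        h = hashlib.sha256()
        h.update(self.software_type.encode())
        h.update(self.digest or b"")
        h.update(self.prev_hash or b"")
        return h.digest()

def verify_chain(chain: List[Statement]) -> bool:
    # A valid chain keeps one software type throughout and links each
    # statement to its predecessor, so the newest entry inherits the
    # trustworthiness vouched for the oldest (claims 17 and 20).
    for prev, cur in zip(chain, chain[1:]):
        if cur.software_type != prev.software_type:
            return False
        if cur.prev_hash != prev.statement_hash():
            return False
    return True

v1 = Statement("example-loader", hashlib.sha256(b"v1 image").digest(), None)
v2 = Statement("example-loader", hashlib.sha256(b"v2 image").digest(), v1.statement_hash())
```

A verifier that trusts the first statement and checks the links can accept the v2 upgrade as an equally trustworthy replacement for v1 without needing any direct knowledge of the v1 image.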
GB0505746A 2005-03-22 2005-03-22 Methods, devices and data structures for trusted data Withdrawn GB2424494A (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
GB0505746A GB2424494A (en) 2005-03-22 2005-03-22 Methods, devices and data structures for trusted data
EP06710180A EP1866825A1 (en) 2005-03-22 2006-03-22 Methods, devices and data structures for trusted data
US11/908,920 US8539587B2 (en) 2005-03-22 2006-03-22 Methods, devices and data structures for trusted data
PCT/GB2006/050063 WO2006100522A1 (en) 2005-03-22 2006-03-22 Methods, devices and data structures for trusted data
JP2008502491A JP4732508B2 (en) 2005-03-22 2006-03-22 Methods, devices, and data structures for trusted data
CN200680009269.XA CN101147154B (en) 2005-03-22 2006-03-22 Methods, devices and data structures for trusted data
CN 200910137034 CN101551841B (en) 2005-03-22 2006-03-22 Methods, devices and data structures for trusted data
EP09178175.7A EP2194476B1 (en) 2005-03-22 2006-03-22 Method and apparatus for creating a record of a software-verification attestation
US13/779,400 US9111119B2 (en) 2005-03-22 2013-02-27 Methods, devices and data structures for trusted data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0505746A GB2424494A (en) 2005-03-22 2005-03-22 Methods, devices and data structures for trusted data

Publications (2)

Publication Number Publication Date
GB0505746D0 GB0505746D0 (en) 2005-04-27
GB2424494A true GB2424494A (en) 2006-09-27

Family

ID=34531579

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0505746A Withdrawn GB2424494A (en) 2005-03-22 2005-03-22 Methods, devices and data structures for trusted data

Country Status (2)

Country Link
CN (2) CN101147154B (en)
GB (1) GB2424494A (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102282564B (en) * 2009-02-18 2014-10-15 松下电器产业株式会社 Information processing device and information processing method
CN105515776A (en) * 2010-03-05 2016-04-20 交互数字专利控股公司 Method and apparatus for providing security to devices
US8516551B2 (en) * 2010-07-28 2013-08-20 Intel Corporation Providing a multi-phase lockstep integrity reporting mechanism
US8943334B2 (en) 2010-09-23 2015-01-27 Intel Corporation Providing per core voltage and frequency control

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1182534A2 (en) * 2000-08-18 2002-02-27 Hewlett-Packard Company Apparatus and method for establishing trust
EP1282027A1 (en) * 2001-07-30 2003-02-05 Hewlett-Packard Company Trusted platform evaluation
US20050033987A1 (en) * 2003-08-08 2005-02-10 Zheng Yan System and method to establish and maintain conditional trust by stating signal of distrust

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7216369B2 (en) * 2002-06-28 2007-05-08 Intel Corporation Trusted platform apparatus, system, and method
US7200758B2 (en) * 2002-10-09 2007-04-03 Intel Corporation Encapsulation of a TCPA trusted platform module functionality within a server management coprocessor subsystem
CN100334555C (en) * 2002-12-27 2007-08-29 技嘉科技股份有限公司 Upgrading control method for intelligent cured software

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Marchesini et al., "Open-source applications of TCPA hardware", 20th Annual Computer Security Applications Conference, Arizona, 6-10 December 2004, http://www.acsac.org/2004/papers/81.pdf *
Reid, J. et al., "Privacy and trusted computing", Information Security Research Centre, Queensland University of Technology, Brisbane, Australia, 21 March 2003, http://citeseer.ist.psu.edu/709014.html *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2448379A (en) * 2007-04-13 2008-10-15 Hewlett Packard Development Co Dynamic trust management in computing platforms
US8060934B2 (en) 2007-04-13 2011-11-15 Hewlett-Packard Development Company, L.P. Dynamic trust management
GB2448379B (en) * 2007-04-13 2011-12-14 Hewlett Packard Development Co Dynamic trust management
US8850212B2 (en) 2010-05-21 2014-09-30 Hewlett-Packard Development Company, L.P. Extending an integrity measurement
GB2482652B (en) * 2010-05-21 2016-08-24 Hewlett Packard Development Co Lp Extending integrity measurements in a trusted device using a policy register
CN107077568A (en) * 2014-11-17 2017-08-18 英特尔公司 symmetric key and trust chain
EP3221996A4 (en) * 2014-11-17 2018-07-25 Intel Corporation Symmetric keying and chain of trust
CN107077568B (en) * 2014-11-17 2020-08-25 英特尔公司 Symmetric keys and Trust chains
US20210374292A1 (en) * 2020-05-26 2021-12-02 Robert Bosch Gmbh Method for operating an electronic device

Also Published As

Publication number Publication date
GB0505746D0 (en) 2005-04-27
CN101551841B (en) 2012-10-03
CN101147154A (en) 2008-03-19
CN101147154B (en) 2010-12-22
CN101551841A (en) 2009-10-07

Similar Documents

Publication Publication Date Title
US8539587B2 (en) Methods, devices and data structures for trusted data
US8850212B2 (en) Extending an integrity measurement
US8060934B2 (en) Dynamic trust management
US7467370B2 (en) Apparatus and method for creating a trusted environment
US20050268093A1 (en) Method and apparatus for creating a trusted environment in a computing platform
US8689318B2 (en) Trusted computing entities
US7437568B2 (en) Apparatus and method for establishing trust
US20100115625A1 (en) Policy enforcement in trusted platforms
EP1030237A1 (en) Trusted hardware device in a computer
Kühn et al. Realizing property-based attestation and sealing with commonly available hard- and software
US9710658B2 (en) Computing entities, platforms and methods operable to perform operations selectively using different cryptographic algorithms
GB2424494A (en) Methods, devices and data structures for trusted data
US11232209B2 (en) Trojan detection in cryptographic hardware adapters
Welter Data Protection and Risk Management on Personal Computer Systems Using the Trusted Platform Module
GB2412822A (en) Privacy preserving interaction between computing entities

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)