US20110208966A1 - Integrated circuit for authentication of consumable storage device - Google Patents


Publication number
US20110208966A1
US20110208966A1
Authority
US
United States
Prior art keywords
authentication
chip
bits
key
bit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/086,359
Other languages
English (en)
Inventor
Kia Silverbrook
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Silverbrook Research Pty Ltd
Original Assignee
Silverbrook Research Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from Australian provisional application AUPO799197A0
Application filed by Silverbrook Research Pty Ltd
Priority to US 13/086,359
Assigned to Silverbrook Research Pty Ltd (assignor: Kia Silverbrook)
Publication of US20110208966A1
Legal status: Abandoned

Classifications

    • G06F 21/77: Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer, to assure secure computing or processing of information in smart cards
    • G06F 21/31: User authentication
    • H04L 9/002: Countermeasures against attacks on cryptographic mechanisms
    • H04L 9/3218: Verifying the identity or authority of a user using proof of knowledge, e.g. Fiat-Shamir, GQ, Schnorr, or non-interactive zero-knowledge proofs
    • H04L 9/3242: Verifying the identity or authority of a user using keyed cryptographic hash functions, e.g. message authentication codes [MACs], CBC-MAC or HMAC
    • H04L 9/3271: Verifying the identity or authority of a user using challenge-response
    • G06F 2221/2103: Challenge-response (indexing scheme relating to G06F21/00)
    • G06F 2221/2129: Authenticate client device independently of the user (indexing scheme relating to G06F21/00)
    • G06F 2221/2143: Clearing memory, e.g. to prevent the data from being stolen (indexing scheme relating to G06F21/00)
    • H04L 2209/043: Masking or blinding of tables, e.g. lookup, substitution or mapping
    • H04L 2209/08: Randomization, e.g. dummy operations or using noise

Definitions

  • This invention relates to an integrated circuit for the authentication of a consumable storage device.
  • the process of authentication has particular application in any system (chip or software) that manipulates secure data.
  • This includes Internet commerce, peer-to-peer communication, Smart Cards, Authentication chips, electronic keys, and cryptographic equipment. Whilst the description of the preferred embodiments of the present invention assumes a System/consumable relationship, it is a trivial matter to extend the protocol for other uses.
  • An example is Internet commerce, where each consumer is effectively the consumable, and the Shop is the System.
  • Smart Cards where each smart card can have a unique key, known to the System.
  • an integrated circuit for the authentication of a consumable storage device by an apparatus comprising a memory space which contains encrypted data defined by a message authentication code (MAC) applied to data relating to a consumable stored by the device and by at least one secret key (K) shared by the apparatus for decryption of the data, the MAC being a construction of a cryptographic function.
  • MAC message authentication code
  • K secret key
  • the cryptographic function may be a hash function such that the MAC is an algorithm known as HMAC.
  • the hash function may be one of an MD5 function and a SHA-1 function.
  • the hash function may be an SHA-1 function.
  • the integrated circuit may be configured to define a number of temporary registers and rotating counters and to calculate an output word on an iterative basis by calculating and allocating words to respective registers during processing of the SHA-1 function.
  • the memory space of the integrated circuit may include two secret keys, K 1 and K 2 , the integrated circuit being configured such that the key K 1 is used to decrypt an encrypted random number generated by the apparatus and the key K 2 is used to decrypt encrypted data stored in the memory space.
  • a method of encrypting data relating to a consumable of a consumable storage device for an apparatus and stored by an integrated circuit including the steps of:
  • MAC message authentication code
  • the invention is an authentication chip including an OverUnderPower detection unit to prevent power supply attacks, the unit comprising: a first comparator having a first input connected to a reference voltage and a second input connected to a power supply line to detect voltage rises above a predetermined limit, and a second comparator having a first input connected to the reference voltage and a second input connected to the power supply line to detect voltage drops below a predetermined limit, and an output to provide a RESET signal to clear all volatile memory in the chip in the event of the power supply voltage exceeding the predetermined limits.
  • the OverUnderPower Detection Unit detects power glitches and tests the power level against the Voltage Reference to ensure it is within a certain tolerance
  • the OverUnderPower Detection Unit may be connected into the RESET Tamper Detection Line, to cause a RESET when triggered.
  • a side effect of the OverUnderPower Detection Unit is that as the voltage drops during a power-down, a RESET is triggered, thus erasing any work registers.
  • the OverUnderPower Detection unit may be implemented in CMOS.
  • the OverUnderPower Detection unit may be covered by a tamper detection line, or by a tamper prevention line, or both, so that if an attacker attempts to tamper with the unit, the chip will either RESET or erase all secret information.
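The window-comparator behaviour of the OverUnderPower Detection Unit can be modelled in software. The sketch below is illustrative only: the function name, reference voltage and tolerance are assumptions, not values taken from the patent. A supply sample outside the window around the Voltage Reference corresponds to the RESET signal being asserted.

```python
def power_ok(v_dd, v_ref=3.3, tol=0.10):
    """Model of the OverUnderPower Detection Unit: the supply voltage
    v_dd is checked against the reference v_ref within tolerance tol.
    Returns False (RESET asserted, volatile memory cleared) when v_dd
    leaves the allowed window."""
    lo = v_ref * (1.0 - tol)   # second comparator: under-voltage limit
    hi = v_ref * (1.0 + tol)   # first comparator: over-voltage limit
    return lo <= v_dd <= hi
```

In hardware this is two comparators sharing one reference; the model simply evaluates both limits on each sample.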
  • FIG. 1 illustrates a single authentication chip data protocol
  • FIG. 2 illustrates a dual authentication chip data protocol
  • FIG. 3 illustrates a first presence only protocol
  • FIG. 4 illustrates a second presence only protocol
  • FIG. 5 illustrates a third data protocol
  • FIG. 6 illustrates a fourth data protocol
  • FIG. 7 is a schematic block diagram of a maximal period LFSR
  • FIG. 8 is a schematic block diagram of a clock limiting filter
  • FIG. 9 is a schematic block diagram of the tamper detection lines
  • FIG. 10 illustrates an oversized nMOS transistor used as test transistors in the tamper detection line of FIG. 9 ;
  • FIG. 11 is a circuit diagram of part of the tamper detection line of FIG. 9 including XOR gates between the two paths;
  • FIG. 12 illustrates how the Tamper Lines cover the noise generator circuitry
  • FIG. 13 is a circuit diagram of the normal FET implementation of a CMOS inverter
  • FIG. 14 is a voltage/current diagram for the transistors of the CMOS inverter of FIG. 13 ;
  • FIG. 15 is a circuit diagram of the FET implementation of a non-flashing CMOS inverter
  • FIG. 16 is an impedance diagram for the transistors of the CMOS inverter of FIG. 15 .
  • the authentication chip of the preferred embodiment is responsible for ensuring that only correctly manufactured print rolls are utilized in the camera system.
  • the authentication chip utilizes technologies that are generally valuable when utilized with any consumables and are not restricted to the print roll system.
  • Manufacturers of other systems that require consumables, such as a laser printer that requires toner cartridges, could also make use of these authentication technologies.
  • the prevention of copying is important to prevent poorly manufactured substitute consumables from damaging the base system. For example, poorly filtered ink may clog print nozzles in an ink jet printer, causing the consumer to blame the system manufacturer and not admit the use of non-authorized consumables.
  • the Authentication chip contains an authentication code and circuit specially designed to prevent copying.
  • the chip is manufactured using the standard Flash memory manufacturing process, and is low cost enough to be included in consumables such as ink and toner cartridges.
  • the Authentication chips as described here are compliant with the NSA export guidelines. Authentication is an extremely large and constantly growing field. Here we are concerned with authenticating consumables only.
  • X = Y: X is equal to Y
  • X ≠ Y: X is not equal to Y
  • Decrement X by 1 (floor 0)
  • Increment X by 1 (with wrapping based on register length)
  • Erase X: erase Flash memory register X
  • SetBits[X, Y]: set the bits of Flash memory register X based on Y
  • Z ← ShiftRight[X, Y]: shift register X right one bit position, taking the input bit from Y and placing the output bit in Z
  • the process of transforming M into ciphertext C, where the substance of M is hidden, is called encryption.
  • the process of transforming C back into M is called decryption.
  • E the encryption function
  • D the decryption function
  • a symmetric encryption algorithm is one where the same secret key (or a pair of keys, each easily derived from the other) is used for both encryption and decryption.
  • K 1 usually equals K 2 .
  • even where K 1 does not equal K 2 , given that one key can be derived from the other, a single key K can suffice for the mathematical definition.
  • DES Data Encryption Standard
  • the key length is 56 bits. It has been implemented in hardware and software, although the original design was for hardware only. The original algorithm used in DES is described in U.S. Pat. No. 3,962,539.
  • a variant of DES, called triple-DES is more secure, but requires 3 keys: K 1 , K 2 , and K 3 .
  • the keys are used in the standard encrypt-decrypt-encrypt manner: C = E K3 [D K2 [E K1 [M]]]
  • triple-DES gives protection of equivalent key length of 112 bits.
  • Triple-DES does not give the equivalent protection of a 168-bit key (3 ⁇ 56) as one might naively expect.
  • Equipment that performs triple-DES decoding and/or encoding cannot be exported from the United States.
  • Blowfish is a symmetric block cipher first presented by Schneier in 1994. It takes a variable length key, from 32 bits to 448 bits. In addition, it is much faster than DES.
  • the Blowfish algorithm consists of two parts: a key-expansion part and a data-encryption part. Key expansion converts a key of at most 448 bits into several subkey arrays totaling 4168 bytes. Data encryption occurs via a 16-round Feistel network. All operations are XORs and additions on 32-bit words, with four index array lookups per round. It should be noted that decryption is the same as encryption except that the subkey arrays are used in the reverse order. Complexity of implementation is therefore reduced compared to other algorithms that do not have such symmetry.
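The encryption/decryption symmetry noted above for Blowfish comes from its Feistel structure, and can be illustrated generically. The sketch below is not Blowfish itself: the round function `example_f` is a hypothetical stand-in for Blowfish's S-box based F, and no key expansion is performed. It only demonstrates that running the same network with the subkeys in reverse order inverts the cipher.

```python
def example_f(x, k):
    # Toy round function (stand-in for Blowfish's S-box based F).
    return ((x + k) * 2654435761) & 0xFFFFFFFF

def feistel_encrypt(left, right, subkeys, f=example_f):
    # Each round: swap halves and XOR one half with f(other half, subkey).
    for k in subkeys:
        left, right = right, left ^ f(right, k)
    return left, right

def feistel_decrypt(left, right, subkeys, f=example_f):
    # Decryption is the same network with subkeys used in reverse order,
    # which is the symmetry that reduces implementation complexity.
    for k in reversed(subkeys):
        right, left = left, right ^ f(left, k)
    return left, right
```

Because each round only XORs one half with a function of the other, the round is invertible regardless of whether f itself is invertible.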
  • RC5 has a variable block size, key size, and number of rounds. Typically, however, it uses a 64-bit block size and a 128-bit key.
  • the RC5 algorithm consists of two parts: a key-expansion part and a data-encryption part.
  • Data encryption uses addition mod 2 w , XOR and bitwise rotation.
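The three RC5 primitives (addition mod 2^w, XOR, and data-dependent rotation) are simple to express for w = 32. The combination in `rc5_half_round` is an illustrative mixing step, not a round taken from the RC5 specification.

```python
W = 32
MASK = (1 << W) - 1

def rotl(x, r):
    # Data-dependent left rotation of a w-bit word.
    r %= W
    return ((x << r) | (x >> (W - r))) & MASK

def rc5_half_round(a, b, k):
    # Illustrative combination of the three primitives:
    # XOR, rotate by an amount derived from the data, add a subkey mod 2^w.
    return (rotl(a ^ b, b) + k) & MASK
```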
  • IDEA uses 128-bit keys to operate on 64-bit plaintext blocks.
  • the same algorithm is used for encryption and decryption. It is generally regarded to be the most secure block algorithm available today. It is described in U.S. Pat. No. 5,214,703, issued in 1993.
  • An asymmetric encryption algorithm is one where K 1 does not equal K 2 , and it is computationally infeasible to derive one key from the other; one of the keys can therefore be made public.
  • the RSA cryptosystem named after Rivest, Shamir, and Adleman, is the most widely used public-key cryptosystem, and is a de facto standard in much of the world.
  • the security of RSA is conjectured to depend on the difficulty of factoring large numbers that are the product of two primes (p and q).
  • p and q primes
  • p and q should be strong primes.
  • the RSA algorithm patent was issued in 1983 (U.S. Pat. No. 4,405,829).
  • DSA (Digital Signature Algorithm)
  • DSA is an algorithm designed as part of the Digital Signature Standard (DSS). As defined, it cannot be used for generalized encryption. In addition, compared to RSA, DSA is 10 to 40 times slower for signature verification. DSA explicitly uses the SHA-1 hashing algorithm (see definition in One-way Functions below). DSA key generation relies on finding two primes p and q such that q divides p−1. According to Schneier, a 1024-bit p value is required for long term DSA security. However the DSA standard does not permit values of p larger than 1024 bits (p must also be a multiple of 64 bits). The US Government owns the DSA algorithm and has at least one relevant patent (U.S. Pat. No. 5,231,688 granted in 1993).
  • the ElGamal scheme is used for both encryption and digital signatures.
  • the security is based on the difficulty of calculating discrete logarithms in a finite field.
  • the public key is y, g, and p.
  • the private key is x.
  • the general principle of a challenge-response protocol is to provide identity authentication adapted to a camera system.
  • the simplest form of challenge-response takes the form of a secret password.
  • A asks B for the secret password, and if B responds with the correct password, A declares B authentic.
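A password sent in the clear can simply be recorded and replayed. A stronger challenge-response exchange has the System send a fresh random challenge and the consumable return a keyed one-way function of it. The sketch below is illustrative: the key value and function names are assumptions, and SHA-1-based HMAC is used only because those are the primitives discussed in this document.

```python
import hashlib
import hmac
import os

K = b"shared-secret-key"   # secret key K held by both System and chip

def chip_respond(challenge, key=K):
    # Consumable side: return a keyed one-way function of the challenge,
    # proving knowledge of K without revealing it.
    return hmac.new(key, challenge, hashlib.sha1).digest()

def system_authenticate(key=K):
    # System side: a fresh random challenge defeats simple replay of
    # previously recorded responses.
    r = os.urandom(20)
    response = chip_respond(r)
    expected = hmac.new(key, r, hashlib.sha1).digest()
    return hmac.compare_digest(response, expected)
```

Because the challenge varies each time, an attacker who records one exchange gains nothing useful for the next.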
  • the Zero Knowledge Proof protocol first described by Feige, Fiat and Shamir is extensively used in Smart Cards for the purpose of authentication.
  • the protocol's effectiveness is based on the assumption that it is computationally infeasible to compute square roots modulo a large composite integer with unknown factorization. This is provably equivalent to the assumption that factoring large integers is difficult. It should be noted that there is no need for the claimant to have significant computing power. Smart cards implement this kind of authentication using only a few modular multiplications.
  • the Zero Knowledge Proof protocol is described in U.S. Pat. No. 4,748,668.
  • a one-way function F operates on an input X, and returns F[X] such that X cannot be determined from F[X].
  • since F[X] contains fewer bits than X, collisions must exist.
  • since X contains more bits than F[X], the input must be compressed in some way to create the output. In many cases, X is broken into blocks of a particular size, and compressed over a number of rounds, with the output of one round being the input to the next.
  • the output of the hash function is the last output once X has been consumed.
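The block-by-block iterative structure described above can be illustrated with a toy compression function. This is deliberately insecure and purely illustrative: the constants, padding rule and function name are arbitrary choices, not part of any real hash.

```python
def toy_iterative_hash(message, block_size=4, iv=0x12345678):
    """Toy illustration of the iterative structure: X is broken into
    fixed-size blocks; each round compresses (state, block) -> state;
    the final state once X has been consumed is the hash.
    NOT cryptographically secure."""
    # Pad with zero bytes to a whole number of blocks.
    if len(message) % block_size:
        message += b"\x00" * (block_size - len(message) % block_size)
    state = iv
    for i in range(0, len(message), block_size):
        block = int.from_bytes(message[i:i + block_size], "big")
        # Toy compression function: mix the block into the chaining variable.
        state = ((state * 2654435761) ^ block) & 0xFFFFFFFF
    return state
```

Real hash functions such as SHA-1 share this shape but use a far stronger compression function and length-encoding padding.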
  • the selection of a good key depends on the encryption algorithm chosen. Certain keys are not strong for particular encryption algorithms, so any key needs to be tested for strength. The more tests that need to be performed for key selection, the less likely the key will remain hidden.
  • examples of collision-resistant one-way hash functions are SHA-1, MD5 and RIPEMD-160, all derived from MD4.
  • MD5 Ron Rivest introduced MD5 in 1991 as a more secure MD4. Like MD4, MD5 produces a 128-bit hash value. Dobbertin describes the status of MD5 after recent attacks. He describes how pseudo-collisions have been found in MD5, indicating a weakness in the compression function, and more recently, collisions have been found. This means that MD5 should not be used for compression in digital signature schemes where the existence of collisions may have dire consequences. However MD5 can still be used as a one-way function. In addition, the HMAC-MD5 construct is not affected by these recent attacks.
  • SHA-1 is very similar to MD5, but has a 160-bit hash value (MD5 only has 128 bits of hash value).
  • SHA-1 was designed and introduced by the NIST and NSA for use in the Digital Signature Standard (DSS). The original published description was called SHA, but very soon afterwards, was revised to become SHA-1, supposedly to correct a security flaw in SHA (although the NSA has not released the mathematical reasoning behind the change). There are no known cryptographic attacks against SHA-1. It is also more resistant to brute-force attacks than MD4 or MD5 simply because of the longer hash result.
  • the US Government owns the SHA-1 and DSA algorithms (a digital signature authentication algorithm defined as part of DSS) and has at least one relevant patent (U.S. Pat. No. 5,231,688 granted in 1993).
  • RIPEMD-160 is a hash function derived from its predecessor RIPEMD (developed for the European Community's RIPE project in 1992). As its name suggests, RIPEMD-160 produces a 160-bit hash result. Tuned for software implementations on 32-bit architectures, RIPEMD-160 is intended to provide a high level of security for 10 years or more. Although there have been no successful attacks on RIPEMD-160, it is comparatively new and has not been extensively cryptanalyzed. The original RIPEMD algorithm was specifically designed to resist known cryptographic attacks on MD4. The recent attacks on MD5 showed similar weaknesses in the RIPEMD 128-bit hash function. Although the attacks showed only theoretical weaknesses, Dobbertin, Preneel and Bosselaers further strengthened RIPEMD into a new algorithm RIPEMD-160.
  • Message authentication is different from entity authentication.
  • entity authentication one entity (the claimant) proves its identity to another (the verifier).
  • message authentication we are concerned with making sure that a given message is from who we think it is from i.e. it has not been tampered with en route from the source to its destination.
  • a one-way hash function is not sufficient protection for a message.
  • Hash functions such as MD5 rely on generating a hash value that is representative of the original input, and the original input cannot be derived from the hash value.
  • a simple attack by E who is in-between A and B, is to intercept the message from B, and substitute his own. Even if A also sends a hash of the original message, E can simply substitute the hash of his new message.
  • MAC Message Authentication Code
  • hash functions were never originally designed to contain a key or to support message authentication.
  • some ad hoc methods of using hash functions to perform message authentication including various functions that concatenate messages with secret prefixes, suffixes, or both have been proposed.
  • Most of these ad hoc methods have been successfully attacked by sophisticated means.
  • Additional MACs have been suggested based on XOR schemes and Toeplitz matrices (including the special case of LFSR-based constructions).
  • the HMAC construction in particular is gaining acceptance as a solution for Internet message authentication security protocols.
  • the HMAC construction acts as a wrapper, using the underlying hash function in a black-box way. Replacement of the hash function is straightforward if desired due to security or performance reasons.
  • the major advantage of the HMAC construct is that it can be proven secure provided the underlying hash function has some reasonable cryptographic strengths—that is, HMAC's strengths are directly connected to the strength of the hash function. Since the HMAC construct is a wrapper, any iterative hash function can be used in an HMAC. Examples include HMAC-MD5, HMAC-SHA1, HMAC-RIPEMD160 etc. Given the following definitions:
  • the HMAC algorithm is as follows: HMAC[M] = H[(K ⊕ opad) | H[(K ⊕ ipad) | M]], where | denotes concatenation, ipad is the byte 0x36 repeated to fill the hash block length, and opad is the byte 0x5C repeated to fill the hash block length.
  • the recommended key length is at least n bits, although it should not be longer than 64 bytes (the length of the hashing block). A key longer than n bits does not add to the security of the function.
  • HMAC optionally allows truncation of the final output e.g. truncation to 128 bits from 160 bits.
  • the HMAC designers' Request for Comments was issued in 1997, one year after the algorithm was first introduced. The designers claimed that the strongest known attack against HMAC is based on the frequency of collisions for the hash function H and is totally impractical for minimally reasonable hash functions. More recently, HMAC protocols with replay prevention components have been defined in order to prevent the capture and replay of any M, HMAC[M] combination within a given time period.
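The HMAC construction can be reproduced directly from the H[(K ⊕ opad) | H[(K ⊕ ipad) | M]] definition, using SHA-1 as the underlying hash in its black-box role. This sketch follows the standard construction (RFC 2104): keys longer than the 64-byte hash block are first hashed, and shorter keys are zero-padded.

```python
import hashlib

def hmac_sha1(key, message, block_len=64):
    # Keys longer than the hash block length are hashed first;
    # shorter keys are zero-padded to the block length.
    if len(key) > block_len:
        key = hashlib.sha1(key).digest()
    key = key.ljust(block_len, b"\x00")
    ipad = bytes(b ^ 0x36 for b in key)   # K XOR ipad
    opad = bytes(b ^ 0x5C for b in key)   # K XOR opad
    # Inner hash over the message, outer hash over the inner digest:
    # HMAC[M] = H[(K xor opad) | H[(K xor ipad) | M]]
    inner = hashlib.sha1(ipad + message).digest()
    return hashlib.sha1(opad + inner).digest()
```

Swapping `hashlib.sha1` for another iterative hash yields HMAC-MD5, HMAC-RIPEMD160, and so on, which is exactly the wrapper property described above.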
  • random number generator theory is very much intertwined with cryptography, security, and authentication.
  • Knuth describes what makes a generator good (including statistical tests), and the general problems associated with constructing them.
  • One of the uses for random numbers is to ensure that messages vary over time. Consider a system where A encrypts commands and sends them to B. If the encryption algorithm produces the same output for a given input, an attacker could simply record the messages and play them back to fool B. There is no need for the attacker to crack the encryption mechanism other than to know which message to play to B (while pretending to be A).
  • messages often include a random number and a time stamp to ensure that the message (and hence its encrypted counterpart) varies each time.
  • Random number generators are also often used to generate keys. It is therefore best to assume, at present, that all such generators are insecure for this purpose.
  • the Berlekamp-Massey algorithm is a classic attack on an LFSR random number generator. If the LFSR is of length n, then only 2n bits of the sequence suffice to determine the LFSR, compromising the key generator. If, however, the only role of the random number generator is to make sure that messages vary over time, the security of the generator and seed is not as important as it is for session key generation.
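For illustration, here is a software model of a small maximal-period LFSR. This is a hypothetical 4-bit register with taps corresponding to the primitive polynomial x^4 + x^3 + 1, not the LFSR of FIG. 7; its state cycles through all 2^4 − 1 = 15 non-zero values, and per the Berlekamp-Massey result above, observing only 2n = 8 output bits would suffice to determine it.

```python
def lfsr_stream(seed, taps=(4, 3), width=4):
    """Fibonacci LFSR; taps (4, 3) correspond to the primitive polynomial
    x^4 + x^3 + 1, giving the maximal period 2^4 - 1 = 15."""
    state = seed & ((1 << width) - 1)
    assert state != 0, "the all-zero state locks up an LFSR"
    while True:
        # Feedback bit is the XOR of the tapped positions (1-indexed).
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
        yield state
```

The period check below confirms the maximal-length property: the 15 successive states are all distinct, and the 16th repeats the 1st.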
  • Random number seed generator If however, the random number seed generator is compromised, and an attacker is able to calculate future “random” numbers, it can leave some protocols open to attack. Any new protocol should be examined with respect to this situation.
  • the actual type of random number generator required will depend upon the implementation and the purposes for which the generator is used. Generators include Blum, Blum, and Shub, stream ciphers such as RC4 by Ron Rivest, hash functions such as SHA-1 and RIPEMD-160, and traditional generators such as LFSRs (Linear Feedback Shift Registers) and their more recent counterpart FCSRs (Feedback with Carry Shift Registers).
  • This section describes the various types of attacks that can be undertaken to break an authentication cryptosystem such as the authentication chip.
  • the attacks are grouped into physical and logical attacks.
  • Physical attacks describe methods for breaking a physical implementation of a cryptosystem (for example, breaking open a chip to retrieve the key), while logical attacks involve attacks on the cryptosystem that are implementation independent.
  • Logical types of attack work on the protocols or algorithms, and attempt to do one of three things:
  • a known-plaintext attack is one where the attacker can see the data flow between the System and the Authentication Chip.
  • the inputs and outputs are observed (not chosen by the attacker), and can be analyzed for weaknesses (such as birthday attacks or by a search for differentially interesting input/output pairs).
  • a known plaintext attack is a weaker type of attack than the chosen plaintext attack, since the attacker can only observe the data flow.
  • a known plaintext attack can be carried out by connecting a logic analyzer to the connection between the System and the Authentication Chip.
  • a chosen plaintext attack describes one where a cryptanalyst has the ability to send any chosen message to the cryptosystem, and observe the response. If the cryptanalyst knows the algorithm, there may be a relationship between inputs and outputs that can be exploited by feeding a specific output to the input of another function.
  • the cryptanalyst can logically pretend he/she is the System, and thus send any chosen bit-pattern streams to the Authentication Chip.
  • This type of attack is where an attacker attempts to simply “guess” the key. As an attack it is identical to the Brute force attack, where the odds of success depend on the length of the key.
  • a quantum computer (NMR, Optical, or Caged Atom) containing n qubits embedded in an appropriate algorithm must be built.
  • the quantum computer effectively exists in 2 n simultaneous coherent states.
  • the trick is to extract the right coherent state without causing any decoherence. To date this has been achieved with a 2 qubit system (which exists in 4 coherent states). It is thought possible to extend this to 6 qubits (with 64 simultaneous coherent states) within a few years.
  • attackers can gather valuable information from the results of a bad input. This can range from the error message text to the time taken for the error to be generated.
  • a simple example is that of a userid/password scheme. If the error message usually says “Bad userid”, then when an attacker gets a message saying “Bad password” instead, then they know that the userid is correct. If the message always says “Bad userid/password” then much less information is given to the attacker.
  • a more complex example is that of the recent published method of cracking encryption codes from secure web sites.
  • the attack involves sending particular messages to a server and observing the error message responses. The responses give enough information to learn the keys—even the lack of a response gives some information.
  • An example of algorithmic time can be seen with an algorithm that returns an error as soon as an erroneous bit is detected in the input message.
  • it may be a simple matter for the attacker to time the response and alter each bit one by one depending on the time taken for the error response, and thus obtain the key.
  • the time taken can be observed with far greater accuracy than over the Internet.
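The early-exit behavior described above can be sketched in a few lines. This is an illustration (not from the patent): a comparison that returns at the first mismatch leaks the error position through its timing, while a constant-time comparison does not.

```python
def early_exit_equal(a: bytes, b: bytes) -> bool:
    """Returns as soon as a mismatching byte is found; the response
    time therefore reveals the position of the first bad byte."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False  # early exit: time taken depends on the input
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    """Examines every byte regardless of mismatches, so the time
    taken is independent of where (or whether) the error lies."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y  # accumulate differences without branching
    return diff == 0
```

Both functions return the same answers; only the timing behavior differs, which is exactly the channel a Purposeful Error Attack exploits.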
  • the Birthday Attack comes into play.
  • the attacker searches for two messages that share the same hash value (analogous to any two people sharing a birthday), where only one message is acceptable to the person signing it, and the other is beneficial for the attacker.
  • Once the person has signed the original message, the attacker simply claims that the person signed the alternative message: mathematically there is no way to tell which message was the original, since they both hash to the same value.
  • the weakening of an n-bit key by the birthday attack is 2^(n/2).
  • a key length of 128 bits that is susceptible to the birthday attack has an effective length of only 64 bits.
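The 2^(n/2) weakening can be demonstrated at toy scale. The sketch below (illustrative only; SHA-1 truncated to 16 bits stands in for a full hash) finds two distinct messages with the same hash value after roughly 2^(16/2) = 256 attempts rather than 2^16.

```python
import hashlib

def truncated_hash(msg: bytes, nbytes: int = 2) -> bytes:
    """A toy 16-bit 'hash' made by truncating SHA-1, small enough
    that a birthday collision appears after roughly 2^8 tries."""
    return hashlib.sha1(msg).digest()[:nbytes]

def birthday_collision(nbytes: int = 2):
    """Generate messages until two of them share a truncated hash."""
    seen = {}
    i = 0
    while True:
        msg = b"message-%d" % i
        h = truncated_hash(msg, nbytes)
        if h in seen and seen[h] != msg:
            return seen[h], msg  # two distinct messages, same hash value
        seen[h] = msg
        i += 1
```

The same square-root scaling is why a 128-bit hash offers only 64 bits of collision resistance.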
  • the chaining variable is c bits
  • the hashing function behaves like a random mapping
  • the block length is b bits
  • the number of such b-bit blocks is approximately 2^b/2^c.
  • the challenge for finding a substitution block is that such blocks are a sparse subset of all possible blocks.
  • the number of 512 bit blocks is approximately 2^512/2^160, or 2^352.
  • the chance of finding a block by brute force search is about 1 in 2^160.
  • the clone manufacturer could incorporate a ROM in their chip that had a record of all of the responses from a genuine chip to the codes sent by the system. The larger the key, and the larger the response, the more space is required for such a lookup table.
  • Differential cryptanalysis describes an attack where pairs of input streams are generated with known differences, and the differences in the encoded streams are analyzed.
  • Existing differential attacks are heavily dependent on the structure of S boxes, as used in DES and other similar algorithms.
  • although HMAC-SHA1 has no S boxes, an attacker can still undertake a differential-like attack by performing statistical analysis of:
  • a man-in-the-middle can substitute part or all of a message. This is where a real Authentication Chip is plugged into a reusable clone chip within the consumable.
  • the clone chip intercepts all messages between the System and the Authentication Chip, and can perform a number of substitution attacks.
  • consider a message containing a header followed by content. An attacker may not be able to generate a valid header, but may be able to substitute their own content, especially if the valid response is something along the lines of “Yes, I received your message”. Even if the return message is “Yes, I received the following message . . . ”, the attacker may be able to substitute the original message before sending the acknowledgement back to the original sender.
  • Message Authentication Codes were developed to combat most message substitution attacks.
  • a clone Authentication Chip can accomplish a simple authentication bypass by simulating a loss of connection after the use of the consumable but before the authentication protocol has completed (or even started).
  • One infamous attack known as the “Kentucky Fried Chip” hack involved replacing a microcontroller chip for a satellite TV system. When a subscriber stopped paying the subscription fee, the system would send out a “disable” message. However the new microcontroller would simply detect this message and not pass it on to the consumer's satellite TV system.
  • Reading ROM describes an attack when keys are stored in ROM, while the remaining attacks assume that a secret key is stored in Flash memory.
  • If a key is stored in ROM it can be read directly. A ROM can thus be safely used to hold a public key (for use in asymmetric cryptography), but not to hold a private key. In symmetric cryptography, a ROM is completely insecure. Using a copyrighted text (such as a haiku) as the key is not sufficient, because we are assuming that the cloning of the chip is occurring in a country where intellectual property is not respected.
  • Reverse engineering of the chip is where an attacker opens the chip and analyzes the circuitry. Once the circuitry has been analyzed, the inner workings of the chip's algorithm can be recovered. Lucent Technologies has developed an active method known as TOBIC (Two photon OBIC, where OBIC stands for Optical Beam Induced Current) to image circuits. Developed primarily for static RAM analysis, the process involves removing any back materials, polishing the back surface to a mirror finish, and then focusing light on the surface. The excitation wavelength is specifically chosen not to induce a current in the IC. Kerckhoffs, in the nineteenth century, made a fundamental assumption about cryptanalysis: if the algorithm's inner workings are the sole secret of the scheme, the scheme is as good as broken. He stipulated that the secrecy must reside entirely in the key. As a result, the best way to protect against reverse engineering of the chip is to make the inner workings irrelevant.
  • any clone manufacturer has access to both the System and consumable designs. If the same channel is used for communication between the System and a trusted System Authentication Chip, and a non-trusted consumable Authentication Chip, it may be possible for the non-trusted chip to interrogate a trusted Authentication Chip in order to obtain the “correct answer”. If this is so, a clone manufacturer would not have to determine the key. They would only have to trick the System into using the responses from the System Authentication Chip.
  • the alternative method of usurping the authentication process follows the same method as the logical attack “Bypassing the Authentication Process”, involving simulated loss of contact with the System whenever authentication processes take place, simulating power-down etc.
  • This kind of attack is where the System itself is modified to accept clone consumables.
  • the attack may be a change of System ROM, a rewiring of the consumable, or, taken to the extreme case, a completely clone System.
  • This kind of attack requires each individual System to be modified, and would most likely require the owner's consent. There would usually have to be a clear advantage for the consumer to undertake such a modification, since it would typically void warranty and would most likely be costly.
  • An example of such a modification with a clear advantage to the consumer is a software patch to change fixed-region DVD players into region-free DVD players.
  • Chips are typically designed to operate properly within a certain clock speed range. Some attackers attempt to introduce faults in logic by running the chip at extremely high clock speeds, or introduce a clock glitch at a particular time for a particular duration. The idea is to create race conditions where the circuitry does not function properly. An example could be an AND gate that (because of race conditions) gates through Input 1 all the time instead of the AND of Input 1 and Input 2. If an attacker knows the internal structure of the chip, they can attempt to introduce race conditions at the correct moment in the algorithm execution, thereby revealing information about the key (or in the worst case, the key itself).
  • Single bits in a ROM can be overwritten using a laser cutter microscope, to either 1 or 0 depending on the sense of the logic.
  • With a given opcode/operand set, it may be a simple matter for an attacker using a laser cutter microscope to change a conditional jump to a non-conditional jump, or perhaps change the destination of a register transfer. If the target instruction is chosen carefully, it may result in the key being revealed.
  • EEPROM/Flash attacks are similar to ROM attacks except that the laser cutter microscope technique can be used to both set and reset individual bits. This gives much greater scope in terms of modification of algorithms.
  • an attacker may simply set a single bit by use of a laser cutter microscope. Although the attacker doesn't know the previous value, they know the new value. If the chip still works, the bit's original state must be the same as the new state. If the chip doesn't work any longer, the bit's original state must be the logical NOT of the current state. An attacker can perform this attack on each bit of the key and obtain the n-bit key using at most n chips (if the new bit matched the old bit, a new chip is not required for determining the next bit).
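The set-one-bit attack above can be modeled in a few lines. This simulation is purely illustrative (the "chip" is just a comparison against the true key; the function name is ours, not the patent's): setting each bit to 1 on a fresh chip and observing whether the chip still works recovers the whole n-bit key with at most n chips.

```python
def recover_key_by_bit_setting(true_key_bits):
    """Simulate the laser-cutter attack: force each bit to 1 on a
    fresh chip, then infer the original bit from whether the chip
    still works (i.e. whether the key was actually changed)."""
    recovered = []
    for i in range(len(true_key_bits)):
        damaged = list(true_key_bits)
        damaged[i] = 1  # laser-set this bit to 1 on a fresh chip
        still_works = (damaged == list(true_key_bits))
        # If the chip still works, the bit was already 1; else it was 0.
        recovered.append(1 if still_works else 0)
    return recovered
```

Note that whenever the forced bit matches the original, the chip is undamaged and can be reused for the next bit, which is why at most (not exactly) n chips are consumed.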
  • Most chips contain test circuitry specifically designed to check for manufacturing defects. This includes BIST (Built In Self Test) and scan paths. Quite often the scan paths and test circuitry include access and readout mechanisms for all the embedded latches. In some cases the test circuitry could potentially be used to give information about the contents of particular registers. Test circuitry is often disabled once the chip has passed all manufacturing tests, in some cases by blowing a specific connection within the chip. A determined attacker, however, can reconnect the test circuitry and hence enable it.
  • each of these stages must be examined in terms of ramifications for security should chips be stolen. For example, if information is programmed into the chip in stages, theft of a chip between stages may allow an attacker to have access to key information or reduced efforts for attack. Similarly, if a chip is stolen directly after manufacture but before programming, does it give an attacker any logical or physical advantage?
  • each chip should hold secure state information about the consumable being authenticated. It should be noted that a Consumable Lifetime Authentication Chip could be used in any situation requiring a Presence Only Authentication Chip.
  • the requirements for authentication, data storage integrity and manufacture should be considered separately. The following sections summarize requirements of each.
  • the access requirements of these two data types differ greatly.
  • the Authentication chip therefore requires a storage/access control mechanism that allows for the integrity requirements of each type.
  • Authentication data must remain confidential. It needs to be stored in the chip during a manufacturing/programming stage of the chip's life, but from then on must not be permitted to leave the chip. It must be resistant to being read from non-volatile memory.
  • the authentication scheme is responsible for ensuring the key cannot be obtained by deduction, and the manufacturing process is responsible for ensuring that the key cannot be obtained by physical means.
  • the size of the authentication data memory area must be large enough to hold the necessary keys and secret information as mandated by the authentication protocols.
  • Each Authentication chip needs to be able to also store 256 bits (32 bytes) of consumable state data.
  • Consumable state data can be divided into the following types. Depending on the application, there will be different numbers of each of these types of data items. A single data item is assumed to require a maximum of 32 bits.
  • Read Only data needs to be stored in the chip during a manufacturing/programming stage of the chip's life, but from then on should not be allowed to change.
  • Examples of Read Only data items are consumable batch numbers and serial numbers.
  • ReadWrite data is changeable state information, for example, the last time the particular consumable was used. ReadWrite data items can be read and written an unlimited number of times during the lifetime of the consumable. They can be used to store any state information about the consumable. The only requirement for this data is that it needs to be kept in non-volatile memory. Since an attacker can obtain access to a system (which can write to ReadWrite data), any attacker can potentially change data fields of this type. This data type should not be used for secret information, and must be considered insecure.
  • Decrement Only data is used to count down the availability of consumable resources.
  • a photocopier's toner cartridge may store the amount of toner remaining as a Decrement Only data item.
  • An ink cartridge for a color printer may store the amount of each ink color as a Decrement Only data item, requiring 3 (one for each of Cyan, Magenta, and Yellow), or even as many as 5 or 6 Decrement Only data items.
  • the requirement for this kind of data item is that once programmed with an initial value at the manufacturing/programming stage, it can only reduce in value. Once it reaches the minimum value, it cannot decrement any further.
  • the Decrement Only data item is only required by Consumable Lifetime Authentication.
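The Decrement Only semantics described above can be sketched as follows. This is an illustrative model only (the class name and interface are ours), not the chip's Flash-based access-control implementation: an initial value is programmed once, after which the stored value can only go down, stopping at the minimum.

```python
class DecrementOnly:
    """Toy model of a Decrement Only data item: programmed once with
    an initial value, it can only ever decrease, bottoming out at 0."""

    def __init__(self, initial: int):
        self._value = initial  # set at the manufacturing/programming stage

    @property
    def value(self) -> int:
        return self._value

    def decrement(self, amount: int = 1) -> None:
        if amount < 0:
            raise ValueError("cannot increase a Decrement Only item")
        # Once the minimum is reached, further decrements have no effect.
        self._value = max(0, self._value - amount)
```

A toner cartridge's remaining-toner counter, for example, would be initialized at fill time and only ever decremented as pages are printed.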
  • the Authentication chip ideally must have a low manufacturing cost in order to be included as the authentication mechanism for low cost consumables.
  • the Authentication chip should use a standard manufacturing process, such as Flash. This is necessary to:
  • the circuitry of the authentication part of the chip must be resistant to physical attack.
  • Physical attack comes in four main ways, although the form of the attack can vary:
  • the chip should be exportable from the U.S., so it should not be possible to use an Authentication chip as a secure encryption device. This is a low priority requirement, since there are many companies in other countries able to manufacture the Authentication chips. In any case, the export restrictions from the U.S. may change.
  • a single chip 10 (referred to as ChipA) is responsible for proving to a system 11 (referred to as System) that it is authentic.
  • System 11 is unsure of ChipA's authenticity.
  • System 11 undertakes a challenge-response protocol with ChipA 10 , and thus determines ChipA's authenticity.
  • the authenticity of the consumable 12 is directly based on the authenticity of the chip, i.e. if ChipA 10 is considered authentic, then the consumable 12 , in which chip 10 is placed, is considered authentic.
  • the data flow can be seen in FIG. 1 , and involves a challenge 13 issued from the system, and a response 14 returned by the chip 10 .
  • System 11 can be software, hardware or a combination of both. It is important to note that System 11 is considered insecure—it can be easily reverse engineered by an attacker, either by examining the ROM or by examining circuitry. System is not specially engineered to be secure in itself.
  • A single chip 20 (referred to as ChipA) is responsible for proving to a system 21 (referred to as System) that it is authentic. ChipA 20 is associated with the consumable 22 . As part of the authentication process, System 21 makes use of a trusted Authentication Chip 23 (referred to as ChipT).
  • System 21 can be software, hardware or a combination of both.
  • ChipT 23 must be a physical Authentication Chip.
  • In some embodiments ChipT 23 and ChipA 20 have the same internal structure, while in others ChipT 23 and ChipA 20 have different internal structures.
  • the data flow can be seen in FIG. 2 , and can be seen to involve a challenge 24 from system 21 to chipA 20 and a request 25 from system 21 to chipT 23 , and a response 26 from chipA 20 to system 21 and information 27 from chipT 23 to system 21 .
  • Protocol 1 requires 2 Authentication Chips, while Protocol 2 can be implemented using either 1 or 2 Authentication Chips.
  • Protocol 1 is a double chip protocol (two Authentication Chips are required). Each Authentication Chip contains the following values:
  • Each Authentication Chip contains the following logical functions:
  • the protocol is as follows:
  • the data flow can be seen in FIG. 3 .
  • the security of Protocol 1 lies in two places:
  • Protocol 1 has several advantages:
  • If Protocol 1 is implemented with F as an asymmetric encryption algorithm, there is no advantage over the symmetric case: the keys need to be longer and the encryption algorithm is more expensive in silicon. Protocol 1 must be implemented with 2 Authentication Chips in order to keep the key secure. This means that each System requires an Authentication Chip and each consumable requires an Authentication Chip.
  • System may contain a large amount of processing power.
  • integration of ChipT into System may be desirable.
  • Use of an asymmetrical encryption algorithm allows the ChipT portion of System to be insecure. Protocol 2, therefore, uses asymmetric cryptography. For this protocol, each chip contains the following values:
  • the public key K T is in ChipT 23 , while the secret key K A is in ChipA 20 .
  • Having K T in ChipT 23 has the advantage that ChipT can be implemented in software or hardware (with the proviso that the seed for R is different for each chip or system).
  • Protocol 2 therefore can be implemented as a Single Chip Protocol or as a Double Chip Protocol.
  • the protocol for authentication is as follows:
  • the data flow can be seen in FIG. 4 .
  • Protocol 2 has the following advantages:
  • Protocol 2 has a number of its own problems:
  • Protocol 1 is the protocol of choice for Presence Only Authentication.
  • Protocols 1 and 2 only check that ChipA is a real Authentication Chip. They do not check to see if the consumable itself is valid. The fundamental assumption for authentication is that if ChipA is valid, the consumable is valid. It is therefore possible for a clone manufacturer to insert a real Authentication Chip into a clone consumable. There are two cases to consider:
  • Protocols 1 and 2 can be useful in situations where it is not cost effective for a clone manufacturer to embed a real Authentication chip into the consumable. If the consumable cannot be recycled or refilled easily, it may be protection enough to use Protocols 1 or 2.
  • each clone consumable must include a valid Authentication Chip. The chips would have to be stolen en masse, or taken from old consumables. The quantity of these reclaimed chips (as well as the effort in reclaiming them) should not be enough to base a business on, so the added protection of secure data transfer (see Protocols 3 and 4) may not be useful.
  • Protocol 3 requires 2 Authentication Chips, while Protocol 4 can be implemented using either 1 or 2 Authentication Chips.
  • This protocol is a double chip protocol (two Authentication Chips are required).
  • each Authentication Chip contains the following values:
  • Each Authentication Chip contains the following logical functions:
  • the data flow for read authentication is shown in FIG. 5 .
  • the first thing to note about Protocol 3 is that F K [X] cannot be called directly. Instead F K [X] is called indirectly by Random, Test and Read:
  • each chip can only give a specific number of X, F K [X] pairs away in a certain time period.
  • the only specific timing requirement of Protocol 3 is that the return value of 0 (indicating a bad input) must be produced in the same amount of time regardless of where the error is in the input. Attackers can therefore not learn anything about what was bad about the input value. This is true for both RD and TST functions.
  • in Protocol 3, reading data from ChipA also requires authentication of ChipA.
  • the System can be sure that the contents of memory (M) are what ChipA claims them to be if F K2 [R|M] is returned correctly.
  • a correct F K2 [R|M] assures System that not only did an authentic ChipA send M, but also that M was not altered in between ChipA and System.
  • the Write function as defined does not authenticate the Write. To authenticate a write, the System must perform a Read after each Write.
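The read-after-write check can be sketched as follows. This is an illustrative model, not the patented circuit: HMAC-SHA1 stands in for F K2, and the `ChipA` class and `authenticated_write` helper are our assumptions. The Write itself is unauthenticated; the System gains confidence only from the authenticated Read that follows.

```python
import hashlib
import hmac
import os

class ChipA:
    """Minimal model of Protocol 3's consumable chip: WR stores M,
    RD returns M together with F_K2[R|M] for the caller's challenge R."""

    def __init__(self, k2: bytes, m: bytes):
        self._k2, self._m = k2, m

    def wr(self, m: bytes) -> None:
        self._m = m                       # Write is not authenticated...

    def rd(self, r: bytes):
        sig = hmac.new(self._k2, r + self._m, hashlib.sha1).digest()
        return self._m, sig               # ...but Read is

def authenticated_write(chip: ChipA, k2: bytes, new_m: bytes) -> bool:
    """Write new_m, then authenticate it by reading it back."""
    chip.wr(new_m)
    r = os.urandom(20)                    # fresh challenge for the read-back
    m, sig = chip.rd(r)
    expected = hmac.new(k2, r + m, hashlib.sha1).digest()
    # Accept the write only if the read-back matches and authenticates.
    return m == new_m and hmac.compare_digest(sig, expected)
```

A chip holding the wrong K 2 can store the written value but cannot produce a valid F K2 [R|M], so the read-back check fails.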
  • in Protocol 3, the only way to authenticate ChipA is to read the contents of ChipA's memory.
  • the security of this protocol depends on the underlying F K [X] scheme and the domain of R over the set of all Systems.
  • Since F K [X] can be any keyed one-way function, there is no advantage to implementing it as asymmetric encryption: the keys need to be longer and the encryption algorithm is more expensive in silicon. This leads to a second protocol for use with asymmetric algorithms, Protocol 4. Protocol 3 must be implemented with 2 Authentication Chips in order to keep the keys secure. This means that each System requires an Authentication Chip and each consumable requires an Authentication Chip.
  • System may contain a large amount of processing power.
  • integration of ChipT into System may be desirable.
  • Use of an asymmetrical encryption algorithm can allow the ChipT portion of System to be insecure. Protocol 4, therefore, uses asymmetric cryptography. For this protocol, each chip contains the following values:
  • the public key K T is in ChipT, while the secret key K A is in ChipA. Having K T in ChipT has the advantage that ChipT can be implemented in software or hardware (with the proviso that R is seeded with a different random number for each system).
  • the data flow for read authentication is shown in FIG. 6 .
  • Only a valid ChipA would know the value of R, since R is not passed into the Authenticate function (it is passed in as an encrypted value). R must be obtained by decrypting E[R], which can only be done using the secret key K A . Once obtained, R must be appended to M and then the result re-encoded. ChipT can then verify that the decoded form of E KA [R|M] is R|M, and thereby verify that ChipA is authentic.
  • Protocol 4 has a number of disadvantages:
  • the only specific timing requirement of Protocol 4 is that the return value of 0 (indicating a bad input) must be produced in the same amount of time regardless of where the error is in the input. Attackers can therefore not learn anything about what was bad about the input value. This is true for both RD and TST functions.
  • Protocols 1 and 3 are the two protocols of choice. However, Protocols 1 and 3 contain much of the same components:
  • Protocol 3 requires an additional key (K 2 ), as well as some minimal state machine changes:
  • Protocol 3 only requires minimal changes over Protocol 1. It is more secure and can be used in all places where Presence Only Authentication is required (Protocol 1). It is therefore the protocol of choice. Given that Protocols 1 and 3 both make use of keyed one-way functions, the choice of one-way function is examined in more detail here. The following table outlines the attributes of the applicable choices. The attributes are worded so that the attribute is seen as an advantage.
  • Each of the protocols described (1-4) requires a random number generator.
  • the generator must be “good” in the sense that the random numbers generated over the life of all Systems cannot be predicted. If the random numbers were the same for each System, an attacker could easily record the correct responses from a real Authentication Chip, and place the responses into a ROM lookup for a clone chip. With such an attack there is no need to obtain K 1 or K 2 . Therefore the random numbers from each System must be different enough to be unpredictable, or non-deterministic.
  • the initial value for R (the random seed) should be programmed with a physically generated random number gathered from a physically random phenomenon, one where there is no information about whether a particular bit will be 1 or 0.
  • the seed for R must NOT be generated with a computer-run random number generator. Otherwise the generator algorithm and seed may be compromised enabling an attacker to generate and therefore know the set of all R values in all Systems.
  • The simplest conceptual method of changing R is to increment it by 1. Since R is random to begin with, the values across differing systems are still likely to be random. However, given an initial R, all subsequent R values can be determined directly (there is no need to iterate 10,000 times: R will take on values from R 0 to R 0 +10000). An incrementing R is immune to the earlier attack on a constant R. Since R is always different, there is no way to construct a lookup table for the particular System without wasting as many real Authentication Chips as the clone chip will replace.
  • the Random number generator 70 within the Authentication Chip is therefore an LFSR 71 with 160 bits and four taps 72 , 73 , 74 and 75 , which feed an exclusive-OR gate 76 , which in turn feeds back 77 to bit 159 .
  • Tap selection of the 160 bits for a maximal-period LFSR (i.e. the LFSR will cycle through all 2^160-1 states; 0 is not a valid state) yields bits 5, 3, 2, and 0, as shown in FIG. 7.
  • the LFSR is sparse, in that not many bits are used for feedback (only 4 out of 160 bits are used). This is a problem for cryptographic applications, but not for this application of non-sequential number generation.
  • the 160-bit seed value for R can be any random number except 0, since an LFSR filled with 0s will produce a never-ending stream of 0s. Since the LFSR described is a maximal period LFSR, all 160 bits can be used directly as R. There is no need to construct a number sequentially from output bits of b0. After each successful call to TST, the random number (R) must be advanced by XORing bits 1, 2, 4, and 159, and shifting the result into the high order bit. The new R and corresponding F K1 [R] can be retrieved on the next call to Random.
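The register update can be sketched as below. This is an illustration, not the patented circuit: the tap bits 1, 2, 4, and 159 follow the text above (they correspond to the primitive polynomial x^160 + x^5 + x^3 + x^2 + 1), and the shift direction here is chosen so the sequence remains maximal, which may mirror the orientation drawn in FIG. 7.

```python
MASK160 = (1 << 160) - 1  # keep the register at exactly 160 bits

def advance_r(state: int) -> int:
    """One step of a 160-bit maximal-period Fibonacci LFSR with taps at
    bits 1, 2, 4 and 159: XOR the tap bits and shift the result into
    the register. A zero state is invalid and would stay at zero."""
    fb = ((state >> 159) ^ (state >> 4) ^ (state >> 2) ^ (state >> 1)) & 1
    return ((state << 1) | fb) & MASK160
```

Because the feedback includes the bit being shifted out, the update is invertible: any nonzero seed walks through distinct nonzero states until the full 2^160-1 period wraps around.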
  • Protocol 3 is the authentication scheme used by the Authentication Chip. As such, it should be resistant to defeat by logical means. While the effect of various types of attacks on Protocol 3 have been mentioned in discussion, this section details each type of attack in turn with reference to Protocol 3.
  • a Brute Force attack is guaranteed to break Protocol 3.
  • the length of the key means that the time for an attacker to perform a brute force attack is too long to be worth the effort.
  • An attacker only needs to break K 2 to build a clone Authentication Chip.
  • K 1 is merely present to strengthen K 2 against other forms of attack.
  • a Brute Force Attack on K 2 must therefore break a 160-bit key.
  • An attack against K 2 requires a maximum of 2^160 attempts, with a 50% chance of finding the key after only 2^159 attempts. Assuming an array of a trillion processors, each running one million tests per second, 2^159 (7.3×10^47) tests takes about 2.3×10^22 years, which is far longer than the lifetime of the universe. There are only 100 million personal computers in the world.
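The brute-force estimate above can be checked directly; the processor-count and test-rate figures are the ones assumed in the text.

```python
# Worked check of the brute-force estimate:
# 10^12 processors, each running 10^6 tests per second.
tests_needed = 2 ** 159                    # 50% chance on a 160-bit key
tests_per_second = 10 ** 12 * 10 ** 6      # one trillion CPUs, 1M tests/s each
seconds = tests_needed / tests_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"about {years:.1e} years")          # about 2.3e+22 years
```

Even granting an attacker this absurdly large array of hardware, the search time exceeds the age of the universe by many orders of magnitude.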
  • K 2 is open to a partial form of the Adaptive Chosen Plaintext attack, which is certainly a stronger form of attack than a simple Chosen Plaintext attack.
  • a chosen plaintext attack is not possible against K 1 , since there is no way for a caller to modify R, which is used as input to the RND function (the only function to provide the result of hashing with K 1 ). Clearing R also has the effect of clearing the keys, so is not useful, and the SSI command calls CLR before storing the new R-value.
  • the HMAC construct provides security against all forms of chosen plaintext attacks. This is primarily because the HMAC construct has 2 secret input variables (the result of the original hash, and the secret key). Thus finding collisions in the hash function itself when the input variable is secret is even harder than finding collisions in the plain hash function. This is because the former requires direct access to SHA-1 (not permitted in Protocol 3) in order to generate pairs of input/output from SHA-1.
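The two-secret structure referred to above follows the standard HMAC construction (RFC 2104). The sketch below makes the inner and outer hashes explicit; it is a didactic re-derivation, not the chip's circuitry, and can be checked against Python's `hmac` module.

```python
import hashlib
import hmac

def hmac_sha1_manual(key: bytes, msg: bytes) -> bytes:
    """HMAC's two-pass structure: H((K^opad) || H((K^ipad) || M)).
    An attacker sees neither the key nor the inner hash result, so
    input/output pairs for the underlying SHA-1 are never exposed."""
    block = 64                                  # SHA-1 block size in bytes
    if len(key) > block:
        key = hashlib.sha1(key).digest()        # long keys are hashed first
    key = key.ljust(block, b"\x00")             # then zero-padded to a block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashlib.sha1(ipad + msg).digest()   # first secret-keyed hash
    return hashlib.sha1(opad + inner).digest()  # rehash with second secret
```

Because the attacker never observes `inner`, collecting chosen-plaintext pairs for SHA-1 itself is impossible, which is the property the text relies on.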
  • Such attacks on F K2 [R|M] are not attacks against the SHA-1 hash function itself, and reduce the attack to a Differential Cryptanalysis attack, examining statistical differences between collected data.
  • Protocol 3 is resistant to the Adaptive Chosen Plaintext attacks.
  • An attacker can only launch a Purposeful Error Attack on the TST and RD functions, since these are the only functions that validate input against the keys. With both the TST and RD functions, a 0 value is produced if an error is found in the input; no further information is given. In addition, the time taken to produce the 0 result is independent of the input, giving the attacker no information about which bit(s) were wrong. A Purposeful Error Attack is therefore fruitless.
  • Protocol 3 uses hashing as a form of digital signature.
  • the System sends a number that must be incorporated into the response from a valid Authentication Chip. Since the Authentication Chip must respond with H[R|M], the clone chip must therefore attempt to find a new value R 2 such that the hash of R 2 and a chosen M 2 yields the same hash value as H[R|M].
  • the System Authentication Chip does not reveal the correct hash value (the TST function only returns 1 or 0 depending on whether the hash value is correct). Therefore the only way of finding out the correct hash value (in order to find a collision) is to interrogate a real Authentication Chip.
  • to find the correct value means to update M, and since the decrement-only parts of M are one-way, and the read-only parts of M cannot be changed, a clone consumable would have to update a real consumable before attempting to find a collision.
  • the alternative is a Brute Force attack search on the TST function to find a success (requiring each clone consumable to have access to a System consumable).
  • such a Brute Force search takes longer than the lifetime of the universe, in this case per authentication. Since a timely gathering of a hash value implies that a real consumable must be decremented, there is no point for a clone consumable to launch this kind of attack.
  • the random number seed in each System is 160 bits.
  • the worst case situation for an Authentication Chip is that no state data is changed. Consequently there is a constant value returned as M.
  • a clone chip must still return F K2 [R|M].
  • a sparse lookup table is only feasible if the messages sent to the Authentication Chip are somehow predictable, rather than effectively random.
  • the random number R is seeded with an unknown random number, gathered from a naturally random event.
  • the general sparse lookup table is therefore not a possible attack. However, it is possible for a clone manufacturer to know what the range of R is for a given System.
  • the clone Authentication Chip reports a full consumable, and then allows a single use before simulating loss of connection and insertion of a new full consumable.
  • the clone consumable would therefore need to contain responses for authentication of a full consumable and authentication of a partially used consumable.
  • the worst case ROM contains entries for full and partially used consumables for R over the lifetime of System.
  • a valid Authentication Chip must be used to generate the information, and be partially used in the process. If a given System only produces about n R-values, the sparse lookup-ROM required is 10n bytes multiplied by the number of different values for M. The time taken to build the ROM depends on the amount of time enforced between calls to RD.
  • the clone manufacturer must rely on the consumer returning for a refill, since the cost of building the ROM in the first place consumes a single consumable. The clone manufacturer's business in such a situation is consequently in the refills. The time and cost then, depends on the size of R and the number of different values for M that must be incorporated in the lookup.
  • a custom clone consumable ROM must be built to match each and every System, and a different valid Authentication Chip must be used for each System (in order to provide the full and partially used data). The use of an Authentication Chip in a System must therefore be examined to determine whether or not this kind of attack is worthwhile for a clone manufacturer.
  • this attack is possible as a per-System attack, and a decision must be made about the chance of this occurring for a given System/Consumable combination.
  • the chance will depend on the cost of the consumable and Authentication Chips, the longevity of the consumable, the profit margin on the consumable, the time taken to generate the ROM, the size of the resultant ROM, and whether customers will come back to the clone manufacturer for refills that use the same clone chip etc.
  • an attack on Protocol 3 can be via Known Plaintext, or from a Partially Adaptive Chosen Plaintext attack. Obviously the latter, being chosen, will be more useful to an attacker.
  • Hashing algorithms in general are designed to be resistant to differential analysis. SHA-1 in particular has been specifically strengthened, especially by the 80-word expansion, so that minimal differences in inputs will still produce outputs that vary in a larger number of bit positions (compared to 128-bit hash functions).
  • the information collected is not a direct SHA-1 input/output set, due to the nature of the HMAC algorithm.
  • the HMAC algorithm hashes a known value with an unknown value (the key), and the result of this hash is then rehashed with a separate unknown value. Since the attacker does not know the secret value, nor the result of the first hash, the inputs and outputs from SHA-1 are not known, making any differential attack extremely difficult. The following is a more detailed discussion of minimally different inputs and outputs from the Authentication Chip.
  • K 1 With K 1 , the attacker needs to statistically examine minimally different X, F K1 [X] pairs. However the attacker cannot choose any X value and obtain a related F K1 [X] value. Since X, F K1 [X] pairs can only be generated by calling the RND function on a System Authentication Chip, the attacker must call RND multiple times, recording each observed pair in a table. A search must then be made through the observed values for enough minimally different X values to undertake a statistical analysis of the F K1 [X] values.
  • K 2 With K 2 , the attacker needs to statistically examine minimally different X, F K2 [X] pairs.
  • the first way of limiting an attacker's choice is to limit Y, since RD requires an input of the format Y, F K1 [Y]. Although a valid pair can be readily obtained from the RND function, it is a pair of RND's choosing. An attacker can only provide their own Y if they have obtained the appropriate pair from RND, or if they know K 1 . Obtaining the appropriate pair from RND requires a Brute Force search. Knowing K 1 is only logically possible by performing cryptanalysis on pairs obtained from the RND function—effectively a known text attack. Although RND can only be called so many times per second, K 1 is common across System chips. Therefore known pairs can be generated in parallel.
  • the second way to limit an attacker's choice is to limit M, or at least the attacker's ability to choose M.
  • the limiting of M is done by making some parts of M Read Only, yet different for each Authentication Chip, and other parts of M Decrement Only.
  • the Read Only parts of M should ideally be different for each Authentication Chip, so could be information such as serial numbers, batch numbers, or random numbers.
  • the Decrement Only parts of M mean that for an attacker to try a different M, they can only decrement those parts of M so many times—after the Decrement Only parts of M have been reduced to 0 those parts cannot be changed again.
  • In order for this kind of attack to be carried out, a clone consumable must contain a real Authentication Chip, but one that is effectively reusable since it never gets decremented.
  • the clone Authentication Chip would intercept messages, and substitute its own. However this attack does not give success to the attacker.
  • a clone Authentication Chip may choose not to pass on a WR command to the real Authentication Chip. However the subsequent RD command must return the correct response (as if the WR had succeeded).
  • the hash value must be known for the specific R and M. As described in the Birthday Attack section, an attacker can only determine the hash value by actually updating M in a real Chip, which the attacker does not want to do. Even changing the R sent by System does not help since the System Authentication Chip must match the R during a subsequent TST. A Message substitution attack would therefore be unsuccessful. This is only true if System updates the amount of consumable remaining before it is used.
  • Protocol 3 requires the System to update the consumable state data before the consumable is used, and follow every write by a read (to authenticate the write). Thus each use of the consumable requires an authentication. If the System adheres to these two simple rules, a clone manufacturer will have to simulate authentication via a method above (such as sparse ROM lookup).
  • If a consumable has been used up, then its Authentication Chip will have had the appropriate state-data values decremented to 0. The chip therefore cannot be used in another consumable. Note that this only holds true for Authentication Chips that hold Decrement-Only data items. If there is no state data decremented with each usage, there is nothing stopping the reuse of the chip. This is the basic difference between Presence-Only Authentication and Consumable Lifetime Authentication. Protocol 3 allows both. The bottom line is that if a consumable has Decrement Only data items that are used by the System, the Authentication Chip cannot be reused without being completely reprogrammed by a valid Programming Station that has knowledge of the secret key.
  • the manufacturer should therefore keep a backup of the key information in several parts, where a certain number of people must together combine their portions to reveal the full key information. This may be required in case the chip programming station needs to be reloaded. In any case, none of these attacks are against Protocol 3 itself, since no humans are involved in the authentication process. Instead, it is an attack against the programming stage of the chips.
  • the mechanism for authentication is the HMAC-SHA1 algorithm, acting on one of:
  • the HMAC algorithm is as follows:
  • HMAC[M] = H[(K ⊕ opad) | H[(K ⊕ ipad) | M]]
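The nested-hash structure above can be sketched directly in software. The following Python sketch (an illustration only, not the chip's hardware implementation) builds HMAC-SHA1 from a plain SHA-1 primitive and checks the result against Python's standard hmac module; the key and message values are placeholders:

```python
import hashlib
import hmac as hmac_stdlib

BLOCK_SIZE = 64  # SHA-1 block size in bytes (512 bits)

def hmac_sha1(key: bytes, message: bytes) -> bytes:
    """HMAC[M] = H[(K xor opad) | H[(K xor ipad) | M]], per the formula above."""
    if len(key) > BLOCK_SIZE:
        key = hashlib.sha1(key).digest()
    key = key.ljust(BLOCK_SIZE, b"\x00")           # pad key to the block size
    ipad = bytes(b ^ 0x36 for b in key)            # inner pad: K xor 0x3636...
    opad = bytes(b ^ 0x5C for b in key)            # outer pad: K xor 0x5C5C...
    inner = hashlib.sha1(ipad + message).digest()  # H[(K xor ipad) | M]
    return hashlib.sha1(opad + inner).digest()     # H[(K xor opad) | inner]

# A 160-bit key (such as K1) and a 160-bit message (such as R), placeholders only.
k1 = bytes(20)
r = b"\x01" * 20
assert hmac_sha1(k1, r) == hmac_stdlib.new(k1, r, hashlib.sha1).digest()
```

The agreement with the standard library confirms the sketch implements the same two-pass construction.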
  • the SHA-1 hashing algorithm is summarized here.
  • Non-optimized SHA-1 requires a total of 2912 bits of data storage:
  • the hashing algorithm consists of firstly padding the input message to be a multiple of 512 bits and initializing the chaining variables H 1-5 with h 1-5 .
  • the padded message is then processed in 512-bit chunks, with the output hash value being the final 160-bit value given by the concatenation of the chaining variables: H 1 |H 2 |H 3 |H 4 |H 5 .
  • the steps of the SHA-1 algorithm are now examined in greater detail.
  • the first step of SHA-1 is to pad the input message to be a multiple of 512 bits as follows and to initialize the chaining variables.
  • Steps to follow to preprocess the input message:
    Pad the input message:
      Append a 1 bit to the message.
      Append 0 bits such that the length of the padded message is 64 bits short of a multiple of 512 bits.
      Append a 64-bit value containing the length in bits of the original input message, stored most significant bit through to least significant bit.
    Initialize the chaining variables: H 1 ← h 1 , H 2 ← h 2 , H 3 ← h 3 , H 4 ← h 4 , H 5 ← h 5 .
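The padding steps above can be sketched as follows (Python, illustrative only; byte-aligned messages are assumed, so the appended 1 bit becomes the byte 0x80):

```python
def sha1_pad(message: bytes) -> bytes:
    """Pad per SHA-1 Step 1: append a 1 bit, then 0 bits until the length is
    64 bits short of a multiple of 512 bits, then the 64-bit message length."""
    bit_len = len(message) * 8
    padded = message + b"\x80"              # the 1 bit (followed by seven 0 bits)
    while len(padded) % 64 != 56:           # 56 bytes = 448 bits = 512 - 64
        padded += b"\x00"
    padded += bit_len.to_bytes(8, "big")    # length, MSB through to LSB
    return padded

# A 160-bit input (such as R) pads out to exactly one 512-bit block.
assert len(sha1_pad(bytes(20))) == 64
```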
  • the padded input message can now be processed.
  • Each 512-bit block is in the form of 16 ⁇ 32-bit words, referred to as InputWord 0-15 .
  • the output hash value is the final 160-bit value given by: H 1 |H 2 |H 3 |H 4 |H 5 .
  • the SHA-1 Step 2 procedure is not optimized for hardware.
  • the 80 temporary 32-bit registers use up valuable silicon on a hardware implementation.
  • This section describes an optimization to the SHA-1 algorithm that only uses 16 temporary registers.
  • the reduction in silicon is from 2560 bits down to 512 bits, a saving of over 2000 bits. It may not be important in some applications, but in the Authentication Chip storage space must be reduced where possible.
  • the optimization is based on the fact that although the original 16-word message block is expanded into an 80-word message block, the 80 words are not updated during the algorithm.
  • the words rely on the previous 16 words only, and hence the expanded words can be calculated on-the-fly during processing, as long as we keep 16 words for the backward references.
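The on-the-fly expansion can be illustrated as follows (a Python sketch, not the hardware itself). It compares the straightforward 80-word expansion against a 16-word circular buffer, using the standard SHA-1 recurrence W t = ROTL1(W t-3 ⊕ W t-8 ⊕ W t-14 ⊕ W t-16):

```python
def rotl32(x: int, n: int) -> int:
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def expand_full(words16):
    """Straightforward 80-word message schedule (2560 bits of storage)."""
    w = list(words16)
    for t in range(16, 80):
        w.append(rotl32(w[t - 3] ^ w[t - 8] ^ w[t - 14] ^ w[t - 16], 1))
    return w

def expand_rolling(words16):
    """Same schedule using only a 16-word circular buffer (512 bits),
    overwriting each slot once its last backward reference has passed."""
    x = list(words16)               # the 16 temporary registers
    out = list(words16)
    for t in range(16, 80):
        s = t & 15                  # t mod 16: slot holding W[t-16]
        # W[t-3], W[t-8], W[t-14] live at slots (s+13), (s+8), (s+2) mod 16
        x[s] = rotl32(x[(s + 13) & 15] ^ x[(s + 8) & 15] ^ x[(s + 2) & 15] ^ x[s], 1)
        out.append(x[s])
    return out

assert expand_full(list(range(16))) == expand_rolling(list(range(16)))
```

The equality check shows the rolling buffer reproduces the full schedule, which is the basis of the 2560-to-512-bit storage saving claimed above.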
  • the incrementing of N 1 , N 2 , and N 3 during Rounds 0 and 1A is optional. A software implementation would not increment them, since it takes time, and at the end of the 16 times through the loop, all 4 counters will be at their original values. Designers of hardware may wish to increment all 4 counters together to save on control logic. Round 0 can be completely omitted if the caller loads the 512 bits of X 0-15 .
  • the HMAC-SHA1 unit only ever performs hashing on two types of inputs: on R using K 1 , and on R|M using K 2 .
  • the padding of messages in SHA-1 Step 1 (a 1 bit, a string of 0 bits, and the length of the message) is necessary to ensure that different messages will not look the same after padding. Since we only deal with 2 types of messages, our padding can be constant 0s.
  • the optimized version of the SHA-1 algorithm is used, where only 16 32-bit words are used for temporary storage. These 16 registers are loaded directly by the optimized HMAC-SHA1 hardware. The nine 32-bit constants h 1-5 and y 1-4 are still required, although the fact that they are constants is an advantage for hardware implementation.
  • Hardware optimized HMAC-SHA-1 requires a total of 1024 bits of data storage:
  • the original input message R is a constant length of 160 bits.
  • the pseudocode takes on the following steps:
  • Step  Description              Action
    1     Process K ⊕ ipad         X 0-4 ← K 1 ⊕ 0x363636 . . .
    2                              X 5-15 ← 0x363636 . . .
    3                              H 1-5 ← h 1-5
    4                              Process Block
    5     Process R                X 0-4 ← R
    6                              X 5-15 ← 0
    7                              Process Block
    8                              Buff160 1-5 ← H 1-5
    9     Process K ⊕ opad         X 0-4 ← K 1 ⊕ 0x5C5C5C . . .
    10                             X 5-15 ← 0x5C5C5C . . .
    11                             H 1-5 ← h 1-5
    12                             Process Block
    13    Process previous H[x]    X 0-4 ← Result
    14                             X 5-15 ← 0
    15                             Process Block
    16    Get results              Buff160 1-5 ← H 1-5
  • the original input message is a constant length of 416 (256+160) bits.
  • the pseudocode takes on the following steps:
  • Step  Description              Action
    1     Process K ⊕ ipad         X 0-4 ← K 2 ⊕ 0x363636 . . .
    2                              X 5-15 ← 0x363636 . . .
    3                              H 1-5 ← h 1-5
    4                              Process Block
    5     Process R
  • Each Authentication Chip contains some non-volatile memory in order to hold the variables required by Authentication Protocol 3.
  • the following non-volatile variables are defined:
  • Variable Name | Size (in bits) | Description
    M[0 . . . 15] | 256 | 16 words (each 16 bits) containing state data such as serial numbers, media remaining etc.
    K 1 | 160 | Key used to transform R during authentication.
    K 2 | 160 | Key used to transform M during authentication.
    R | 160 | Current random number.
    AccessMode[0 . . . 15] | 32 | The 16 sets of 2-bit AccessMode values for M[n].
    MinTicks | 32 | The minimum number of clock ticks between calls to key-based functions.
    SIWritten | 1 | If set, the secret key information (K 1 , K 2 , and R) has been written to the chip. If clear, the secret information has not been written yet.
    IsTrusted | 1 | If set, the RND and TST functions can be called, but RD and WR functions cannot be called. If clear, the RND and TST functions cannot be called, but RD and WR functions can be called.
    Total bits | 802
  • With Flash memory, it is not a simple matter to write a new value to replace the old.
  • the memory must be erased first, and then the appropriate bits set. This has an effect on the algorithms used to change Flash memory based variables. For example, Flash memory cannot easily be used as shift registers. To update a Flash memory variable by a general operation, it is necessary to follow these steps:
  • a RESET of the Authentication Chip has no effect on these non-volatile variables.
  • Variables M[0] through M[15] are used to hold consumable state data, such as serial numbers, batch numbers, and the amount of consumable remaining.
  • Although M may contain a number of different data types, they differ only in their write permissions.
  • Each data type can always be read. Once in client memory, the 256 bits can be interpreted in any way chosen by the client. The entire 256 bits of M are read at one time instead of in smaller amounts for reasons of security, as described in the chapter entitled Authentication.
  • the different write permissions are outlined in the following table:
  • Decrement Only values are typically 16-bit or 32-bit values, but can be any multiple of 16 bits.
  • the 16 sets of access mode bits for the 16 M[n] registers are gathered together in a single 32-bit AccessMode register.
  • the 32 bits of the AccessMode register correspond to M[n] with n as follows:
  • AccessMode[n] is examined for each M[n], and a decision made as to whether the new M[n] value will replace the old.
  • the AccessMode register is set using the Authentication Chip's SAM (Set Access Mode) command. Note that the Decrement Only comparison is unsigned, so any Decrement Only values that require negative ranges must be shifted into a positive range. For example, a consumable with a Decrement Only data item range of ⁇ 50 to 50 must have the range shifted to be 0 to 100. The System must then interpret the range 0 to 100 as being ⁇ 50 to 50. Note that most instances of Decrement Only ranges are N to 0, so there is no range shift required.
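The register layout and the range shift can be sketched as follows (Python, illustrative; the assumption that mode n occupies bits 2n and 2n+1 of the AccessMode register is mine, since the exact packing is not spelled out here):

```python
def get_access_mode(access_mode_reg: int, n: int) -> int:
    """Extract the 2-bit access mode for M[n] from the 32-bit AccessMode
    register, assuming mode n occupies bits 2n..2n+1 (layout assumed)."""
    return (access_mode_reg >> (2 * n)) & 0b11

assert get_access_mode(0b10 << 6, 3) == 0b10   # mode for M[3] sits at bits 6-7

# Decrement Only comparisons are unsigned, so a logical range of -50..50
# is stored shifted into 0..100; the System shifts it back on read.
SHIFT = 50
def store(value: int) -> int:
    return value + SHIFT        # -50..50 -> 0..100

def load(stored: int) -> int:
    return stored - SHIFT       # 0..100 -> -50..50

assert store(-50) == 0 and store(50) == 100
assert load(store(-7)) == -7
# The unsigned comparison still orders the shifted values correctly:
assert store(-10) < store(0) < store(10)
```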
  • K 1 is the 160-bit secret key used to transform R during the authentication protocol.
  • K 1 is programmed along with K 2 and R with the SSI (Set Secret Information) command. Since K 1 must be kept secret, clients cannot directly read K 1 .
  • the commands that make use of K 1 are RND and RD.
  • RND returns a pair R, F K1 [R] where R is a random number, while RD requires an X, F K1 [X] pair as input.
  • K 1 is used in the keyed one-way hash function HMAC-SHA1. As such it should be programmed with a physically generated random number, gathered from a physically random phenomenon.
  • K 1 must NOT be generated with a computer-run random number generator.
  • the security of the Authentication Chip depends on K 1 , K 2 and R being generated in a way that is not deterministic. For example, to set K 1 , a person can toss a fair coin 160 times, recording heads as 1, and tails as 0. K 1 is automatically cleared to 0 upon execution of a CLR command. It can only be programmed to a non-zero value by the SSI command.
  • K 2 is the 160-bit secret key used to transform R|M during the authentication protocol.
  • K 2 is programmed along with K 1 and R with the SSI (Set Secret Information) command. Since K 2 must be kept secret, clients cannot directly read K 2 .
  • the commands that make use of K 2 are RD and TST.
  • RD returns a pair M, F K2 [X|M], where X was passed in as one of the parameters to the RD function.
  • TST requires an M, F K2 [R|M] pair as input.
  • K 2 is used in the keyed one-way hash function HMAC-SHA1. As such it should be programmed with a physically generated random number, gathered from a physically random phenomenon.
  • K 2 must NOT be generated with a computer-run random number generator.
  • the security of the Authentication chips depends on K 1 , K 2 and R being generated in a way that is not deterministic. For example, to set K 2 , a person can toss a fair coin 160 times, recording heads as 1, and tails as 0. K 2 is automatically cleared to 0 upon execution of a CLR command. It can only be programmed to a non-zero value by the SSI command.
  • R is a 160-bit random number seed that is programmed along with K 1 and K 2 with the SSI (Set Secret Information) command. R does not have to be kept secret, since it is given freely to callers via the RND command. However R must be changed only by the Authentication Chip, and not set to any chosen value by a caller. R is used during the TST command to ensure that the R from the previous call to RND was used to generate the F K2 [R|M] value.
  • SSI Set Secret Information
  • IsTrusted is a 1-bit flag register that determines whether or not the Authentication Chip is a trusted chip (ChipT):
  • RND and TST functions cannot be called (but RD and WR functions can be called instead).
  • System never needs to call RND or TST on the consumable (since a clone chip would simply return 1 to a function such as TST, and a constant value for RND).
  • the IsTrusted bit has the added advantage of reducing the number of available R, F K1 [R] pairs obtainable by an attacker, yet still maintaining the integrity of the Authentication protocol. To obtain valid R, F K1 [R] pairs, an attacker requires a System Authentication Chip, which is more expensive and less readily available than the consumables. Both R and the IsTrusted bit are cleared to 0 by the CLR command. They are both written to by the issuing of the SSI command. The IsTrusted bit can only be set by storing a non-zero seed value in R via the SSI command (R must be non-zero to be a valid LFSR state, so this is quite reasonable). R is changed via a 160-bit maximal period LFSR with taps on bits 1 , 2 , 4 , and 159 , and is changed only by a successful call to TST (where 1 is returned).
  • Authentication Chips destined to be trusted Chips used in Systems should have their IsTrusted bit set during programming, and Authentication Chips used in Consumables (ChipA) should have their IsTrusted bit kept clear (by storing 0 in R via the SSI command during programming). There is no command to read or write the IsTrusted bit directly.
  • the security of the Authentication Chip does not only rely upon the randomness of K 1 and K 2 and the strength of the HMAC-SHA1 algorithm. To prevent an attacker from building a sparse lookup table, the security of the Authentication Chip also depends on the range of R over the lifetime of all Systems. What this means is that an attacker must not be able to deduce what values of R there are in produced and future Systems.
  • R should be programmed with a physically generated random number, gathered from a physically random phenomenon.
  • R must NOT be generated with a computer-run random number generator. The generation of R must not be deterministic. For example, to generate an R for use in a trusted System chip, a person can toss a fair coin 160 times, recording heads as 1, and tails as 0. The only non-valid initial value for a trusted R is 0 (or the IsTrusted bit will not be set).
  • the SIWritten (Secret Information Written) 1-bit register holds the status of the secret information stored within the Authentication Chip.
  • the secret information is K 1 , K 2 and R.
  • a client cannot directly access the SIWritten bit. Instead, it is cleared via the CLR command (which also clears K 1 , K 2 and R).
  • the SIWritten bit is set automatically.
  • Although R is strictly not secret, it must be written together with K 1 and K 2 to ensure that an attacker cannot generate their own random number seed in order to obtain chosen R, F K1 [R] pairs.
  • the SIWritten status bit is used by all functions that access K 1 , K 2 , or R. If the SIWritten bit is clear, then calls to RD, WR, RND, and TST are interpreted as calls to CLR.
  • the first mechanism is a clock-limiting hardware component that prevents the internal clock from operating at a speed greater than a particular maximum (e.g. 10 MHz).
  • the second mechanism is the 32-bit MinTicks register, which is used to specify the minimum number of clock ticks that must elapse between calls to key-based functions.
  • the MinTicks variable is cleared to 0 via the CLR command. Bits can then be set via the SMT (Set MinTicks) command.
  • the input parameter to SMT contains the bit pattern that represents which bits of MinTicks are to be set. The practical effect is that an attacker can only increase the value in MinTicks (since the SMT function only sets bits).
  • MinTicks depends on the operating clock speed and the notion of what constitutes a reasonable time between key-based function calls (application specific).
  • the duration of a single tick depends on the operating clock speed, which is the lesser of the input clock speed and the speed enforced by the Authentication Chip's clock-limiting hardware.
  • the Authentication Chip's clock-limiting hardware may be set at 10 MHz (it is not changeable), but the input clock is 1 MHz. In this case, the value of 1 tick is based on 1 MHz, not 10 MHz. If the input clock was 20 MHz instead of 1 MHz, the value of 1 tick is based on 10 MHz (since the clock speed is limited to 10 MHz).
  • the MinTicks value can then be set.
  • the value for MinTicks is the minimum number of ticks required to pass between calls to the key-based RD and TST functions. The value is a real-time interval divided by the duration of an operating tick. Suppose the input clock speed matches the maximum clock speed of 10 MHz. If we want a minimum of 1 second between calls to key-based functions, the value for MinTicks is set to 10,000,000. Consider an attacker attempting to collect X, F K1 [X] pairs by calling RND, RD and TST multiple times. If the MinTicks value is set such that the amount of time between calls to TST is 1 second, then each pair requires 1 second to generate.
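The tick arithmetic above can be sketched as follows (Python, illustrative only; the 10 MHz limit and the 1-second interval come from the examples in the text):

```python
def operating_clock_hz(input_clock_hz: int, limit_hz: int = 10_000_000) -> int:
    """The effective clock is the input clock capped by the clock-limiting hardware."""
    return min(input_clock_hz, limit_hz)

def min_ticks_for(seconds: float, input_clock_hz: int) -> int:
    """Ticks that must elapse between key-based calls for a real-time delay."""
    return int(seconds * operating_clock_hz(input_clock_hz))

assert operating_clock_hz(1_000_000) == 1_000_000     # 1 MHz input: ticks based on 1 MHz
assert operating_clock_hz(20_000_000) == 10_000_000   # 20 MHz input: limited to 10 MHz
assert min_ticks_for(1.0, 10_000_000) == 10_000_000   # 1 second at 10 MHz
```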
  • the MinTicks variable only slows down an attacker and causes the attack to cost more, since it does not stop an attacker from using multiple System chips in parallel.
  • MinTicks does make an attack on K 2 more difficult, since each consumable has a different M (part of M is random read-only data).
  • MinTicks causes the use of a single chip to be slowed down. If it takes a year just to gather the data needed to begin searching for values for a differential attack, this increases the cost of the attack and reduces the effective market time of a clone consumable.
  • the System communicates with the Authentication Chips via a simple operation command set. This section details the actual commands and parameters necessary for implementation of Protocol 3.
  • the Authentication Chip is defined here as communicating to System via a serial interface as a minimum implementation. It is a trivial matter to define an equivalent chip that operates over a wider interface (such as 8, 16 or 32 bits).
  • Each command is defined by a 3-bit opcode. The interpretation of the opcode can depend on the current value of the IsTrusted bit and the current value of the SIWritten bit. The following operations are defined:
  • the two sets of commands are mutually exclusive between trusted and non-trusted Authentication Chips, and the use of the same opcodes enforces this relationship.
  • Each of the commands is examined in detail in the subsequent sections. Note that some algorithms are specifically designed because Flash memory is assumed for the implementation of non-volatile variables.
  • the CLR (Clear) Command is designed to completely erase the contents of all Authentication Chip memory. This includes all keys and secret information, access mode bits, and state data. After the execution of the CLR command, an Authentication Chip will be in a programmable state, just as if it had been freshly manufactured. It can be reprogrammed with a new key and reused.
  • a CLR command consists of simply the CLR command opcode. Since the Authentication Chip is serial, this must be transferred one bit at a time. The bit order is LSB to MSB for each command component. A CLR command is therefore sent as bits 0 - 2 of the CLR opcode. A total of 3 bits are transferred. The CLR command can be called directly at any time.
  • SIWritten must be cleared first, to disable further calls to key access functions (such as RND, TST, RD and WR). If the AccessMode bits are cleared before SIWritten, an attacker could remove power at some point after they have been cleared, and manipulate M, thereby having a better chance of retrieving the secret information with a partial chosen text attack.
  • the CLR command is implemented with the following steps:
  • Step  Action
    1     Erase SIWritten
          Erase IsTrusted
          Erase K 1
          Erase K 2
          Erase R
          Erase M
    2     Erase AccessMode
          Erase MinTicks
  • the SSI (Set Secret Information) command is used to load the K 1 , K 2 and R variables, and to set SIWritten and IsTrusted flags for later calls to RND, TST, RD and WR commands.
  • An SSI command consists of the SSI command opcode followed by the secret information to be stored in the K 1 , K 2 and R registers. Since the Authentication Chip is serial, this must be transferred one bit at a time. The bit order is LSB to MSB for each command component.
  • An SSI command is therefore sent as: bits 0 - 2 of the SSI opcode, followed by bits 0 - 159 of the new value for K 1 , bits 0 - 159 of the new value for K 2 , and finally bits 0 - 159 of the seed value for R.
  • a total of 483 bits are transferred.
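The LSB-to-MSB serialization and the 483-bit total can be sketched as follows (Python, illustrative; the 3-bit opcode value shown is a placeholder, since the actual opcode encoding is not given in this passage):

```python
def to_bits_lsb_first(value: int, width: int):
    """Serialize a field one bit at a time, LSB to MSB, as the serial interface requires."""
    return [(value >> i) & 1 for i in range(width)]

SSI_OPCODE = 0b001  # placeholder value; the real 3-bit encoding is not specified here

def ssi_bitstream(k1: int, k2: int, r: int):
    """Bits 0-2 of the SSI opcode, then bits 0-159 of K1, K2 and the R seed."""
    bits = to_bits_lsb_first(SSI_OPCODE, 3)
    for field in (k1, k2, r):
        bits += to_bits_lsb_first(field, 160)
    return bits

assert len(ssi_bitstream(0, 0, 1)) == 483   # 3 + 3 x 160 bits
assert ssi_bitstream(0, 0, 1)[:3] == [1, 0, 0]  # opcode sent LSB first
```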
  • the K 1 , K 2 , R, SIWritten, and IsTrusted registers are all cleared to 0 with a CLR command. They can only be set using the SSI command.
  • the SSI command is implemented with the following steps:
  • Step  Action
    1     CLR
    2     K 1 ← Read 160 bits from client
    3     K 2 ← Read 160 bits from client
    4     R ← Read 160 bits from client
    5     IF (R ≠ 0) IsTrusted ← 1
    6     SIWritten ← 1
  • the RD (Read) command is used to securely read the entire 256 bits of state data (M) from a non-trusted Authentication Chip. Only a valid Authentication Chip will respond correctly to the RD request.
  • the output bits from the RD command can be fed as the input bits to the TST command on a trusted Authentication Chip for verification, with the first 256 bits (M) stored for later use if (as we hope) TST returns 1. Since the Authentication Chip is serial, the command and input parameters must be transferred one bit at a time. The bit order is LSB to MSB for each command component.
  • An RD command is therefore: bits 0 - 2 of the RD opcode, followed by bits 0 - 159 of X, and bits 0 - 159 of F K1 [X]. A total of 323 bits are transferred. X and F K1 [X] are obtained by calling the trusted Authentication Chip's RND command.
  • the 320 bits output by the trusted chip's RND command can therefore be fed directly into the non-trusted chip's RD command, with no need for these bits to be stored by System.
  • the RD command can only be used when the following conditions have been met:
  • calls to RD must wait for the MinTicksRemaining register to reach 0. Once it has done so, the register is reloaded with MinTicks to ensure that a minimum time will elapse between calls to RD. Once MinTicksRemaining has been reloaded with MinTicks, the RD command verifies that the input parameters are valid. This is accomplished by internally generating F K1 [X] for the input X, and then comparing the result against the input F K1 [X]. This generation and comparison must take the same amount of time regardless of whether the input parameters are correct or not. If the times are not the same, an attacker can gain information about which bits of F K1 [X] are incorrect.
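The constant-time comparison requirement can be illustrated in software (a Python sketch of the general technique; a production implementation in Python would simply use hmac.compare_digest, which exists for exactly this purpose):

```python
def constant_time_equal(a: bytes, b: bytes) -> bool:
    """Compare two equal-length hash values without exiting early, so the
    comparison time reveals nothing about which bits (if any) differ."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y       # accumulate all differences; never break out of the loop
    return diff == 0

assert constant_time_equal(b"\x01\x02", b"\x01\x02")
assert not constant_time_equal(b"\x01\x02", b"\x01\x03")
```

The same fold-all-differences idea is what makes the chip's generation-and-comparison take the same time whether the input F K1 [X] is right or wrong.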
  • the only way for the input parameters to be invalid is an erroneous System (passing the wrong bits), a case of the wrong consumable in the wrong System, a bad trusted chip (generating bad pairs), or an attack on the Authentication Chip.
  • a constant value of 0 is returned when the input parameters are wrong. The time taken for 0 to be returned must be the same for all bad inputs so that attackers can learn nothing about what was invalid.
  • the output values are calculated.
  • the 256 bit content of M are transferred in the following order: bits 0 - 15 of M[0], bits 0 - 15 of M[1], through to bits 0 - 15 of M[15].
  • F K2 [X|M] is calculated and output as bits 0 - 159 .
  • the R register is used to store the X value during the validation of the X, F K1 [X] pair. This is because RND and RD are mutually exclusive.
  • the RD command is implemented with the following steps:
  • the RND (Random) command is used by a client to obtain a valid R, F K1 [R] pair for use in a subsequent authentication via the RD and TST commands. Since there are no input parameters, an RND command is therefore simply bits 0 - 2 of the RND opcode.
  • the RND command can only be used when the following conditions have been met:
  • RND returns both R and F K1 [R] to the caller.
  • the 320-bit output of the RND command can be fed straight into the non-trusted chip's RD command as the input parameters. There is no need for the client to store them at all, since they are not required again.
  • the TST command will only succeed if the random number passed into the RD command was obtained first from the RND command. If a caller only calls RND multiple times, the same R, F K1 [R] pair will be returned each time. R will only advance to the next random number in the sequence after a successful call to TST. See TST for more information.
  • the RND command is implemented with the following steps:
  • Step  Action
    1     Output 160 bits of R to client
    2     Hash ← Calculate F K1 [R]
    3     Output 160 bits of Hash to client
  • the TST (Test) command is used to authenticate a read of M from a non-trusted Authentication Chip.
  • the TST (Test) command consists of the TST command opcode followed by the input parameters: M and F K2 [R|M].
  • a TST command is therefore: bits 0 - 2 of the TST opcode, followed by bits 0 - 255 of M, and bits 0 - 159 of F K2 [R|M]. A total of 419 bits are transferred.
  • the entire data does not even have to be stored by the client. Instead, the bits can be passed directly to the trusted Authentication Chip's TST command. Only the 256 bits of M should be kept from a RD command.
  • the TST command can only be used when the following conditions have been met:
  • TST causes the internal M value to be replaced by the input M value.
  • F K2 [R|M] is then calculated, and compared against the 160-bit input hash value. A single output bit is produced: 1 if they are the same, and 0 if they are different.
  • the use of the internal M value is to save space on chip, and is the reason why RD and TST are mutually exclusive commands. If the output bit is 1, R is updated to be the next random number in the sequence. This forces the caller to use a new random number each time RD and TST are called.
  • the resultant output bit is not output until the entire input string has been compared, so that the time to evaluate the comparison in the TST function is always the same. Thus no attacker can compare execution times or number of bits processed before an output is given.
  • the next random number is generated from R using a 160-bit maximal period LFSR (tap selections on bits 159 , 4 , 2 , and 1 ).
  • the initial 160-bit value for R is set up via the SSI command, and can be any random number except 0 (an LFSR filled with 0s will produce a never-ending stream of 0s).
  • R is transformed by XORing bits 1 , 2 , 4 , and 159 together, and shifting all 160 bits right 1 bit using the XOR result as the input bit to b 159 .
  • the new R will be returned on the next call to RND. Note that the time taken for 0 to be returned from TST must be the same for all bad inputs so that attackers can learn nothing about what was invalid about the input.
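One step of the described R update can be sketched as follows (Python; the sketch implements the text literally — XOR bits 1, 2, 4 and 159 together, shift all 160 bits right one place, and feed the XOR result in as bit 159):

```python
MASK_160 = (1 << 160) - 1

def advance_r(r: int) -> int:
    """Advance the 160-bit R register by one step of the described LFSR."""
    assert r != 0, "an all-zero state would produce a never-ending stream of 0s"
    feedback = ((r >> 1) ^ (r >> 2) ^ (r >> 4) ^ (r >> 159)) & 1
    return ((r >> 1) | (feedback << 159)) & MASK_160

# With b1, b2 and b4 set and b159 clear, the feedback bit is 1 ^ 1 ^ 1 ^ 0 = 1,
# so the shifted value gains a 1 in bit position 159.
assert advance_r(0b10110) == (1 << 159) | 0b1011
```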
  • the TST command is implemented with the following steps:
  • Step 6: OK ← (Hash = next 160 bits from client). Note that this operation must take constant time so an attacker cannot determine how much of their guess is correct. Step 7: IF (OK) Temp ← R; Erase R; Advance Temp via LFSR; R ← Temp. Step 8: Output 1 bit of OK to client.
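The constant-time comparison required by the TST steps can be sketched as follows. This is a Python model of the principle; in practice Python's standard `hmac.compare_digest` provides the same guarantee:

```python
def constant_time_equal(a: bytes, b: bytes) -> bool:
    """Compare two 160-bit (20-byte) hash values in constant time:
    every byte is examined regardless of where the first mismatch
    occurs, so timing reveals nothing about how much of a guess
    was correct."""
    assert len(a) == len(b) == 20
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y  # accumulate differences; no early exit
    return diff == 0
```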
  • a WR (Write) command is used to update the writeable parts of M containing Authentication Chip state data.
  • the WR command by itself is not secure. It must be followed by an authenticated read of M (via a RD command) to ensure that the change was made as specified.
  • the WR command is called by passing the WR command opcode followed by the new 256 bits of data to be written to M. Since the Authentication Chip is serial, the new value for M must be transferred one bit at a time. The bit order is LSB to MSB for each command component.
  • a WR command is therefore: bits 0 - 2 of the WR opcode, followed by bits 0 - 15 of M[0], bits 0 - 15 of M[1], through to bits 0 - 15 of M[15]. 259 bits are transferred in total.
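The serial bit ordering above (LSB to MSB for each command component, 259 bits in total) can be sketched as follows. The 3-bit opcode value used in the example is illustrative, since the actual opcode encodings are not given in this excerpt:

```python
def serialize_wr(opcode: int, m_words: list) -> list:
    """Serialize a WR command as a list of bits in wire order:
    bits 0-2 of the opcode (LSB first), then bits 0-15 of each of
    the sixteen 16-bit words M[0]..M[15], LSB first each."""
    assert len(m_words) == 16
    bits = [(opcode >> i) & 1 for i in range(3)]
    for word in m_words:
        bits.extend((word >> i) & 1 for i in range(16))
    return bits  # 3 + 16*16 = 259 bits
```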
  • the ability to write to a specific M[n] is governed by the corresponding Access Mode bits as stored in the AccessMode register.
  • the AccessMode bits can be set using the SAM command.
  • the WR command is implemented with the following steps:
  • Step 1: DecEncountered ← 0; EqEncountered ← 0; n ← 15. Step 2: Temp ← Read 16 bits from client. Step 3:
  • the SAM command is called by passing the SAM command opcode followed by a 32-bit value that is used to set bits in the AccessMode register. Since the Authentication Chip is serial, the data must be transferred one bit at a time. The bit order is LSB to MSB for each command component.
  • a SAM command is therefore: bits 0 - 2 of the SAM opcode, followed by bits 0 - 31 of bits to be set in AccessMode. 35 bits are transferred in total.
  • the AccessMode register is only cleared to 0 upon execution of a CLR command.
  • since the SAM command only sets bits, the effect is to allow the access mode bits corresponding to M[n] to progress from RW to either MSR, NMSR, or RO. It should be noted that an access mode of MSR can be changed to RO, but this would not help an attacker, since the authentication of M after a write to a doctored Authentication Chip would detect that the write was not successful and hence abort the operation.
  • the setting of bits corresponds to the way that Flash memory works best.
  • the only way to clear bits in the AccessMode register, for example to change a Decrement Only M[n] to be Read/Write, is to use the CLR command. The CLR command not only erases (clears) the AccessMode register, but also clears the keys and all of M.
  • AccessMode[n] bits corresponding to M[n] can only usefully be changed once between CLR commands.
  • the SAM command returns the new value of the AccessMode register (after the appropriate bits have been set due to the input parameter).
  • the SAM command is implemented with the following steps:
  • Step 1: Temp ← Read 32 bits from client. Step 2: SetBits(AccessMode, Temp). Step 3: Output 32 bits of AccessMode to client.
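The set-only semantics of SAM amount to an OR of the 32-bit input into the register, something like the following sketch. The encodings of the RW/MSR/NMSR/RO access modes are not specified in this excerpt, so only the bit-setting behaviour is modelled:

```python
def sam(access_mode: int, set_bits: int) -> int:
    """SetBits(AccessMode, Temp) per the steps above: the 32-bit
    input can only SET bits in the AccessMode register, never clear
    them. The updated register value is returned to the client."""
    return (access_mode | set_bits) & 0xFFFFFFFF
```

Because clearing is impossible without a full CLR, repeated SAM calls can only tighten access, mirroring the way Flash bits are set.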
  • the GIT (Get Is Trusted) command is used to read the current value of the IsTrusted bit on the Authentication Chip. If the bit returned is 1, the Authentication Chip is a trusted System Authentication Chip. If the bit returned is 0, the Authentication Chip is a consumable Authentication Chip.
  • a GIT command consists of simply the GIT command opcode. Since the Authentication Chip is serial, this must be transferred one bit at a time. The bit order is LSB to MSB for each command component. A GIT command is therefore sent as bits 0 - 2 of the GIT opcode. A total of 3 bits are transferred.
  • the GIT command is implemented with the following steps:
  • Step 1: Output IsTrusted bit to client.
  • MinTicks_new [32 bits]
  • the SMT (Set MinTicks) command is used to set bits in the MinTicks register and hence define the minimum number of ticks that must pass in between calls to TST and RD.
  • the SMT command is called by passing the SMT command opcode followed by a 32-bit value that is used to set bits in the MinTicks register. Since the Authentication Chip is serial, the data must be transferred one bit at a time. The bit order is LSB to MSB for each command component.
  • An SMT command is therefore: bits 0 - 2 of the SMT opcode, followed by bits 0 - 31 of bits to be set in MinTicks. 35 bits are transferred in total.
  • the MinTicks register is only cleared to 0 upon execution of a CLR command. A value of 0 indicates that no ticks need to pass between calls to key-based functions. The functions may therefore be called as frequently as the clock speed limiting hardware allows the chip to run.
  • since the SMT command only sets bits, the effect is to allow a client to set a value, and only increase the time delay if further calls are made. Setting a bit that is already set has no effect, and setting a bit that is clear only serves to slow the chip down further. The setting of bits corresponds to the way that Flash memory works best.
  • the only way to clear bits in the MinTicks register, for example to change a value of 10 ticks to a value of 4 ticks, is to use the CLR command. However the CLR command clears the MinTicks register to 0 as well as clearing all keys and M. It is therefore useless for an attacker. Thus the MinTicks register can only usefully be changed once between CLR commands.
  • the SMT command is implemented with the following steps:
  • Step 1: Temp ← Read 32 bits from client. Step 2: SetBits(MinTicks, Temp).
  • Authentication Chips must be programmed with logically secure information in a physically secure environment. Consequently the programming procedures cover both logical and physical security.
  • Logical security is the process of ensuring that K 1 , K 2 , R, and the random M[n] values are generated by a physically random process, and not by a computer. It is also the process of ensuring that the order in which parts of the chip are programmed is the most logically secure.
  • Physical security is the process of ensuring that the programming station is physically secure, so that K 1 and K 2 remain secret, both during the key generation stage and during the lifetime of the storage of the keys. In addition, the programming station must be resistant to physical attempts to obtain or destroy the keys.
  • the Authentication Chip has its own security mechanisms for ensuring that K 1 and K 2 are kept secret, but the Programming Station must also keep K 1 and K 2 safe.
  • After manufacture, an Authentication Chip must be programmed before it can be used. In all chips values for K 1 and K 2 must be established. If the chip is destined to be a System Authentication Chip, the initial value for R must be determined. If the chip is destined to be a consumable Authentication Chip, R must be set to 0, and initial values for M and AccessMode must be set up. The following stages are therefore identified:
  • the attached Authentication Chip can be reused. This is easily accomplished by reprogramming the chip starting at Stage 4 again. Each of the stages is examined in the subsequent sections.
  • Manufacture of Authentication Chips does not require any special security. There is no secret information programmed into the chips at the manufacturing stage. The algorithms and chip process are not special. Standard Flash processes are used. A theft of Authentication Chips between the chip manufacturer and programming station would only provide the clone manufacturer with blank chips. This merely compromises the sale of Authentication Chips, not anything authenticated by Authentication Chips. Since the programming station is the only mechanism with consumable and system product keys, a clone manufacturer would not be able to program the chips with the correct key. Clone manufacturers would be able to program the blank chips for their own systems and consumables, but it would be difficult to place these items on the market without detection. In addition, a single theft would be difficult to base a business around.
  • each type of consumable requires a different way of dividing M (the state data).
  • the method of allocating M[n] and AccessMode[n] will be the same:
  • a 16-bit key number is more than enough to uniquely identify each car-key for a given car.
  • the 256 bits of M could be divided up as follows:
  • the Photos Remaining value at M[2] allows a number of consumable types to be built for use with the same camera System. For example, a new consumable with 36 photos is trivial to program.
  • M[3] can be used to define Film Type. Old film types would be 0, and the new film types would be some new value. New Systems can take advantage of this. Original systems would detect a non-zero value at M[3] and realize incompatibility with new film types. New Systems would understand the value of M[3] and so react appropriately.
  • the new consumable and System needs to have the same key information as the old one. To make a clean break with a new System and its own special consumables, a new key set would be required.
  • K 1 and K 2 must therefore be determined. In most cases, K 1 and K 2 will be generated once for all time. All Systems and consumables that have to work together (both now and in the future) need to have the same K 1 and K 2 values. K 1 and K 2 must therefore be kept secret since the entire security mechanism for the System/Consumable combination is made void if the keys are compromised.
  • if the keys are compromised, the damage depends on the number of systems and consumables, and the ease with which they can be reprogrammed with new non-compromised keys. In the case of a photocopier with toner cartridges, the worst case is that a clone manufacturer could then manufacture their own Authentication Chips (or worse, buy them), program the chips with the known keys, and then insert them into their own consumables. In the case of a car with car-keys, each car has a different set of keys. This leads to two possible general scenarios. The first is that after the car and car-keys are programmed with the keys, K 1 and K 2 are deleted so no record of their values is kept, meaning that there is no way to compromise K 1 and K 2 .
  • the second scenario is that the car manufacturer keeps K 1 and K 2 , and new keys can be made for the car.
  • a compromise of K 1 and K 2 means that someone could make a car-key specifically for a particular car.
  • K 1 and K 2 should be generated by a physically random process, and not by a computer.
  • random bit generators based on natural sources of randomness are subject to influence by external factors and also to malfunction. It is imperative that such devices be tested periodically for statistical randomness.
  • Lavarand® system from SGI. This generator uses a digital camera to photograph six lava lamps every few minutes. Lava lamps contain chaotic turbulent systems. The resultant digital images are fed into an SHA-1 implementation that produces a 7-way hash, resulting in a 160-bit value from every 7th byte from the digitized image. These 7 sets of 160 bits total 140 bytes. The 140-byte value is fed into a BBS generator to position the start of the output bitstream. The output 160 bits from the BBS would be the key for the Authentication Chip.
  • An extreme example of a non-deterministic random process is someone flipping a coin 160 times for K 1 and 160 times for K 2 in a clean room. With each head or tail, a 1 or 0 is entered on a panel of a Key Programmer Device. The process must be undertaken with several observers (for verification) in silence (someone may have a hidden microphone). The point to be made is that secure data entry and storage is not as simple as it sounds. The physical security of the Key Programmer Device and accompanying Programming Station requires an entire document of its own. Once keys K 1 and K 2 have been determined, they must be kept for as long as Authentication Chips need to be made that use the key.
  • K 1 and K 2 are destroyed after a single System chip and a few consumable chips have been programmed.
  • K 1 and K 2 must be retained for as long as the toner-cartridges are being made for the photocopiers. The keys must be kept securely.
  • MinTicks depends on the operating clock speed of the Authentication Chip (System specific) and the notion of what constitutes a reasonable time between RD or TST function calls (application specific).
  • the duration of a single tick depends on the operating clock speed. This is the lower of the input clock speed and the maximum frequency allowed by the Authentication Chip's clock-limiting hardware.
  • the Authentication Chip's clock-limiting hardware may be set at 10 MHz (it is not changeable), but the input clock is 1 MHz. In this case, the value of 1 tick is based on 1 MHz, not 10 MHz. If the input clock was 20 MHz instead of 1 MHz, the value of 1 tick is based on 10 MHz (since the clock speed is limited to 10 MHz).
  • the MinTicks value can then be set.
  • the value for MinTicks is the minimum number of ticks required to pass between calls to the RD or TST key-based functions. Suppose the input clock speed matches the maximum clock speed of 10 MHz. If we want a minimum of 1 second between calls to TST, the value for MinTicks is set to 10,000,000. Even a value such as 2 seconds might be a completely reasonable value for a System such as a printer (one authentication per page, and one page produced every 2 or 3 seconds).
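The MinTicks arithmetic above can be sketched as follows. The 10 MHz cap matches the clock-limiting example given earlier; the function name is illustrative:

```python
def min_ticks_for_delay(seconds: float, input_clock_hz: float,
                        limit_hz: float = 10_000_000) -> int:
    """Compute a MinTicks value for a desired minimum delay between
    key-based calls. The effective tick rate is the input clock
    capped by the chip's clock-limiting hardware (10 MHz here)."""
    effective_hz = min(input_clock_hz, limit_hz)
    return int(seconds * effective_hz)
```

With a 10 MHz input clock, a 1-second delay gives MinTicks = 10,000,000, as in the text; a 20 MHz input clock is capped at 10 MHz before multiplying.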
  • Authentication Chips are in an unknown state after manufacture. Alternatively, they have already been used in one consumable, and must be reprogrammed for use in another. Each Authentication Chip must be cleared and programmed with new keys and new state data. Clearing and subsequent programming of Authentication Chips must take place in a secure Programming Station environment.
  • a seed value for R must be generated. It must be a random number derived from a physically random process, and must not be 0. The following tasks must be undertaken, in the following order, and in a secure programming environment:
  • the Authentication Chip is now ready for insertion into a System. It has been completely programmed. If the System Authentication Chips are stolen at this point, a clone manufacturer could use them to generate R, F K1 [R] pairs in order to launch a known text attack on K 1 , or to use for launching a partially chosen-text attack on K 2 . This is no different to the purchase of a number of Systems, each containing a trusted Authentication Chip. The security relies on the strength of the Authentication protocols and the randomness of K 1 and K 2 .
  • the programming is slightly different to that of the trusted System Authentication Chip.
  • the seed value for R must be 0. It must have additional programming for M and the AccessMode values.
  • M[n] must be programmed with 0, and the random M[n] must be programmed with random data. The following tasks must be undertaken, in the following order, and in a secure programming environment:
  • the non-trusted consumable chip is now ready to be programmed with the general state data. If the Authentication Chips are stolen at this point, an attacker could perform a limited chosen text attack. In the best situation, parts of M are Read Only (0 and random data), with the remainder of M completely chosen by an attacker (via the WR command). A number of RD calls by an attacker obtains F K2 [R|M] for the chosen M.
  • This stage is only required for consumable Authentication Chips, since M and AccessMode registers cannot be altered on System Authentication Chips.
  • M[n] have already been programmed in Stage 4.
  • the remaining state data values need to be programmed and the associated Access Mode values need to be set. Bear in mind that the speed of this stage will be limited by the value stored in the MinTicks register.
  • This stage is separated from Stage 4 on account of the differences either in physical location or in time between where/when Stage 4 is performed, and where/when Stage 5 is performed. Ideally, Stages 4 and 5 are performed at the same time in the same Programming Station. Stage 4 produces valid Authentication Chips, but does not load them with initial state values (other than 0).
  • Stage 5 can be run multiple times, each time setting a different state data value and Access Mode value, it is more likely to be run a single time, setting all the remaining state data values and setting all the remaining Access Mode values.
  • a production line can be set up where the batch number and serial number of the Authentication Chip are produced according to the physical consumable being produced. This is much harder to match if the state data is loaded at a physically different factory.
  • the Stage 5 process involves first checking to ensure the chip is a valid consumable chip, which includes a RD to gather the data from the Authentication Chip, followed by a WR of the initial data values, and then a SAM to permanently set the new data values.
  • the steps are outlined here:
  • the validation (Steps 1 to 7) does not have to occur if Stages 4 and 5 follow on from one another on the same Programming Station. But it should occur in all other situations where Stage 5 is run as a separate programming process from Stage 4. If these Authentication Chips are now stolen, they are already programmed for use in a particular consumable. An attacker could place the stolen chips into a clone consumable. Such a theft would limit the number of cloned products to the number of chips stolen. A single theft should not create a supply constant enough to provide clone manufacturers with a cost-effective business. The alternative use for the chips is to save the attacker from purchasing the same number of consumables, each with an Authentication Chip, in order to launch a partially chosen text attack or brute force attack. There is no special security breach of the keys if such an attack were to occur.
  • the circuitry of the Authentication Chip must be resistant to physical attack. A summary of manufacturing implementation guidelines is presented, followed by specification of the chip's physical defenses (ordered by attack).
  • the Authentication Chip should be implemented with a standard manufacturing process (such as Flash). This is necessary to:
  • the Authentication chip must have a low manufacturing cost in order to be included as the authentication mechanism for low cost consumables. It is therefore desirable to keep the chip size as low as reasonably possible.
  • Each Authentication Chip requires 802 bits of non-volatile memory.
  • the storage required for optimized HMAC-SHA1 is 1024 bits.
  • the remainder of the chip (state machine, processor, CPU or whatever is chosen to implement Protocol 3) must be kept to a minimum in order that the number of transistors is minimized and thus the cost per chip is minimized.
  • the circuit areas that process the secret key information or could reveal information about the key should also be minimized (see Non-Flashing CMOS below for special data paths).
  • the Authentication Chip circuitry is designed to operate within a specific clock speed range. Since the user directly supplies the clock signal, it is possible for an attacker to attempt to introduce race-conditions in the circuitry at specific times during processing. An example of this is where a high clock speed (higher than the circuitry is designed for) may prevent an XOR from working properly, and of the two inputs, the first may always be returned. These styles of transient fault attacks can be very efficient at recovering secret key information. The lesson to be learned from this is that the input clock signal cannot be trusted. Since the input clock signal cannot be trusted, it must be limited to operate up to a maximum frequency. This can be achieved a number of ways.
  • an edge detect unit 81 passes the edge on to a delay 82 , which in turn enables a gate 83 so that the clock signal is able to pass from the input port 84 to the output 85 .
  • FIG. 8 shows the Clock Filter
  • the delay should be set so that the maximum clock speed is a particular frequency (e.g. about 4 MHz). Note that this delay is not programmable—it is fixed.
  • the filtered clock signal would be further divided internally as required.
  • Each Authentication Chip should contain a noise generator that generates continuous circuit noise.
  • the noise will interfere with other electromagnetic emissions from the chip's regular activities and add noise to the I dd signal. Placement of the noise generator is not an issue on an Authentication Chip due to the length of the emission wavelengths.
  • the noise generator is used to generate electronic noise, multiple state changes each clock cycle, and as a source of pseudo-random bits for the Tamper Prevention and Detection circuitry.
  • a simple implementation of a noise generator is a 64-bit LFSR seeded with a non-zero number.
  • the clock used for the noise generator should be running at the maximum clock rate for the chip in order to generate as much noise as possible.
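A software model of the noise generator described above might look like the following. The tap positions are an illustrative choice corresponding to the primitive polynomial x^64 + x^4 + x^3 + x + 1; the source specifies only "a 64-bit LFSR seeded with a non-zero number":

```python
class NoiseGenerator:
    """64-bit Fibonacci LFSR noise source, per the bullets above.
    Tap positions are an assumption for illustration."""

    def __init__(self, seed: int):
        assert seed != 0, "a zero seed produces only zeros"
        self.state = seed & ((1 << 64) - 1)

    def next_bit(self) -> int:
        # XOR the tapped bits, shift right, feed the result back
        # in as the new high bit.
        fb = (self.state ^ (self.state >> 1) ^ (self.state >> 3)
              ^ (self.state >> 4)) & 1
        self.state = (self.state >> 1) | (fb << 63)
        return self.state & 1
```

In hardware this would be clocked at the chip's maximum rate, as the bullet above notes, so that noise is produced as fast as possible.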
  • a set of circuits is required to test for and prevent physical attacks on the Authentication Chip. However what is actually detected as an attack may not be an intentional physical attack. It is therefore important to distinguish between these two types of attacks in an Authentication Chip:
  • the two types of detection differ in what is performed as a result of the detection.
  • erasure of Flash memory key information is a sensible action.
  • the circuitry cannot be sure if an attack has occurred, there is still certainly something wrong. Action must be taken, but the action should not be the erasure of secret key information.
  • a suitable action to take in the second case is a chip RESET. If what was detected was an attack that has permanently damaged the chip, the same conditions will occur next time and the chip will RESET again. If, on the other hand, what was detected was part of the normal operating environment of the chip, a RESET will not harm the key.
  • a good example of an event that circuitry cannot have knowledge about is a power glitch.
  • the glitch may be an intentional attack, attempting to reveal information about the key. It may, however, be the result of a faulty connection, or simply the start of a power-down sequence. It is therefore best to only RESET the chip, and not erase the key. If the chip was powering down, nothing is lost. If the System is faulty, repeated RESETs will cause the consumer to get the System repaired. In both cases the consumable is still intact.
  • a good example of an event that circuitry can have knowledge about is the cutting of a data line within the chip. If this attack is somehow detected, it could only be a result of a faulty chip (manufacturing defect) or an attack. In either case, the erasure of the secret information is a sensible step to take.
  • each Authentication Chip should have 2 Tamper Detection Lines—one for definite attacks, and one for possible attacks. Connected to these Tamper Detection Lines would be a number of Tamper Detection test units, each testing for different forms of tampering. In addition, we want to ensure that the Tamper Detection Lines and Circuits themselves cannot also be tampered with.
  • At one end of the Tamper Detection Line 90 is a source of pseudo-random bits 91 (clocking at high speed compared to the general operating circuitry).
  • the Noise Generator circuit described above is an adequate source.
  • the generated bits pass through two different paths—one 92 carries the original data, and the other 93 carries the inverse of the data, it having passed through an inverter 94 .
  • the wires carrying these bits are in the layer above the general chip circuitry (for example, the memory, the key manipulation circuitry etc). The wires must also cover the random bit generator.
  • the bits are recombined at a number of places via an XOR gate 95 .
  • if the two recombined bits differ (as they should on an untampered chip), a 1 is output, and used by the particular unit (for example, each output bit from a memory read should be ANDed with this bit value).
  • the lines finally come together at the Flash memory Erase circuit, where a complete erasure is triggered by a 0 from the XOR. Attached to the line is a number of triggers, each detecting a physical attack on the chip. Each trigger has oversize nMOS transistors, such as 96 , attached to GND. The Tamper Detection Line physically goes through these nMOS transistors. If the test fails, the trigger causes the Tamper Detect Line to become 0.
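The recombination logic of the Tamper Detection Line can be modelled conceptually as follows. This is a Python sketch of the principle, not a circuit-accurate model; `tampered` stands in for a cut or grounded inverse path:

```python
def chip_ok(source_bits, tampered=False):
    """Recombine the two Tamper Detection paths at an XOR gate.
    On an intact chip the data path and the inverted path always
    differ, so the XOR is 1 for every pseudo-random bit. A line
    stuck at 0 (cut or pulled to GND by a failed trigger) makes
    the XOR output 0, which gates erasure/RESET."""
    ok = 1
    for bit in source_bits:
        path_a = bit                         # original data path
        path_b = 0 if tampered else 1 - bit  # inverted path, or stuck low
        ok &= path_a ^ path_b
    return ok
```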
  • FIG. 9 illustrates the basic principle of a Tamper Detection Line with its outputs connected to either the Erase or RESET circuitry.
  • the Tamper Detection Line must go through the drain 100 of an output transistor 96 for each test, as illustrated by FIG. 10 . It is not possible to break the Tamper Detect Line since this would stop the flow of 0s and 1s from the random source. The XOR tests would therefore fail. As the Tamper Detect Line physically passes through each test, it is not possible to eliminate any particular test without breaking the Tamper Detect Line.
  • FIG. 11 illustrates the taking of multiple XORs, indicated generally at 110 , from the Tamper Detect Line to be used in the different parts of the chip.
  • Each of these XORs 110 can be considered to be generating a ChipOK bit that can be used within each unit or sub-unit.
  • a sample usage would be to have an OK bit in each unit that is ANDed with a given ChipOK bit each cycle.
  • the OK bit is loaded with 1 on a RESET. If OK is 0, that unit will fail until the next RESET. If the Tamper Detect Line is functioning correctly, the chip will either RESET or erase all key information. If the RESET or erase circuitry has been destroyed, then this unit will not function, thus thwarting an attacker.
  • the destination of the RESET and Erase line and associated circuitry is very context sensitive. It needs to be protected in much the same way as the individual tamper tests. There is no point generating a RESET pulse if the attacker can simply cut the wire leading to the RESET circuitry. The actual implementation will depend very much on what is to be cleared at RESET, and how those items are cleared.
  • FIG. 12 shows how the Tamper Lines 120 cover the noise generator circuitry 121 of the chip.
  • the generator 121 and NOT gate 122 are on one level, while the Tamper Detect Lines 120 run on a level above the generator 121 .
  • Flash memory and RAM must be protected from an attacker who would attempt to modify (or set) a particular bit of program code or key information.
  • the mechanism used must conform to being used in the Tamper Detection Circuitry (described above).
  • the first part of the solution is to ensure that the Tamper Detection Line passes directly above each Flash or RAM bit. This ensures that an attacker cannot probe the contents of Flash or RAM.
  • a breach of the covering wire is a break in the Tamper Detection Line. The breach causes the Erase signal to be set, thus deleting any contents of the memory.
  • the high frequency noise on the Tamper Detection Line also obscures passive observation.
  • the second part of the solution for Flash is to use multi-level data storage, but only to use a subset of those multiple levels for valid bit representations.
  • a single floating gate holds more than one bit.
  • a 4-voltage-state transistor can represent two bits. Assuming a minimum and maximum voltage representing 00 and 11 respectively, the two middle voltages represent 01 and 10.
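The valid-subset idea can be illustrated as follows. The choice of the two extreme levels {0, 3} as the valid subset, each cell then holding a single bit, is an assumption for illustration; the source states only that a subset of the multiple levels is used for valid bit representations:

```python
def decode_cell(level: int) -> int:
    """Decode one 4-level Flash cell that uses only the extreme
    voltage levels as valid one-bit values. An attacker who nudges
    a cell into one of the two middle levels is detected."""
    if level == 0:
        return 0
    if level == 3:
        return 1
    raise ValueError("intermediate storage level - possible tampering")
```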
  • the second part of the solution for RAM is to use a parity bit.
  • the data part of the register can be checked against the parity bit (which will not match after an attack).
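The parity check on RAM registers can be sketched as follows. Even parity and a 16-bit register width are assumptions; the source specifies only that the data part is checked against a stored parity bit:

```python
def parity_ok(data: int, parity_bit: int, width: int = 16) -> bool:
    """Check a RAM register against its stored parity bit. An attack
    that flips a single data bit no longer matches the parity."""
    computed = bin(data & ((1 << width) - 1)).count("1") & 1
    return computed == parity_bit
```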
  • the bits coming from Flash and RAM can therefore be validated by a number of test units (one per bit) connected to the common Tamper Detection Line.
  • the Tamper Detection circuitry would be the first circuitry the data passes through (thus stopping an attacker from cutting the data lines).
  • Program code should be kept in multi-level Flash instead of ROM, since ROM is subject to being altered in a non-testable way.
  • a boot mechanism is therefore required to load the program code into Flash memory (Flash memory is in an indeterminate state after manufacture).
  • the boot circuitry must not be in ROM—a small state-machine would suffice. Otherwise the boot code could be modified in an undetectable way.
  • the boot circuitry must erase all Flash memory, check to ensure the erasure worked, and then load the program code. Flash memory must be erased before loading the program code. Otherwise an attacker could put the chip into the boot state, and then load program code that simply extracted the existing keys.
  • the state machine must also check to ensure that all Flash memory has been cleared (to ensure that an attacker has not cut the Erase line) before loading the new program code.
  • the loading of program code must be undertaken by the secure Programming Station before secret information (such as keys) can be loaded.
  • the normal situation for FET implementation, for the case of a CMOS Inverter 130 (which involves a pMOS transistor 131 combined with an nMOS transistor 132 ), is shown in FIG. 13 .
  • FIG. 14 is the voltage/current diagram for the CMOS inverter 130 .
  • the resultant power-ground short circuit causes a temporary increase in the current, and in fact accounts for the majority of current consumed by a CMOS device.
  • a small amount of infrared light is emitted during the short circuit, and can be viewed through the silicon substrate (silicon is transparent to infrared light).
  • a small amount of light is also emitted during the charging and discharging of the transistor gate capacitance and transmission line capacitance.
  • a non-flashing CMOS implementation 150 should therefore be used for all data paths that manipulate the key or a partially calculated value that is based on the key.
  • the use of two non-overlapping clocks φ 1 and φ 2 can provide a non-flashing mechanism.
  • φ 1 is connected to a second gate 151 of all nMOS transistors 152 .
  • φ 2 is connected to a second gate 153 of all pMOS transistors 154 .
  • the transition can only take place in combination with the clock. Since φ 1 and φ 2 are non-overlapping, the pMOS and nMOS transistors will not have a simultaneous intermediate resistance.
  • the setup is shown in FIG. 15 and the impedance diagram in FIG. 16 .
  • CMOS inverters can be positioned near critical non-Flashing CMOS components. These inverters should take their input signal from the Tamper Detection Line above. Since the Tamper Detection Line operates multiple times faster than the regular operating circuitry, the net effect will be a high rate of light-bursts next to each non-Flashing CMOS component. Since a bright light overwhelms observation of a nearby faint light, an observer will not be able to detect what switching operations are occurring in the chip proper. These regular CMOS inverters will also effectively increase the amount of circuit noise, reducing the SNR and obscuring useful EMI.
  • connections along which the key or secret data flows should be made in the polysilicon layers. Where necessary, they can be in metal 1, but must never be in the top metal layer (containing the Tamper Detection Lines).
  • Each Authentication Chip requires an OverUnderPower Detection Unit to prevent Power Supply Attacks.
  • An OverUnderPower Detection Unit detects power glitches and tests the power level against a Voltage Reference to ensure it is within a certain tolerance.
  • the Unit contains a single Voltage Reference and two comparators.
  • the OverUnderPower Detection Unit would be connected into the RESET Tamper Detection Line, thus causing a RESET when triggered.
  • a side effect of the OverUnderPower Detection Unit is that as the voltage drops during a power-down, a RESET is triggered, thus erasing any work registers.
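The behaviour of the OverUnderPower Detection Unit (one voltage reference, two comparators, RESET on a glitch) can be modelled as below. The names, reference voltage, and tolerance are assumptions for illustration; this is a sketch of the logic, not the circuit.

```python
# Hypothetical model of the OverUnderPower Detection Unit.
V_REF = 3.0        # nominal supply reference (volts) -- assumed value
TOLERANCE = 0.3    # allowed deviation -- assumed value

def power_ok(vdd, v_ref=V_REF, tol=TOLERANCE):
    """True if the supply is within tolerance of the voltage reference."""
    over = vdd > v_ref + tol     # first comparator: over-voltage
    under = vdd < v_ref - tol    # second comparator: under-voltage
    return not (over or under)

def sample_supply(vdd, state):
    """Drive the RESET Tamper Detection Line when a glitch is detected."""
    if not power_ok(vdd):
        state["reset"] = True                          # trigger RESET
        state["registers"] = [0] * len(state["registers"])  # erase work registers
    return state
```

Note how the same mechanism gives the power-down side effect described above: as the voltage drops below the lower threshold, the RESET clears the work registers.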
  • Test hardware on an Authentication Chip could very easily introduce vulnerabilities. As a result, the Authentication Chip should not contain any BIST or scan paths. The Authentication Chip must therefore be testable with external test vectors. This should be possible since the Authentication Chip is not complex.
  • Reverse engineering a chip is only useful when the security of authentication lies in the algorithm alone.
  • our Authentication Chips rely on a secret key, not on the secrecy of the algorithm.
  • Our authentication algorithm is, by contrast, public, and in any case, an attacker of a high volume consumable is assumed to have been able to obtain detailed plans of the internals of the chip.
  • reverse engineering the chip itself, as opposed to the data it stores, poses no threat.
  • the simplest method of modification is to replace the System's Authentication Chip with one that simply reports success for each call to TST. This can be thwarted by the System calling TST several times for each authentication, with the first few calls providing false values and expecting TST to fail; only the final call to TST would be expected to succeed. The number of false calls to TST could be determined by some part of the returned result from RD or from the system clock. Unfortunately, an attacker could simply rewire the System so that the clone System Authentication Chip can monitor the returned result from the consumable chip or the clock. The clone System Authentication Chip would then only return success when that monitored value is presented to its TST function.
  • Clone consumables could then return any value as the hash result for RD, as the clone System chip would declare that value valid. Multiple calls to TST therefore offer only limited protection; however, such a rewiring attack only works for the individual System that has been rewired, and not for all Systems.
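The false-call defence described above can be sketched as follows. The RD/TST interfaces, the shared key, and the use of HMAC-SHA1 as the response function are assumptions for illustration, not the chip's actual signatures; the point is that a clone TST that always reports success fails the deliberate false probes.

```python
# Sketch of the false-call defence with hypothetical RD/TST interfaces.
import hmac
import hashlib
import os

KEY = b"shared-secret"  # illustrative key known to both genuine chips

def consumable_rd(challenge):
    """Genuine consumable chip: HMAC of the challenge (stand-in for RD)."""
    return hmac.new(KEY, challenge, hashlib.sha1).digest()

def system_tst(challenge, response):
    """Genuine System chip: 1 if the response is valid, else 0 (stand-in for TST)."""
    expected = hmac.new(KEY, challenge, hashlib.sha1).digest()
    return 1 if hmac.compare_digest(expected, response) else 0

def authenticate(rd, tst, false_calls=3):
    challenge = os.urandom(8)
    # Probe with deliberately wrong responses; genuine TST must fail them.
    for _ in range(false_calls):
        if tst(challenge, os.urandom(20)) != 0:
            return False   # a "always succeed" clone is caught here
    # Only the final, genuine exchange is expected to succeed.
    return tst(challenge, rd(challenge)) == 1

assert authenticate(consumable_rd, system_tst)
assert not authenticate(consumable_rd, lambda c, r: 1)  # always-success clone
```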
  • a similar form of attack on a System is a replacement of the System ROM.
  • the ROM program code can be altered so that the Authentication never occurs. There is nothing that can be done about this, since the System remains in the hands of a consumer. Of course this would void any warranty, but the consumer may consider the alteration worthwhile if the clone consumable were extremely cheap and more readily available than the original item.
  • the System/consumable manufacturer must therefore determine how likely an attack of this nature is. Such a study must consider the pricing structure of Systems and Consumables, the frequency of System service, the advantage to the consumer of having a physical modification performed, and where consumers would go to get the modification performed.
  • the limit case of modifying a system is for a clone manufacturer to provide a completely clone System which takes clone consumables. This may be simple competition or violation of patents. Either way, it is beyond the scope of the Authentication Chip and depends on the technology or service being cloned.
  • In order to view the chip operation, the chip must be operating. However, the Tamper Prevention and Detection circuitry covers those sections of the chip that process or hold the key. It is not possible to view those sections through the Tamper Prevention lines. An attacker cannot simply slice the chip past the Tamper Prevention layer, for this will break the Tamper Detection Lines and cause an erasure of all keys at power-up. Simply destroying the erasure circuitry is not sufficient, since the multiple ChipOK bits (now all 0) feeding into multiple units within the Authentication Chip will cause the chip's regular operating circuitry to stop functioning.
  • the Noise Generator described above will cause circuit noise.
  • the noise will interfere with other electromagnetic emissions from the chip's regular activities and thus obscure any meaningful reading of internal data transfers.
  • the solution against this kind of attack is to decrease the SNR in the Idd signal. This is accomplished by increasing the amount of circuit noise and decreasing the amount of signal.
  • the Noise Generator circuit (which also acts as a defense against EMI attacks) will also cause enough state changes each cycle to obscure any meaningful information in the Idd signal.
  • the special Non-Flashing CMOS implementation of the key-carrying data paths of the chip prevents current from flowing when state changes occur. This has the benefit of reducing the amount of signal.
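The SNR argument above can be made concrete with back-of-envelope numbers. The figures below are purely illustrative: raising the circuit-noise power while shrinking the key-dependent signal power moves the SNR of the Idd trace from positive to negative decibels.

```python
# Illustrative SNR calculation; powers are arbitrary example values.
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels."""
    return 10 * math.log10(signal_power / noise_power)

baseline = snr_db(signal_power=1.0, noise_power=0.1)   # unprotected chip
# Noise Generator raises noise; non-flashing CMOS shrinks the signal.
hardened = snr_db(signal_power=0.1, noise_power=1.0)   # protected chip

assert hardened < baseline   # the countermeasures strictly reduce SNR
```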
  • the Clock Filter (described above) eliminates the possibility of clock glitch attacks.
  • the OverUnderPower Detection Unit eliminates the possibility of power supply attacks.
  • Authentication Chips store Program code, keys and secret information in Flash memory, and not in ROM. This attack is therefore not possible.
  • Authentication Chips store Program code, keys and secret information in Flash memory.
  • Flash memory is covered by two Tamper Prevention and Detection Lines. If either of these lines is broken (in the process of destroying a gate) the attack will be detected on power-up, and the chip will either RESET (continually) or erase the keys from Flash memory. However, even if the attacker is able to somehow access the bits of Flash and destroy or short out the gate holding a particular bit, this will force the bit to have no charge or a full charge. These are both invalid states for the Authentication Chip's usage of the multi-level Flash memory (only the two middle states are valid). When that data value is transferred from Flash, detection circuitry will cause the Erasure Tamper Detection Line to be triggered—thereby erasing the remainder of Flash memory and RESETing the chip. A Modify EEPROM/Flash Attack is therefore fruitless.
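A toy model of the multi-level Flash check described above: of the four cell levels, only the two middle levels are valid, so a destroyed or shorted gate (which forces no charge or full charge) trips the Erasure Tamper Detection Line. The mapping of the two middle levels to bit values is an assumption for illustration.

```python
# Toy model (not from the patent) of a four-level Flash cell where only
# the two middle charge levels are valid.
VALID = {1: 0, 2: 1}   # assumed mapping: middle levels -> stored bit

def read_cell(level, state):
    """Return the stored bit, or trip the Erasure Tamper Detection Line.

    A destroyed or shorted gate forces the cell to an extreme level
    (no charge = 0, full charge = 3), which the detection circuitry
    treats as tampering.
    """
    if level in VALID:
        return VALID[level]
    state["erase_triggered"] = True   # erase remaining Flash, RESET chip
    return None

state = {"erase_triggered": False}
bit = read_cell(2, state)             # a valid middle level reads normally
```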
  • Gate Destruction Attacks rely on the ability of an attacker to modify a single gate to cause the chip to reveal information during operation. However any circuitry that manipulates secret information is covered by one of the two Tamper Prevention and Detection lines. If either of these lines is broken (in the process of destroying a gate) the attack will be detected on power-up, and the chip will either RESET (continually) or erase the keys from Flash memory. To launch this kind of attack, an attacker must first reverse-engineer the chip to determine which gate(s) should be targeted. Once the location of the target gates has been determined, the attacker must break the covering Tamper Detection line, stop the Erasure of Flash memory, and somehow rewire the components that rely on the ChipOK lines.
  • An Overwrite Attack relies on being able to set individual bits of the key without knowing the previous value. It relies on probing the chip, as in the Conventional Probing Attack and destroying gates as in the Gate Destruction Attack. Both of these attacks (as explained in their respective sections), will not succeed due to the use of the Tamper Prevention and Detection Circuitry and ChipOK lines. However, even if the attacker is able to somehow access the bits of Flash and destroy or short out the gate holding a particular bit, this will force the bit to have no charge or a full charge. These are both invalid states for the Authentication Chip's usage of the multi-level Flash memory (only the two middle states are valid).
  • Any working registers or RAM within the Authentication Chip may be holding part of the authentication keys when power is removed.
  • the working registers and RAM would continue to hold the information for some time after the removal of power. If the chip were sliced so that the gates of the registers/RAM were exposed, without discharging them, then the data could probably be viewed directly using an STM.
  • the first defense can be found above, in the description of defense against Power Glitch Attacks: when power is removed, all registers and RAM are cleared, just as the RESET condition causes a clearing of memory. This attack is therefore even less likely to succeed than an attempted reading of the Flash memory.
  • RAM charges (by nature) are more easily lost than Flash memory. The slicing of the chip to reveal the RAM will certainly cause the charges to be lost (if they haven't been lost simply due to the memory not being refreshed and the time taken to perform the slicing). This attack is therefore fruitless.
  • Chips can be stolen at any of these stages:
  • a theft in between the chip manufacturer and programming station would only provide the clone manufacturer with blank chips. This merely compromises the sale of Authentication chips, not anything authenticated by the Authentication chips. Since the programming station is the only mechanism with consumable and system product keys, a clone manufacturer would not be able to program the chips with the correct key. Clone manufacturers would be able to program the blank chips for their own Systems and Consumables, but it would be difficult to place these items on the market without detection.
  • the second form of theft can only happen in a situation where an Authentication Chip passes through two or more distinct programming phases. This is possible, but unlikely. In any case, the worst situation is where no state data has been programmed, so all of M is read/write.
  • the HMAC-SHA1 algorithm is resistant to such attacks.
  • the third form of theft would have to take place in between the programming station and the installation factory.
  • the Authentication chips would already be programmed for use in a particular system or for use in a particular consumable. The only use these chips have to a thief is to place them into a clone System or clone Consumable. Clone systems are irrelevant—a cloned System would not even require an authentication chip. For clone Consumables, such a theft would limit the number of cloned products to the number of chips stolen.
  • a single theft should not create a supply steady enough to provide clone manufacturers with a cost-effective business.
  • the final form of theft is where the System or Consumable itself is stolen.
  • physical security protocols must be enhanced. If the theft occurs anywhere else, it is a matter of concern only for the owner of the item and the police or insurance company.
  • the security mechanisms that the Authentication Chip uses assume that the consumables and systems are in the hands of the public. Consequently, having them stolen makes no difference to the security of the keys.
US13/086,359 1997-07-15 2011-04-13 Integrated circuit for authentication of consumable storage device Abandoned US20110208966A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/086,359 US20110208966A1 (en) 1997-07-15 2011-04-13 Integrated circuit for authentication of consumable storage device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
AUPO7991A AUPO799197A0 (en) 1997-07-15 1997-07-15 Image processing method and apparatus (ART01)
AUPO7991 1997-07-15
US09/113,223 US6442525B1 (en) 1997-07-15 1998-07-10 System for authenticating physical objects
US09/516,874 US6745331B1 (en) 1998-07-10 2000-03-02 Authentication chip with protection from power supply attacks
US10/791,792 US20090013178A9 (en) 1997-07-15 2004-03-04 Integrated circuit for the authentication of a consumable storage device
US13/086,359 US20110208966A1 (en) 1997-07-15 2011-04-13 Integrated circuit for authentication of consumable storage device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/791,792 Continuation US20090013178A9 (en) 1997-07-15 2004-03-04 Integrated circuit for the authentication of a consumable storage device

Publications (1)

Publication Number Publication Date
US20110208966A1 true US20110208966A1 (en) 2011-08-25

Family

ID=32323647

Family Applications (8)

Application Number Title Priority Date Filing Date
US09/505,147 Expired - Fee Related US6816968B1 (en) 1998-07-10 2000-02-15 Consumable authentication protocol and system
US09/516,874 Expired - Fee Related US6745331B1 (en) 1997-07-15 2000-03-02 Authentication chip with protection from power supply attacks
US10/780,622 Expired - Fee Related US7194629B2 (en) 1997-07-15 2004-02-19 Apparatus for authenticating memory space of an authorized accessory
US10/791,792 Abandoned US20090013178A9 (en) 1997-07-15 2004-03-04 Integrated circuit for the authentication of a consumable storage device
US10/902,889 Expired - Fee Related US7210038B2 (en) 1998-07-10 2004-08-02 Method for validating an authentication chip
US10/902,883 Expired - Fee Related US7401223B2 (en) 1998-07-10 2004-08-02 Authentication chip for authenticating an untrusted chip
US12/030,817 Abandoned US20100031064A1 (en) 1998-07-10 2008-02-13 Tamper Detection Line Circuitry For An Authentication Integrated Circuit
US13/086,359 Abandoned US20110208966A1 (en) 1997-07-15 2011-04-13 Integrated circuit for authentication of consumable storage device


Country Status (1)

Country Link
US (8) US6816968B1


Families Citing this family (194)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6786420B1 (en) 1997-07-15 2004-09-07 Silverbrook Research Pty. Ltd. Data distribution mechanism in the form of ink dots on cards
US6547364B2 (en) * 1997-07-12 2003-04-15 Silverbrook Research Pty Ltd Printing cartridge with an integrated circuit device
US6618117B2 (en) 1997-07-12 2003-09-09 Silverbrook Research Pty Ltd Image sensing apparatus including a microcontroller
US7551201B2 (en) 1997-07-15 2009-06-23 Silverbrook Research Pty Ltd Image capture and processing device for a print on demand digital camera system
AUPO850597A0 (en) 1997-08-11 1997-09-04 Silverbrook Research Pty Ltd Image processing method and apparatus (art01a)
AUPO802797A0 (en) 1997-07-15 1997-08-07 Silverbrook Research Pty Ltd Image processing method and apparatus (ART54)
US7743262B2 (en) * 1997-07-15 2010-06-22 Silverbrook Research Pty Ltd Integrated circuit incorporating protection from power supply attacks
US7249109B1 (en) * 1997-07-15 2007-07-24 Silverbrook Research Pty Ltd Shielding manipulations of secret data
US7249108B1 (en) * 1997-07-15 2007-07-24 Silverbrook Research Pty Ltd Validation protocol and system
US6985207B2 (en) 1997-07-15 2006-01-10 Silverbrook Research Pty Ltd Photographic prints having magnetically recordable media
US6879341B1 (en) 1997-07-15 2005-04-12 Silverbrook Research Pty Ltd Digital camera system containing a VLIW vector processor
US7716098B2 (en) * 1997-07-15 2010-05-11 Silverbrook Research Pty Ltd. Method and apparatus for reducing optical emissions in an integrated circuit
US6690419B1 (en) 1997-07-15 2004-02-10 Silverbrook Research Pty Ltd Utilising eye detection methods for image processing in a digital image camera
US7702926B2 (en) * 1997-07-15 2010-04-20 Silverbrook Research Pty Ltd Decoy device in an integrated circuit
US6624848B1 (en) 1997-07-15 2003-09-23 Silverbrook Research Pty Ltd Cascading image modification using multiple digital cameras incorporating image processing
US7110024B1 (en) 1997-07-15 2006-09-19 Silverbrook Research Pty Ltd Digital camera system having motion deblurring means
US7587044B2 (en) * 1998-01-02 2009-09-08 Cryptography Research, Inc. Differential power analysis method and apparatus
JP2002522929A (ja) * 1998-07-31 2002-07-23 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 秘密の特性値を検出する分析方法を無効にする手段を備えるデータ処理装置
AUPP702098A0 (en) 1998-11-09 1998-12-03 Silverbrook Research Pty Ltd Image creation method and apparatus (ART73)
US7018117B2 (en) 1999-01-25 2006-03-28 Fargo Electronics, Inc. Identification card printer ribbon cartridge
US6932527B2 (en) 1999-01-25 2005-08-23 Fargo Electronics, Inc. Card cartridge
US6600908B1 (en) 1999-02-04 2003-07-29 Hark C. Chan Method and system for broadcasting and receiving audio information and associated audio indexes
US7245707B1 (en) * 1999-03-26 2007-07-17 Chan Hark C Data network based telephone messaging system
AUPQ056099A0 (en) 1999-05-25 1999-06-17 Silverbrook Research Pty Ltd A method and apparatus (pprint01)
IL130963A (en) * 1999-07-15 2006-04-10 Nds Ltd Key management for content protection
US7131001B1 (en) * 1999-10-29 2006-10-31 Broadcom Corporation Apparatus and method for secure filed upgradability with hard wired public key
GB2357407A (en) * 1999-12-17 2001-06-20 Int Computers Ltd Cryptographic key replacement using key lifetimes
US7685423B1 (en) * 2000-02-15 2010-03-23 Silverbrook Research Pty Ltd Validation protocol and system
EP1260053B1 (en) * 2000-02-15 2006-05-31 Silverbrook Research Pty. Limited Consumable authentication protocol and system
JP2001282619A (ja) * 2000-03-30 2001-10-12 Hitachi Ltd コンテンツ改竄検知方法及びその実施装置並びにその処理プログラムを記録した記録媒体
JP4647748B2 (ja) * 2000-06-12 2011-03-09 キヤノン株式会社 暗号化装置及び方法、ならびに通信方法及びシステム
KR100377172B1 (ko) * 2000-06-13 2003-03-26 주식회사 하이닉스반도체 데이터 암호화 표준 알고리즘을 이용한 암호화 장치의 키스케쥴러
US8732038B2 (en) 2000-07-19 2014-05-20 Sharp Kabushiki Kaisha Service management method, product-in-circulation to which the same is applied, service management device, service management network system, service management program, and computer readable program product with the program stored thereon
US20020077979A1 (en) * 2000-07-19 2002-06-20 Masaya Nagata Service management method, product-in-circulation to which the same is applied, service management device, service management network system, service management program, and computer-readable program product with the program stored thereon
JP4403649B2 (ja) * 2000-09-11 2010-01-27 ソニー株式会社 認証システム、認証方法およびicカード
US7360075B2 (en) 2001-02-12 2008-04-15 Aventail Corporation, A Wholly Owned Subsidiary Of Sonicwall, Inc. Method and apparatus for providing secure streaming data transmission facilities using unreliable protocols
US7353380B2 (en) * 2001-02-12 2008-04-01 Aventail, Llc, A Subsidiary Of Sonicwall, Inc. Method and apparatus for providing secure streaming data transmission facilities using unreliable protocols
US7383329B2 (en) * 2001-02-13 2008-06-03 Aventail, Llc Distributed cache for state transfer operations
FR2821225B1 (fr) * 2001-02-20 2005-02-04 Mobileway Systeme de paiement electronique a distance
FR2823928B1 (fr) * 2001-04-19 2003-08-22 Canal Plus Technologies Procede pour une communication securisee entre deux dispositifs
FR2826811B1 (fr) * 2001-06-27 2003-11-07 France Telecom Procede d'authentification cryptographique
US7234061B1 (en) * 2001-06-29 2007-06-19 Cisco Technology, Inc. Methods and apparatus for verifying modules from approved vendors
US7224805B2 (en) * 2001-07-06 2007-05-29 Nokia Corporation Consumption of content
US20030035646A1 (en) * 2001-08-20 2003-02-20 Vat 19, Llc Digital video device having a verification code thereon and method of generating a verification code
US20030061488A1 (en) * 2001-09-25 2003-03-27 Michael Huebler Cloning protection for electronic equipment
US7076059B1 (en) * 2002-01-17 2006-07-11 Cavium Networks Method and apparatus to implement the data encryption standard algorithm
US7430762B2 (en) 2002-03-01 2008-09-30 Fargo Electronics, Inc. Identification card manufacturing security
JP2003296680A (ja) * 2002-03-29 2003-10-17 Hitachi Ltd データ処理装置
US20030204731A1 (en) * 2002-04-29 2003-10-30 Pochuev Denis A. Method and apparatus to enhance the security of data
DK1880744T3 (en) * 2002-05-10 2015-02-09 Hoffmann La Roche Bisphosphonic acids for the treatment and prevention of osteoporosis
US7788479B2 (en) * 2002-07-25 2010-08-31 International Business Machines Corporation Apparatus, system and method of ensuring that only randomly-generated numbers that have passed a test are used for cryptographic purposes
US7600118B2 (en) * 2002-09-27 2009-10-06 Intel Corporation Method and apparatus for augmenting authentication in a cryptographic system
BRPI0315078B1 (pt) * 2002-10-07 2019-08-20 Telefonaktiebolaget Lm Ericsson (Publ) Dispositivo de segurança resistente à violação, e, terminal de usuário
CN1276363C (zh) * 2002-11-13 2006-09-20 深圳市朗科科技有限公司 借助半导体存储装置实现数据安全存储和算法存储的方法
US7818519B2 (en) * 2002-12-02 2010-10-19 Silverbrook Research Pty Ltd Timeslot arbitration scheme
US7133818B2 (en) * 2003-04-17 2006-11-07 Sun Microsystems, Inc. Method and apparatus for accelerated post-silicon testing and random number generation
TWI227328B (en) * 2003-06-19 2005-02-01 Yan-Fu Liou Method and system for accelerating inspection speed of semiconductor products
US7469107B2 (en) * 2003-07-23 2008-12-23 Lexmark International, Inc. Method for providing imaging substance for use in an imaging device via a virtual replenishment
US8229108B2 (en) * 2003-08-15 2012-07-24 Broadcom Corporation Pseudo-random number generation based on periodic sampling of one or more linear feedback shift registers
KR100544478B1 (ko) * 2003-12-01 2006-01-24 삼성전자주식회사 정보의 보안등급에 따라 인쇄권한을 제한할 수 있는인쇄장치, 이를 이용한 인쇄시스템 및 이들의 인쇄방법
US7464266B2 (en) * 2004-02-13 2008-12-09 Microsoft Corporation Cheap signatures for synchronous broadcast communication
US7499541B2 (en) * 2004-05-11 2009-03-03 National Institute Of Information And Communications Technology Cipher strength evaluation apparatus
US7627764B2 (en) * 2004-06-25 2009-12-01 Intel Corporation Apparatus and method for performing MD5 digesting
US20060031873A1 (en) * 2004-08-09 2006-02-09 Comcast Cable Holdings, Llc System and method for reduced hierarchy key management
US7343496B1 (en) 2004-08-13 2008-03-11 Zilog, Inc. Secure transaction microcontroller with secure boot loader
US7818574B2 (en) * 2004-09-10 2010-10-19 International Business Machines Corporation System and method for providing dynamically authorized access to functionality present on an integrated circuit chip
US8117452B2 (en) * 2004-11-03 2012-02-14 Cisco Technology, Inc. System and method for establishing a secure association between a dedicated appliance and a computing platform
US20060107054A1 (en) * 2004-11-16 2006-05-18 Young David W Method, apparatus and system to authenticate chipset patches with cryptographic signatures
US20060117004A1 (en) * 2004-11-30 2006-06-01 Hunt Charles L System and method for contextually understanding and analyzing system use and misuse
US8099369B2 (en) * 2004-12-08 2012-01-17 Ngna, Llc Method and system for securing content in media systems
US7383438B2 (en) * 2004-12-18 2008-06-03 Comcast Cable Holdings, Llc System and method for secure conditional access download and reconfiguration
US7933410B2 (en) * 2005-02-16 2011-04-26 Comcast Cable Holdings, Llc System and method for a variable key ladder
US20060200412A1 (en) * 2005-02-23 2006-09-07 Comcast Cable Holdings, Llc System and method for DRM regional and timezone key management
KR100670005B1 (ko) * 2005-02-23 2007-01-19 삼성전자주식회사 모바일 플랫폼을 위한 메모리의 무결성을 원격으로 확인하는 확인장치 및 그 시스템 그리고 무결성 확인 방법
JP2008532410A (ja) * 2005-03-01 2008-08-14 エヌエックスピー ビー ヴィ メッセージ認証コードを発生する発生器、発生方法、プログラム要素、及びコンピュータ読取可能媒体
US7788490B2 (en) * 2005-04-01 2010-08-31 Lexmark International, Inc. Methods for authenticating an identity of an article in electrical communication with a verifier system
US20060269066A1 (en) * 2005-05-06 2006-11-30 Schweitzer Engineering Laboratories, Inc. System and method for converting serial data into secure data packets configured for wireless transmission in a power system
US8639946B2 (en) * 2005-06-24 2014-01-28 Sigmatel, Inc. System and method of using a protected non-volatile memory
US20070019805A1 (en) * 2005-06-28 2007-01-25 Trustees Of Boston University System employing systematic robust error detection coding to protect system element against errors with unknown probability distributions
US8099187B2 (en) 2005-08-18 2012-01-17 Hid Global Corporation Securely processing and tracking consumable supplies and consumable material
US7385491B2 (en) * 2005-09-28 2008-06-10 Itt Manufacturing Enterprises, Inc. Tamper monitor circuit
GB0521582D0 (en) * 2005-10-22 2005-11-30 Depuy Int Ltd An implant for supporting a spinal column
GB0521585D0 (en) * 2005-10-22 2005-11-30 Depuy Int Ltd A spinal support rod
US8327204B2 (en) * 2005-10-27 2012-12-04 Dft Microsystems, Inc. High-speed transceiver tester incorporating jitter injection
FR2893436B1 (fr) * 2005-11-15 2008-02-15 Oberthur Card Syst Sa Securisation entre des composants electroniques d'une entite electronique securisee portable
US7845016B2 (en) * 2005-11-28 2010-11-30 Cisco Technology, Inc. Methods and apparatus for verifying modules from approved vendors
US20070162759A1 (en) * 2005-12-28 2007-07-12 Motorola, Inc. Protected port for electronic access to an embedded device
GB0600662D0 (en) * 2006-01-13 2006-02-22 Depuy Int Ltd Spinal support rod kit
US8234505B2 (en) * 2006-01-20 2012-07-31 Seagate Technology Llc Encryption key in a storage system
US8348952B2 (en) 2006-01-26 2013-01-08 Depuy International Ltd. System and method for cooling a spinal correction device comprising a shape memory material for corrective spinal surgery
EP1985061A1 (fr) * 2006-02-03 2008-10-29 ATT- Advanced Track & Trace S. A. Procede et dispositif d'authentification
FR2907288B1 (fr) * 2006-02-03 2008-12-19 Att Sa Procede et dispositif d'authentification
EP2016593B1 (en) * 2006-04-20 2014-11-05 NVE Corporation Enclosure tamper detection and protection
JP2009536389A (ja) * 2006-05-10 2009-10-08 エヌエックスピー ビー ヴィ 回路装置付きセンサ
US20080086781A1 (en) * 2006-10-06 2008-04-10 Stephane Rodgers Method and system for glitch protection in a secure system
US8036380B2 (en) * 2006-12-14 2011-10-11 Telefonaktiebolaget L M Ericsson (Publ) Efficient data integrity protection
US8294577B2 (en) 2007-03-09 2012-10-23 Nve Corporation Stressed magnetoresistive tamper detection devices
US8423789B1 (en) 2007-05-22 2013-04-16 Marvell International Ltd. Key generation techniques
ATE524006T1 (de) * 2007-06-11 2011-09-15 Fts Computertechnik Gmbh Verfahren und architektur zur sicherung von echtzeitdaten
US7913085B2 (en) * 2007-06-15 2011-03-22 Koolspan, Inc. System and method of per-packet keying
US8037524B1 (en) * 2007-06-19 2011-10-11 Netapp, Inc. System and method for differentiated cross-licensing for services across heterogeneous systems using transient keys
US7934083B2 (en) * 2007-09-14 2011-04-26 Kevin Norman Taylor Configurable access kernel
US9354890B1 (en) 2007-10-23 2016-05-31 Marvell International Ltd. Call stack structure for enabling execution of code outside of a subroutine and between call stack frames
GB0720762D0 (en) * 2007-10-24 2007-12-05 Depuy Spine Sorl Assembly for orthopaedic surgery
FR2926381A1 (fr) * 2008-01-11 2009-07-17 Sagem Securite Sa Methode de transfert securise de donnees
US9442758B1 (en) 2008-01-21 2016-09-13 Marvell International Ltd. Dynamic processor core switching
US8392762B2 (en) * 2008-02-04 2013-03-05 Honeywell International Inc. System and method for detection and prevention of flash corruption
US20090199014A1 (en) * 2008-02-04 2009-08-06 Honeywell International Inc. System and method for securing and executing a flash routine
US8345097B2 (en) * 2008-02-15 2013-01-01 Harris Corporation Hybrid remote digital recording and acquisition system
KR100997238B1 (ko) 2008-03-03 2010-11-29 삼성전자주식회사 Crum 유닛, 교체가능유닛 및 이를 이용하는 화상형성장치와, 그 인증 및 암호화 데이터 통신 방법
FR2928798B1 (fr) * 2008-03-14 2011-09-09 Centre Nat Rech Scient Procede d'authentification, systeme d'authentification, terminal serveur, terminal client et programmes d'ordinateur correspondants
US9003197B2 (en) * 2008-03-27 2015-04-07 General Instrument Corporation Methods, apparatus and system for authenticating a programmable hardware device and for authenticating commands received in the programmable hardware device from a secure processor
US9003559B2 (en) * 2008-07-29 2015-04-07 International Business Machines Corporation Continuity check monitoring for microchip exploitation detection
US8332659B2 (en) * 2008-07-29 2012-12-11 International Business Machines Corporation Signal quality monitoring to defeat microchip exploitation
US8172140B2 (en) * 2008-07-29 2012-05-08 International Business Machines Corporation Doped implant monitoring for microchip tamper detection
US8214657B2 (en) * 2008-07-29 2012-07-03 International Business Machines Corporation Resistance sensing for defeating microchip exploitation
US8510560B1 (en) 2008-08-20 2013-08-13 Marvell International Ltd. Efficient key establishment for wireless networks
US8311222B2 (en) * 2008-08-26 2012-11-13 GlobalFoundries, Inc. Hardware based multi-dimensional encryption
US8296555B2 (en) 2008-09-18 2012-10-23 Marvell World Trade Ltd. Preloader
US8171306B2 (en) * 2008-11-05 2012-05-01 Microsoft Corporation Universal secure token for obfuscation and tamper resistance
US7916539B2 (en) * 2009-01-23 2011-03-29 Analog Devices, Inc. Differential, level-shifted EEPROM structures
US8549260B2 (en) * 2009-01-29 2013-10-01 Infineon Technologies Ag Apparatus for processing data and method for generating manipulated and re-manipulated configuration data for processor
CN102640448A (zh) * 2009-05-13 2012-08-15 敬畏技术有限责任公司 用于在对称加密系统内安全地识别和认证设备的系统和方法
EP2251813A1 (en) * 2009-05-13 2010-11-17 Nagravision S.A. Method for authenticating access to a secured chip by a test device
US8769654B2 (en) * 2009-06-23 2014-07-01 Cisco Technology, Inc. Counterfeit prevention strategy for pluggable modules
DE102009031145A1 (de) * 2009-06-30 2011-01-05 Siemens Aktiengesellschaft Vorrichtung und Verfahren zum Prüfen eines Chips, auf dem ein kryptographisches Verfahren implementiert ist
US20110093714A1 (en) * 2009-10-20 2011-04-21 Infineon Technologies Ag Systems and methods for asymmetric cryptographic accessory authentication
KR101646705B1 (ko) * 2009-12-01 2016-08-09 삼성전자주식회사 에스-박스를 구현한 암호화 장치
CN102725737B (zh) * 2009-12-04 2016-04-20 密码研究公司 可验证防泄漏的加密和解密
US8621212B2 (en) * 2009-12-22 2013-12-31 Infineon Technologies Ag Systems and methods for cryptographically enhanced automatic blacklist management and enforcement
US9582443B1 (en) 2010-02-12 2017-02-28 Marvell International Ltd. Serial control channel processor for executing time-based instructions
EP2369622B1 (fr) * 2010-03-24 2015-10-14 STMicroelectronics Rousset SAS Method and device for countermeasures against an error-injection attack in an electronic microcircuit
FR2959580A1 (fr) 2010-05-03 2011-11-04 St Microelectronics Rousset Circuit and method for detecting a fault-injection attack
US8848905B1 (en) * 2010-07-28 2014-09-30 Sandia Corporation Deterrence of device counterfeiting, cloning, and subversion by substitution using hardware fingerprinting
US8645716B1 (en) 2010-10-08 2014-02-04 Marvell International Ltd. Method and apparatus for overwriting an encryption key of a media drive
KR101665562B1 (ko) * 2010-11-05 2016-10-25 Samsung Electronics Co., Ltd. Detection circuit, detection method thereof, and memory system including the same
US8613087B2 (en) * 2010-12-06 2013-12-17 Samsung Electronics Co., Ltd. Computing system
JP2012169756A (ja) * 2011-02-10 2012-09-06 Hitachi Ltd Encrypted communication inspection system
US8630411B2 (en) 2011-02-17 2014-01-14 Infineon Technologies Ag Systems and methods for device and data authentication
KR101577886B1 (ko) * 2011-06-29 2015-12-15 인텔 코포레이션 무결성 검사 및 리플레이 공격들에 대한 보호를 이용하는 메모리 암호화를 위한 방법 및 장치
US9098694B1 (en) * 2011-07-06 2015-08-04 Marvell International Ltd. Clone-resistant logic
US20130044798A1 (en) 2011-08-18 2013-02-21 Microsoft Corporation Side Channel Communications
US8732523B2 (en) 2011-10-24 2014-05-20 Arm Limited Data processing apparatus and method for analysing transient faults occurring within storage elements of the data processing apparatus
US8872635B2 (en) * 2011-10-25 2014-10-28 Static Control Components, Inc. Systems and methods for verifying a chip
US8635467B2 (en) 2011-10-27 2014-01-21 Certicom Corp. Integrated circuit with logic circuitry and multiple concealing circuits
US8334705B1 (en) 2011-10-27 2012-12-18 Certicom Corp. Analog circuitry to conceal activity of logic circuitry
US9436629B2 (en) 2011-11-15 2016-09-06 Marvell World Trade Ltd. Dynamic boot image streaming
US8646094B2 (en) * 2011-12-07 2014-02-04 Owl Computing Technologies, Inc. Method and apparatus for preventing unauthorized access to information stored in a non-volatile memory
US8385553B1 (en) * 2012-02-28 2013-02-26 Google Inc. Portable secure element
US9258907B2 (en) 2012-08-09 2016-02-09 Lockheed Martin Corporation Conformal 3D non-planar multi-layer circuitry
FI2887923T3 (fi) * 2012-08-24 2023-06-30 Sun Pharmaceutical Ind Ltd Ophthalmic formulation of a polyoxyl lipid or polyoxyl fatty acid and treatment of ocular conditions
US9043632B2 (en) 2012-09-25 2015-05-26 Apple Inc. Security enclave processor power control
US9047471B2 (en) 2012-09-25 2015-06-02 Apple Inc. Security enclave processor boot control
US8873747B2 (en) 2012-09-25 2014-10-28 Apple Inc. Key management using security enclave processor
US8775757B2 (en) 2012-09-25 2014-07-08 Apple Inc. Trust zone support in system on a chip having security enclave processor
US8930700B2 (en) * 2012-12-12 2015-01-06 Richard J. Wielopolski Remote device secure data file storage system and method
US9575768B1 (en) 2013-01-08 2017-02-21 Marvell International Ltd. Loading boot code from multiple memories
US9124434B2 (en) 2013-02-01 2015-09-01 Microsoft Technology Licensing, Llc Securing a computing device accessory
US8772745B1 (en) 2013-03-14 2014-07-08 Lockheed Martin Corporation X-ray obscuration film and related techniques
US9736801B1 (en) 2013-05-20 2017-08-15 Marvell International Ltd. Methods and apparatus for synchronizing devices in a wireless data communication system
US9521635B1 (en) 2013-05-21 2016-12-13 Marvell International Ltd. Methods and apparatus for selecting a device to perform shared functionality in a deterministic and fair manner in a wireless data communication system
WO2015015305A1 (en) 2013-07-31 2015-02-05 Marvell World Trade Ltd. Parallelizing boot operations
WO2015052957A1 (ja) * 2013-10-08 2015-04-16 NEC Corporation Ciphertext comparison system
US9413356B1 (en) * 2013-12-11 2016-08-09 Marvell International Ltd. Chip or SoC including fusible logic array and functions to protect logic against reverse engineering
US9660802B1 (en) 2013-12-12 2017-05-23 Marvell International Ltd. Systems and methods for generating and storing silicon fingerprints for a security chip
DE102014206992A1 (de) * 2014-04-11 2015-10-15 Siemens Aktiengesellschaft Random number generator and method for generating random numbers
CN106659425B (zh) * 2014-07-10 2020-09-18 Given Imaging Ltd. Sensor belt configured to localize an in-vivo device, and localization method
US9547778B1 (en) 2014-09-26 2017-01-17 Apple Inc. Secure public key acceleration
US10123410B2 (en) 2014-10-10 2018-11-06 Lockheed Martin Corporation Fine line 3D non-planar conforming circuit
US9501664B1 (en) 2014-12-15 2016-11-22 Sandia Corporation Method, apparatus and system to compensate for drift by physically unclonable function circuitry
US10284470B2 (en) * 2014-12-23 2019-05-07 Intel Corporation Technologies for network device flow lookup management
US9606939B2 (en) 2015-02-26 2017-03-28 International Business Machines Corporation Memory data security
US10892889B2 (en) * 2015-04-07 2021-01-12 Coleridge Enterprises Llc Systems and methods for an enhanced XOR cipher through extensions
KR102309203B1 (ko) * 2015-04-23 2021-10-05 MagnaChip Semiconductor, Ltd. Circuit and method for preventing forgery and tampering of a semiconductor chip
US9721093B2 (en) * 2015-06-16 2017-08-01 Intel Corporation Enhanced security of power management communications and protection from side channel attacks
EP3427435A1 (en) 2016-03-08 2019-01-16 Marvell World Trade Ltd. Methods and apparatus for secure device authentication
US10380341B2 (en) * 2016-04-01 2019-08-13 Qualcomm Incorporated Adaptive systems and procedures for defending a processor against transient fault attacks
CN106453253B (zh) * 2016-09-06 2019-10-25 Shanghai Humin Blockchain Technology Co., Ltd. An efficient identity-based anonymous signcryption method
US10474846B1 (en) * 2017-08-31 2019-11-12 Square, Inc. Processor power supply glitch detection
WO2019078832A1 (en) 2017-10-18 2019-04-25 Hewlett-Packard Development Company, L.P. INTEGRATED CIRCUIT DEVICE FOR REPLACEABLE PRINTER COMPONENT
US11025416B1 (en) 2018-03-09 2021-06-01 Wells Fargo Bank, N.A. Systems and methods for quantum session authentication
US10855454B1 (en) * 2018-03-09 2020-12-01 Wells Fargo Bank, N.A. Systems and methods for quantum session authentication
US11343087B1 (en) 2018-03-09 2022-05-24 Wells Fargo Bank, N.A. Systems and methods for server-side quantum session authentication
US10728029B1 (en) 2018-03-09 2020-07-28 Wells Fargo Bank, N.A. Systems and methods for multi-server quantum session authentication
US10404454B1 (en) * 2018-04-25 2019-09-03 Blockchain Asics Llc Cryptographic ASIC for derivative key hierarchy
CN110580420B (zh) 2018-06-11 2023-03-28 Alibaba Group Holding Ltd. Integrated-chip-based data processing method, computer device, and storage medium
US10855453B1 (en) 2018-08-20 2020-12-01 Wells Fargo Bank, N.A. Systems and methods for time-bin quantum session authentication
US10839109B2 (en) * 2018-11-14 2020-11-17 Massachusetts Institute Of Technology Integrated circuit (IC) portholes and related techniques
US11270032B1 (en) 2018-12-27 2022-03-08 Thales E-Security, Inc. Tamper switch assembly and installation method thereof
US11196575B2 (en) 2019-04-24 2021-12-07 International Business Machines Corporation On-chipset certification to prevent spy chip
US11804955B1 (en) 2019-09-13 2023-10-31 Chol, Inc. Method and system for modulated waveform encryption
US11816228B2 (en) 2020-09-25 2023-11-14 Advanced Micro Devices, Inc. Metadata tweak for channel encryption differentiation
CN112491843B (zh) * 2020-11-17 2022-06-21 Suzhou Inspur Intelligent Technology Co., Ltd. Database multiple-authentication method, system, terminal, and storage medium
EP4285261A2 (de) * 2021-01-26 2023-12-06 Real-Cis GmbH Device and method for the secure processing of data

Citations (95)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3708603A (en) * 1971-03-01 1973-01-02 C Keagle Electronic sound synthesizer
US3816765A (en) * 1972-06-27 1974-06-11 Rca Corp Digital interface circuit for a random noise generator
US4001777A (en) * 1973-06-21 1977-01-04 Elmore Alexander Taximeter protection system
US4023149A (en) * 1975-10-28 1977-05-10 Motorola, Inc. Static storage technique for four transistor IGFET memory cell
US4029901A (en) * 1975-11-13 1977-06-14 Motorola, Inc. Control center for a communications system with interchannel patching capabilities
US4395774A (en) * 1981-01-12 1983-07-26 National Semiconductor Corporation Low power CMOS frequency divider
US4455545A (en) * 1982-11-05 1984-06-19 Sperry Corporation High frequency output inductor for inverter power supply
US4472821A (en) * 1982-05-03 1984-09-18 General Electric Company Dynamic shift register utilizing CMOS dual gate transistors
US4495560A (en) * 1980-07-09 1985-01-22 Kabushiki Kaisha Toyota Chuo Kenkyusho Fluctuating drive system
US4509201A (en) * 1982-06-23 1985-04-02 Toshiba Corporation Wireless telephone apparatus with protection against power loss
US4529870A (en) * 1980-03-10 1985-07-16 David Chaum Cryptographic identification, financial transaction, and credential device
US4590470A (en) * 1983-07-11 1986-05-20 At&T Bell Laboratories User authentication system employing encryption functions
US4601011A (en) * 1981-12-30 1986-07-15 Avigdor Grynberg User authorization verification apparatus for computer systems including a central device and a plurality of pocket sized remote units
US4667192A (en) * 1983-05-24 1987-05-19 The Johns Hopkins University Method and apparatus for bus arbitration using a pseudo-random sequence
US4736423A (en) * 1985-04-30 1988-04-05 International Business Machines Corporation Technique for reducing RSA Crypto variable storage
US4757532A (en) * 1985-04-19 1988-07-12 Alcatel Business Systems Limited Secure transport of information between electronic stations
US4771276A (en) * 1985-04-15 1988-09-13 International Business Machines Corporation Electromagnetic touch sensor input system in a cathode ray tube display device
US4799061A (en) * 1985-11-18 1989-01-17 International Business Machines Corporation Secure component authentication system
US4799059A (en) * 1986-03-14 1989-01-17 Enscan, Inc. Automatic/remote RF instrument monitoring system
US4800590A (en) * 1985-01-14 1989-01-24 Willis E. Higgins Computer key and computer lock system
US4801935A (en) * 1986-11-17 1989-01-31 Computer Security Corporation Apparatus and method for security of electric and electronic devices
US4807284A (en) * 1986-09-24 1989-02-21 Ncr Corporation Security device for sensitive data
US4827113A (en) * 1984-10-19 1989-05-02 Casio Computer Co., Ltd. Technique for authenticating IC card and terminal
US4852680A (en) * 1988-04-07 1989-08-01 J. I. Case Company Vehicle anti-theft system with remote security module
US4862501A (en) * 1985-03-08 1989-08-29 Kabushiki Kaisha Toshiba Communications network using IC cards
US4902910A (en) * 1987-11-17 1990-02-20 Xilinx, Inc. Power supply voltage level sensing circuit
US4918728A (en) * 1989-08-30 1990-04-17 International Business Machines Corporation Data cryptography operations using control vectors
US4926173A (en) * 1988-11-10 1990-05-15 Ncr Corporation Data entry keyboard apparatus
US4933898A (en) * 1989-01-12 1990-06-12 General Instrument Corporation Secure integrated circuit chip with conductive shield
US5036461A (en) * 1990-05-16 1991-07-30 Elliott John C Two-way authentication system between user's smart card and issuer-specific plug-in application modules in multi-issued transaction device
US5095270A (en) * 1989-08-16 1992-03-10 U.S. Philips Corporation Method of suppressing current distribution noise in a dc squid
US5136642A (en) * 1990-06-01 1992-08-04 Kabushiki Kaisha Toshiba Cryptographic communication method and cryptographic communication device
US5184324A (en) * 1990-12-20 1993-02-02 Sharp Kabushiki Kaisha Dynamic semiconductor multi-value memory device
US5196840A (en) * 1990-11-05 1993-03-23 International Business Machines Corporation Secure communications system for remotely located computers
US5218569A (en) * 1991-02-08 1993-06-08 Banks Gerald J Electrically alterable non-volatile memory with n-bits per memory cell
US5239575A (en) * 1991-07-09 1993-08-24 Schlumberger Industries, Inc. Telephone dial-inbound data acquisition system with demand reading capability
US5245657A (en) * 1991-07-08 1993-09-14 Mitsubishi Denki Kabushiki Kaisha Verification method and apparatus
US5280193A (en) * 1992-05-04 1994-01-18 Lin Paul T Repairable semiconductor multi-package module having individualized package bodies on a PC board substrate
US5311595A (en) * 1989-06-07 1994-05-10 Kommunedata I/S Method of transferring data, between computer systems using electronic cards
US5327131A (en) * 1991-11-07 1994-07-05 Kawasaki Steel Corporation Parallel A/D converter having comparator threshold voltages defined by MOS transistor geometries
US5351210A (en) * 1990-11-28 1994-09-27 Kabushiki Kaisha Toshiba Serially accessible semiconductor memory with multiple level storage cells
US5406287A (en) * 1993-12-22 1995-04-11 The United States Of America As Represented By The Secretary Of The Air Force Programmable airdrop infrared decoy
US5416783A (en) * 1993-08-09 1995-05-16 Motorola, Inc. Method and apparatus for generating pseudorandom numbers or for performing data compression in a data processor
US5499294A (en) * 1993-11-24 1996-03-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Digital camera with apparatus for authentication of images produced from an image file
US5500517A (en) * 1994-09-02 1996-03-19 Gemplus Card International Apparatus and method for data transfer between stand alone integrated circuit smart card terminal and remote computer of system operator
US5506905A (en) * 1994-06-10 1996-04-09 Delco Electronics Corp. Authentication method for keyless entry system
US5515441A (en) * 1994-05-12 1996-05-07 At&T Corp. Secure communication method and apparatus
US5535167A (en) * 1991-06-12 1996-07-09 Hazani; Emanuel Non-volatile memory circuits, architecture
US5539690A (en) * 1994-06-02 1996-07-23 Intel Corporation Write verify schemes for flash memory with multilevel cells
US5619399A (en) * 1995-02-16 1997-04-08 Micromodule Systems, Inc. Multiple chip module mounting assembly and computer using same
US5619571A (en) * 1995-06-01 1997-04-08 Sandstrom; Brent B. Method for securely storing electronic records
US5629642A (en) * 1995-08-18 1997-05-13 Mitsubishi Denki Kabushiki Kaisha Power supply monitor
US5633932A (en) * 1995-12-19 1997-05-27 Intel Corporation Apparatus and method for preventing disclosure through user-authentication at a printing node
US5646660A (en) * 1994-08-09 1997-07-08 Encad, Inc. Printer ink cartridge with drive logic integrated circuit
US5646591A (en) * 1992-05-22 1997-07-08 Directed Electronics, Inc. Advanced method of indicating incoming threat level to an electronically secured vehicle and apparatus therefor
US5673223A (en) * 1995-06-09 1997-09-30 Samsung Electronics Co., Ltd. Nonvolatile semiconductor memory device with multiple word line voltage generators
US5673316A (en) * 1996-03-29 1997-09-30 International Business Machines Corporation Creation and distribution of cryptographic envelope
US5757388A (en) * 1996-12-16 1998-05-26 Eastman Kodak Company Electronic camera and integral ink jet printer
US5757918A (en) * 1995-01-20 1998-05-26 Tandem Computers Incorporated Method and apparatus for user and security device authentication
US5768369A (en) * 1996-11-20 1998-06-16 Harris Corporation Telephone test set keypad with integrated dynamic microphone
US5778069A (en) * 1996-04-10 1998-07-07 Microsoft Corporation Non-biased pseudo random number generator
US5787367A (en) * 1996-07-03 1998-07-28 Chrysler Corporation Flash reprogramming security for vehicle computer
US5790667A (en) * 1995-01-20 1998-08-04 Matsushita Electric Industrial Co., Ltd. Personal authentication method
US5802178A (en) * 1996-07-30 1998-09-01 Itt Industries, Inc. Stand alone device for providing security within computer networks
US5804975A (en) * 1996-09-18 1998-09-08 Lucent Technologies Inc. Detecting breakdown in dielectric layers
US5810146A (en) * 1996-10-31 1998-09-22 Authentication Technologies, Inc. Wide edge lead currency thread detection system
US5864695A (en) * 1994-12-28 1999-01-26 Oki Electric Industry Co., Ltd. IC card control circuit and IC card control system
US5872849A (en) * 1994-01-13 1999-02-16 Certco Llc Enhanced cryptographic system and method with key escrow feature
US5872847A (en) * 1996-07-30 1999-02-16 Itt Industries, Inc. Using trusted associations to establish trust in a computer network
US5923759A (en) * 1995-04-20 1999-07-13 Lee; Philip S. System for securely exchanging data with smart cards
US5943123A (en) * 1996-07-25 1999-08-24 Anritsu Corporation Optical fiber monitor using optical time domain reflectometer and monitoring method
US6028937A (en) * 1995-10-09 2000-02-22 Matsushita Electric Industrial Co., Ltd Communication device which performs two-way encryption authentication in challenge response format
US6049880A (en) * 1996-12-19 2000-04-11 Samsung Electronics Co., Ltd. Computer display monitor apparatus and method for controlling power thereof
US6088802A (en) * 1997-06-04 2000-07-11 Spyrus, Inc. Peripheral device with integrated security functionality
US6088450A (en) * 1996-04-17 2000-07-11 Intel Corporation Authentication system based on periodic challenge/response protocol
US6099408A (en) * 1996-12-31 2000-08-08 Walker Digital, Llc Method and apparatus for securing electronic games
US6192473B1 (en) * 1996-12-24 2001-02-20 Pitney Bowes Inc. System and method for mutual authentication and secure communications between a postage security device and a meter server
US6217165B1 (en) * 1997-07-15 2001-04-17 Silverbrook Research Pty. Ltd. Ink and media cartridge with axial ink chambers
US6246970B1 (en) * 1998-07-10 2001-06-12 Silverbrook Research Pty Ltd Method for making a chip tamper-resistant
US6359479B1 (en) * 1998-08-04 2002-03-19 Juniper Networks, Inc. Synchronizing data transfers between two distinct clock domains
US6367013B1 (en) * 1995-01-17 2002-04-02 Eoriginal Inc. System and method for electronic transmission, storage, and retrieval of authenticated electronic original documents
US6389533B1 (en) * 1999-02-05 2002-05-14 Intel Corporation Anonymity server
US6442690B1 (en) * 1998-10-23 2002-08-27 L3-Communications Corporation Apparatus and methods for managing key material in heterogeneous cryptographic assets
US6442525B1 (en) * 1997-07-15 2002-08-27 Silverbrook Res Pty Ltd System for authenticating physical objects
US6442276B1 (en) * 1997-07-21 2002-08-27 Assure Systems, Inc. Verification of authenticity of goods by use of random numbers
US20030041244A1 (en) * 2000-04-28 2003-02-27 Levente Buttyan Method for securing communications between a terminal and an additional user equipment
US6529487B1 (en) * 1999-07-09 2003-03-04 Qualcomm Incorporated Method and apparatus for securely transmitting distributed RAND for use in mobile station authentication
US6552563B1 (en) * 1996-11-14 2003-04-22 Si Diamond Technology, Inc. Display panel test device
US6789189B2 (en) * 2000-08-04 2004-09-07 First Data Corporation Managing account database in ABDS system
US20050038755A1 (en) * 1997-07-15 2005-02-17 Kia Silverbook Method and apparatus for reducing optical emissions in an integrated circuit
US6938156B2 (en) * 2000-08-04 2005-08-30 First Data Corporation ABDS system and verification status for authenticating entity access
US7246098B1 (en) * 1997-07-15 2007-07-17 Silverbrook Research Pty Ltd Consumable authentication protocol and system
US7346586B1 (en) * 1997-07-15 2008-03-18 Silverbrook Research Pty Ltd Validation protocol and system
US7657488B2 (en) * 1997-07-15 2010-02-02 Silverbrook Research Pty Ltd Validating apparatus having encryption integrated circuits
US7702926B2 (en) * 1997-07-15 2010-04-20 Silverbrook Research Pty Ltd Decoy device in an integrated circuit

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3529487A (en) * 1968-12-18 1970-09-22 Chrysler Corp Transmission control device
US3703222A (en) * 1971-01-21 1972-11-21 Otis Elevator Co Solid state control system
US3845657A (en) * 1972-02-04 1974-11-05 Westinghouse Electric Corp Surveillance system including means for detecting impending failure in high pressure, high temperature fluid conducting pipes
US3845857A (en) 1972-10-19 1974-11-05 Moorfeed Corp Spring mount for vibratory feeder
US5457748A (en) * 1992-11-30 1995-10-10 Motorola, Inc. Method and apparatus for improved security within encrypted communication devices
US5832438A (en) * 1995-02-08 1998-11-03 Sun Micro Systems, Inc. Apparatus and method for audio computing
US6067620A (en) * 1996-07-30 2000-05-23 Holden; James M. Stand alone security device for computer networks
DE19716111A1 (de) 1997-04-17 1998-10-22 Giesecke & Devrient Gmbh Method for mutual authentication of two units
GB9709135D0 (en) * 1997-05-02 1997-06-25 Certicom Corp Two way authentication protocol
FR2763955B1 (fr) * 1997-05-30 1999-07-02 Seppic Sa Filling product for optical fiber cables compatible with polypropylene
JP3575951B2 (ja) 1997-06-17 2004-10-13 Toshiba Corporation Device authentication method and apparatus, and authentication system
US6904110B2 (en) * 1997-07-31 2005-06-07 Francois Trans Channel equalization system and method
US6704871B1 (en) * 1997-09-16 2004-03-09 Safenet, Inc. Cryptographic co-processor
US6148405A (en) 1997-11-10 2000-11-14 Phone.Com, Inc. Method and system for secure lightweight transactions in wireless data networks
JP3459602B2 (ja) * 1999-01-25 2003-10-20 Sanyo Electric Co., Ltd. Semiconductor device and pattern layout method therefor
US7240995B2 (en) * 2003-05-06 2007-07-10 Lexmark International, Inc. Method of authenticating a consumable

Patent Citations (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3708603A (en) * 1971-03-01 1973-01-02 C Keagle Electronic sound synthesizer
US3816765A (en) * 1972-06-27 1974-06-11 Rca Corp Digital interface circuit for a random noise generator
US4001777A (en) * 1973-06-21 1977-01-04 Elmore Alexander Taximeter protection system
US4023149A (en) * 1975-10-28 1977-05-10 Motorola, Inc. Static storage technique for four transistor IGFET memory cell
US4029901A (en) * 1975-11-13 1977-06-14 Motorola, Inc. Control center for a communications system with interchannel patching capabilities
US4529870A (en) * 1980-03-10 1985-07-16 David Chaum Cryptographic identification, financial transaction, and credential device
US4495560A (en) * 1980-07-09 1985-01-22 Kabushiki Kaisha Toyota Chuo Kenkyusho Fluctuating drive system
US4395774A (en) * 1981-01-12 1983-07-26 National Semiconductor Corporation Low power CMOS frequency divider
US4601011A (en) * 1981-12-30 1986-07-15 Avigdor Grynberg User authorization verification apparatus for computer systems including a central device and a plurality of pocket sized remote units
US4472821A (en) * 1982-05-03 1984-09-18 General Electric Company Dynamic shift register utilizing CMOS dual gate transistors
US4509201A (en) * 1982-06-23 1985-04-02 Toshiba Corporation Wireless telephone apparatus with protection against power loss
US4455545A (en) * 1982-11-05 1984-06-19 Sperry Corporation High frequency output inductor for inverter power supply
US4667192A (en) * 1983-05-24 1987-05-19 The Johns Hopkins University Method and apparatus for bus arbitration using a pseudo-random sequence
US4590470A (en) * 1983-07-11 1986-05-20 At&T Bell Laboratories User authentication system employing encryption functions
US4827113A (en) * 1984-10-19 1989-05-02 Casio Computer Co., Ltd. Technique for authenticating IC card and terminal
US4800590A (en) * 1985-01-14 1989-01-24 Willis E. Higgins Computer key and computer lock system
US4862501A (en) * 1985-03-08 1989-08-29 Kabushiki Kaisha Toshiba Communications network using IC cards
US4771276A (en) * 1985-04-15 1988-09-13 International Business Machines Corporation Electromagnetic touch sensor input system in a cathode ray tube display device
US4757532A (en) * 1985-04-19 1988-07-12 Alcatel Business Systems Limited Secure transport of information between electronic stations
US4736423A (en) * 1985-04-30 1988-04-05 International Business Machines Corporation Technique for reducing RSA Crypto variable storage
US4799061A (en) * 1985-11-18 1989-01-17 International Business Machines Corporation Secure component authentication system
US4799059A (en) * 1986-03-14 1989-01-17 Enscan, Inc. Automatic/remote RF instrument monitoring system
US4807284A (en) * 1986-09-24 1989-02-21 Ncr Corporation Security device for sensitive data
US4801935A (en) * 1986-11-17 1989-01-31 Computer Security Corporation Apparatus and method for security of electric and electronic devices
US4902910A (en) * 1987-11-17 1990-02-20 Xilinx, Inc. Power supply voltage level sensing circuit
US4852680A (en) * 1988-04-07 1989-08-01 J. I. Case Company Vehicle anti-theft system with remote security module
US4926173A (en) * 1988-11-10 1990-05-15 Ncr Corporation Data entry keyboard apparatus
US4933898A (en) * 1989-01-12 1990-06-12 General Instrument Corporation Secure integrated circuit chip with conductive shield
US5311595A (en) * 1989-06-07 1994-05-10 Kommunedata I/S Method of transferring data, between computer systems using electronic cards
US5095270A (en) * 1989-08-16 1992-03-10 U.S. Philips Corporation Method of suppressing current distribution noise in a dc squid
US4918728A (en) * 1989-08-30 1990-04-17 International Business Machines Corporation Data cryptography operations using control vectors
US5036461A (en) * 1990-05-16 1991-07-30 Elliott John C Two-way authentication system between user's smart card and issuer-specific plug-in application modules in multi-issued transaction device
US5136642A (en) * 1990-06-01 1992-08-04 Kabushiki Kaisha Toshiba Cryptographic communication method and cryptographic communication device
US5196840A (en) * 1990-11-05 1993-03-23 International Business Machines Corporation Secure communications system for remotely located computers
US5351210A (en) * 1990-11-28 1994-09-27 Kabushiki Kaisha Toshiba Serially accessible semiconductor memory with multiple level storage cells
US5184324A (en) * 1990-12-20 1993-02-02 Sharp Kabushiki Kaisha Dynamic semiconductor multi-value memory device
US5218569A (en) * 1991-02-08 1993-06-08 Banks Gerald J Electrically alterable non-volatile memory with n-bits per memory cell
US5535167A (en) * 1991-06-12 1996-07-09 Hazani; Emanuel Non-volatile memory circuits, architecture
US5245657A (en) * 1991-07-08 1993-09-14 Mitsubishi Denki Kabushiki Kaisha Verification method and apparatus
US5239575A (en) * 1991-07-09 1993-08-24 Schlumberger Industries, Inc. Telephone dial-inbound data acquisition system with demand reading capability
US5327131A (en) * 1991-11-07 1994-07-05 Kawasaki Steel Corporation Parallel A/D converter having comparator threshold voltages defined by MOS transistor geometries
US5280193A (en) * 1992-05-04 1994-01-18 Lin Paul T Repairable semiconductor multi-package module having individualized package bodies on a PC board substrate
US5646591A (en) * 1992-05-22 1997-07-08 Directed Electronics, Inc. Advanced method of indicating incoming threat level to an electronically secured vehicle and apparatus therefor
US5416783A (en) * 1993-08-09 1995-05-16 Motorola, Inc. Method and apparatus for generating pseudorandom numbers or for performing data compression in a data processor
US5499294A (en) * 1993-11-24 1996-03-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Digital camera with apparatus for authentication of images produced from an image file
US5406287A (en) * 1993-12-22 1995-04-11 The United States Of America As Represented By The Secretary Of The Air Force Programmable airdrop infrared decoy
US5872849A (en) * 1994-01-13 1999-02-16 Certco Llc Enhanced cryptographic system and method with key escrow feature
US5515441A (en) * 1994-05-12 1996-05-07 At&T Corp. Secure communication method and apparatus
US5539690A (en) * 1994-06-02 1996-07-23 Intel Corporation Write verify schemes for flash memory with multilevel cells
US5506905A (en) * 1994-06-10 1996-04-09 Delco Electronics Corp. Authentication method for keyless entry system
US5646660A (en) * 1994-08-09 1997-07-08 Encad, Inc. Printer ink cartridge with drive logic integrated circuit
US5500517A (en) * 1994-09-02 1996-03-19 Gemplus Card International Apparatus and method for data transfer between stand alone integrated circuit smart card terminal and remote computer of system operator
US5864695A (en) * 1994-12-28 1999-01-26 Oki Electric Industry Co., Ltd. IC card control circuit and IC card control system
US6367013B1 (en) * 1995-01-17 2002-04-02 Eoriginal Inc. System and method for electronic transmission, storage, and retrieval of authenticated electronic original documents
US5790667A (en) * 1995-01-20 1998-08-04 Matsushita Electric Industrial Co., Ltd. Personal authentication method
US5757918A (en) * 1995-01-20 1998-05-26 Tandem Computers Incorporated Method and apparatus for user and security device authentication
US5619399A (en) * 1995-02-16 1997-04-08 Micromodule Systems, Inc. Multiple chip module mounting assembly and computer using same
US5923759A (en) * 1995-04-20 1999-07-13 Lee; Philip S. System for securely exchanging data with smart cards
US5619571A (en) * 1995-06-01 1997-04-08 Sandstrom; Brent B. Method for securely storing electronic records
US5673223A (en) * 1995-06-09 1997-09-30 Samsung Electronics Co., Ltd. Nonvolatile semiconductor memory device with multiple word line voltage generators
US5629642A (en) * 1995-08-18 1997-05-13 Mitsubishi Denki Kabushiki Kaisha Power supply monitor
US6028937A (en) * 1995-10-09 2000-02-22 Matsushita Electric Industrial Co., Ltd Communication device which performs two-way encryption authentication in challenge response format
US5633932A (en) * 1995-12-19 1997-05-27 Intel Corporation Apparatus and method for preventing disclosure through user-authentication at a printing node
US5673316A (en) * 1996-03-29 1997-09-30 International Business Machines Corporation Creation and distribution of cryptographic envelope
US5778069A (en) * 1996-04-10 1998-07-07 Microsoft Corporation Non-biased pseudo random number generator
US6088450A (en) * 1996-04-17 2000-07-11 Intel Corporation Authentication system based on periodic challenge/response protocol
US5787367A (en) * 1996-07-03 1998-07-28 Chrysler Corporation Flash reprogramming security for vehicle computer
US5943123A (en) * 1996-07-25 1999-08-24 Anritsu Corporation Optical fiber monitor using optical time domain reflectometer and monitoring method
US5802178A (en) * 1996-07-30 1998-09-01 Itt Industries, Inc. Stand alone device for providing security within computer networks
US5872847A (en) * 1996-07-30 1999-02-16 Itt Industries, Inc. Using trusted associations to establish trust in a computer network
US5804975A (en) * 1996-09-18 1998-09-08 Lucent Technologies Inc. Detecting breakdown in dielectric layers
US5810146A (en) * 1996-10-31 1998-09-22 Authentication Technologies, Inc. Wide edge lead currency thread detection system
US6552563B1 (en) * 1996-11-14 2003-04-22 Si Diamond Technology, Inc. Display panel test device
US5768369A (en) * 1996-11-20 1998-06-16 Harris Corporation Telephone test set keypad with integrated dynamic microphone
US5757388A (en) * 1996-12-16 1998-05-26 Eastman Kodak Company Electronic camera and integral ink jet printer
US6049880A (en) * 1996-12-19 2000-04-11 Samsung Electronics Co., Ltd. Computer display monitor apparatus and method for controlling power thereof
US6192473B1 (en) * 1996-12-24 2001-02-20 Pitney Bowes Inc. System and method for mutual authentication and secure communications between a postage security device and a meter server
US6099408A (en) * 1996-12-31 2000-08-08 Walker Digital, Llc Method and apparatus for securing electronic games
US6088802A (en) * 1997-06-04 2000-07-11 Spyrus, Inc. Peripheral device with integrated security functionality
US7702926B2 (en) * 1997-07-15 2010-04-20 Silverbrook Research Pty Ltd Decoy device in an integrated circuit
US7716098B2 (en) * 1997-07-15 2010-05-11 Silverbrook Research Pty Ltd. Method and apparatus for reducing optical emissions in an integrated circuit
US20090043708A9 (en) * 1997-07-15 2009-02-12 Kia Silverbrook Method and apparatus for reducing optical emissions in an integrated circuit
US7346586B1 (en) * 1997-07-15 2008-03-18 Silverbrook Research Pty Ltd Validation protocol and system
US6442525B1 (en) * 1997-07-15 2002-08-27 Silverbrook Res Pty Ltd System for authenticating physical objects
US7246098B1 (en) * 1997-07-15 2007-07-17 Silverbrook Research Pty Ltd Consumable authentication protocol and system
US20100213878A1 (en) * 1997-07-15 2010-08-26 Silverbrook Research Pty Ltd Method and apparatus for reducing optical emissions in an integrated circuit
US7657488B2 (en) * 1997-07-15 2010-02-02 Silverbrook Research Pty Ltd Validating apparatus having encryption integrated circuits
US6217165B1 (en) * 1997-07-15 2001-04-17 Silverbrook Research Pty. Ltd. Ink and media cartridge with axial ink chambers
US20090126030A1 (en) * 1997-07-15 2009-05-14 Silverbrook Research Pty Ltd Tamper detection line circuitry for use in authenticating an integrated circuit
US20050038755A1 (en) * 1997-07-15 2005-02-17 Kia Silverbrook Method and apparatus for reducing optical emissions in an integrated circuit
US6442276B1 (en) * 1997-07-21 2002-08-27 Assure Systems, Inc. Verification of authenticity of goods by use of random numbers
US6246970B1 (en) * 1998-07-10 2001-06-12 Silverbrook Research Pty Ltd Method for making a chip tamper-resistant
US6359479B1 (en) * 1998-08-04 2002-03-19 Juniper Networks, Inc. Synchronizing data transfers between two distinct clock domains
US6442690B1 (en) * 1998-10-23 2002-08-27 L3-Communications Corporation Apparatus and methods for managing key material in heterogeneous cryptographic assets
US6389533B1 (en) * 1999-02-05 2002-05-14 Intel Corporation Anonymity server
US6529487B1 (en) * 1999-07-09 2003-03-04 Qualcomm Incorporated Method and apparatus for securely transmitting distributed RAND for use in mobile station authentication
US20030041244A1 (en) * 2000-04-28 2003-02-27 Levente Buttyan Method for securing communications between a terminal and an additional user equipment
US6938156B2 (en) * 2000-08-04 2005-08-30 First Data Corporation ABDS system and verification status for authenticating entity access
US6789189B2 (en) * 2000-08-04 2004-09-07 First Data Corporation Managing account database in ABDS system

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090138716A1 (en) * 2006-03-29 2009-05-28 Agnes Leclercq Method for transmitting and receiving data, in particular for secure exchanges between an aircraft and a ground base, related devices and aircraft equipped with such devices
US8572390B2 (en) * 2006-03-29 2013-10-29 Airbus Operations S.A.S. Method for transmitting and receiving data, in particular for secure exchanges between an aircraft and a ground base, related devices and aircraft equipped with such devices
US20110225432A1 (en) * 2010-03-12 2011-09-15 Stmicroelectronics (Rousset) Sas Method and circuitry for detecting a fault attack
US8489897B2 (en) * 2010-03-12 2013-07-16 Stmicroelectronics (Rousset) Sas Method and circuitry for detecting a fault attack
US10205596B2 (en) 2013-07-31 2019-02-12 Hewlett-Packard Development Company, L.P. Authenticating a consumable product based on a remaining life value
US20150288518A1 (en) * 2014-04-08 2015-10-08 Karl P.W. Wiegand Algorithm-agnostic approach for systematically hardening encryption
WO2021127137A1 (en) * 2019-12-19 2021-06-24 Bae Systems Information And Electronic Systems Integration Inc. Externally powered cold key load
US11221666B2 (en) * 2019-12-19 2022-01-11 Bae Systems Information And Electronic Systems Integration Inc. Externally powered cold key load
GB2605168A (en) * 2021-03-24 2022-09-28 Cirrus Logic Int Semiconductor Ltd An integrated circuit having a secure area
GB2605168B (en) * 2021-03-24 2023-03-29 Cirrus Logic Int Semiconductor Ltd An integrated circuit having a secure area

Also Published As

Publication number Publication date
US7210038B2 (en) 2007-04-24
US20050010778A1 (en) 2005-01-13
US20050066168A1 (en) 2005-03-24
US6745331B1 (en) 2004-06-01
US20040172532A1 (en) 2004-09-02
US7194629B2 (en) 2007-03-20
US7401223B2 (en) 2008-07-15
US20090013178A9 (en) 2009-01-08
US20100031064A1 (en) 2010-02-04
US6816968B1 (en) 2004-11-09
US20040187000A1 (en) 2004-09-23

Similar Documents

Publication Publication Date Title
US7454617B2 (en) Apparatus for validating the presence of an authorized accessory
US7747541B2 (en) Validating apparatus for use with a pair of integrated circuits
US7194629B2 (en) Apparatus for authenticating memory space of an authorized accessory
US7991699B2 (en) Tamper detection line circuitry for use in authenticating an integrated circuit
US7657488B2 (en) Validating apparatus having encryption integrated circuits
US7962767B2 (en) Integrated circuit having obscured state change circuitry
US7093139B2 (en) Unauthorized modification of values stored in flash memory
US7197642B2 (en) Consumable authentication protocol and system
EP1260054B1 (en) Validation protocol and system
US20100250971A1 (en) Printer consumable comprising integrated circuit protected from power supply attacks
US20100213878A1 (en) Method and apparatus for reducing optical emissions in an integrated circuit
US7346586B1 (en) Validation protocol and system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SILVERBROOK RESEARCH PTY LTD, AUSTRALIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SILVERBROOK, KIA, MR;REEL/FRAME:026145/0772

Effective date: 20110418

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE