WO2017123631A1 - A privacy-preserving, mutual puf-based authentication protocol - Google Patents
- Publication number
- WO2017123631A1 (PCT/US2017/013013)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- bitstrings
- token
- privacy
- path
- authentication
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/44—Program or device authentication
- G06F21/445—Program or device authentication by mutual authentication, e.g. between devices or programs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/34—User authentication involving the use of external additional devices, e.g. dongles or smart cards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/44—Program or device authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/71—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
- G06F21/73—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information by creating or determining hardware identification, e.g. serial numbers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09C—CIPHERING OR DECIPHERING APPARATUS FOR CRYPTOGRAPHIC OR OTHER PURPOSES INVOLVING THE NEED FOR SECRECY
- G09C1/00—Apparatus or methods whereby a given sequence of signs, e.g. an intelligible text, is transformed into an unintelligible sequence of signs by transposing the signs or groups of signs or by replacing them by others according to a predetermined system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/06—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols the encryption apparatus using shift registers or memories for block-wise or stream coding, e.g. DES systems or RC4; Hash functions; Pseudorandom sequence generators
- H04L9/065—Encryption by serially and continuously modifying data stream elements, e.g. stream cipher systems, RC4, SEAL or A5/3
- H04L9/0656—Pseudorandom key sequence combined element-for-element with data sequence, e.g. one-time-pad [OTP] or Vernam's cipher
- H04L9/0662—Pseudorandom key sequence combined element-for-element with data sequence, e.g. one-time-pad [OTP] or Vernam's cipher with particular pseudorandom sequence generator
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/08—Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
- H04L9/0861—Generation of secret information including derivation or calculation of cryptographic keys or passwords
- H04L9/0866—Generation of secret information including derivation or calculation of cryptographic keys or passwords involving user or device identifiers, e.g. serial number, physical or biometrical information, DNA, hand-signature or measurable physical characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3234—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials involving additional secure or trusted devices, e.g. TPM, smartcard, USB or software token
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3271—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using challenge-response
- H04L9/3278—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using challenge-response using physically unclonable functions [PUF]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F7/00—Methods or arrangements for processing data by operating upon the order or content of the data handled
- G06F7/58—Random or pseudo-random number generators
- G06F7/588—Random number generators, i.e. based on natural stochastic processes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2209/00—Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
- H04L2209/08—Randomization, e.g. dummy operations or using noise
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2209/00—Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
- H04L2209/12—Details relating to cryptographic hardware or logic circuitry
Definitions
- the invention relates to authentication protocols for a Physically Unclonable Function ("PUF") including a Hardware-embedded Delay PUF ("HELP") such as that disclosed in International Patent Application PCT/US14/053276 filed August 28, 2014, and International Patent Application PCT/US15/065909 filed December 15, 2015, each incorporated by reference.
- the invention relates to authentication protocols that provide both privacy and mutual identification.
- an integrated circuit ("IC"), also known as a chip or a microchip, is a miniaturized electronic circuit used in electronic equipment such as computer, telephone, and digital applications.
- an IC is typically manufactured on a thin substrate of semiconductor material, such as silicon or germanium, and is formed of semiconductor devices as well as passive components such as capacitors, resistors, and diodes.
- the per-transistor cost of manufacturing ICs has decreased.
- ICs must be protected against threats such as cloning and copying, as well as misappropriation and unauthorized use.
- Threats may allow unauthorized access to encrypted data, replication of IC designs including unauthorized use of intellectual property ("IP"), and hardware piracy, i.e., the illegal manufacturing of ICs. Threats of cloning, misappropriation and unauthorized use of a security key are a problem, particularly in computer applications that use a security key in authentication protocols.
- Security keys define the basis of computer-based hardware security mechanisms implemented at high levels of hardware security such as those mechanisms that perform encryption of data communication channels, or provide IP theft protection in computer-based logic devices including Field-Programmable Gate Arrays ("FPGAs").
- Random bitstrings may form the basis for encryption, identification, authentication, and feature activation in hardware security.
- keying material for encryption may be stored as digital bitstrings in non-volatile memory on FPGAs and Application-Specific Integrated Circuits ("ASICs").
- secrets stored this way may not be secure against a determined adversary, who can use probing attacks to steal the secret.
- Physical Unclonable Functions ("PUFs") may be used as an alternative to storing digital bitstrings in non-volatile memory.
- a PUF refers to an IC hardware primitive that leverages entropy introduced by manufacturing variations to produce bitstrings, and may incorporate an on-chip infrastructure for measuring and digitizing the corresponding variations. PUFs may measure and digitize the natural variations that occur in path delays, leakage current, or static random access memory (“SRAM”) power-up patterns, to produce a random bitstring.
- Challenge-based IC authentication is one example.
- a secret key is embedded in the IC that enables the IC to generate a unique response to a challenge, which is valid only for that challenge.
- the key remains secret and the mechanism performing authentication is resistant to spoofing.
- Remote activation schemes are another example. Remote activation schemes enable IC designers to lock each IC at startup and then enable it remotely, providing intellectual property protection and hardware metering. States are added to the finite state machine ("FSM") of a design, and control signals are added which are a function of the secret key. Therefore, the hardware locks up until receipt of a specific activation code.
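The locking mechanism described above can be sketched in software. This is a minimal toy model, not the patent's design: the `LockedDevice` class, the code-derivation arithmetic, and the state names are illustrative assumptions.

```python
class LockedDevice:
    """Toy model of a remote-activation lock: the device powers up into a
    LOCKED state added to its FSM and only enters its functional states
    after receiving the activation code derived from the secret key."""

    def __init__(self, secret_key):
        # Illustrative code derivation; a real design would derive the
        # activation code from the secret key via the added control signals.
        self._code = (secret_key * 0x9E37) & 0xFFFF
        self.state = "LOCKED"

    def activate(self, code):
        if self.state == "LOCKED" and code == self._code:
            self.state = "RUNNING"
        return self.state

dev = LockedDevice(secret_key=0xBEEF)
print(dev.activate(0x0000))                      # wrong code: stays "LOCKED"
print(dev.activate((0xBEEF * 0x9E37) & 0xFFFF))  # correct code: "RUNNING"
```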
- PUF implementations include mismatched delay-lines, SRAM power-on patterns, metal-oxide semiconductor ("MOS") device mismatches and input-dependent leakage patterns.
- a PUF extracts entropy (randomness) from variations in the physical and electrical properties of ICs, which are unique to each IC, as a means of generating digital secrets (bitstrings).
- the bitstrings can serve the role of uniquely identifying the hardware tokens for authentication applications.
- the bitstrings are generated on-the-fly, thereby eliminating the need to store digital copies of them in non-volatile memory ("NVM"), and are (ideally) reproducible under a range of environmental variations.
- the ability to control the precise generation time of the secret bitstring and the sensitivity of the PUF entropy source to invasive probing attacks (which act to invalidate it) are additional attributes that make them attractive for authentication in embedded hardware including resource-constrained hardware tokens.
- PUFs may be classified as a “strong PUF” or a “weak PUF”.
- Strong PUFs may reduce area and energy overheads by reducing the number and type of cryptographic primitives and operations.
- a strong PUF is capable of producing a large, unique set of bits per device, and has additional challenges that relate to machine learning attacks, protocol attacks and constraints on device resources.
- weak PUF architectures require the insertion of a dedicated array of identically-designed test structures to serve as the entropy source, and area overhead restricts the physical size of that entropy source in a "weak PUF".
- "weak PUFs” can be used for authentication, they require the insertion of obfuscation functions, e.g., cryptographic hash, encryption and XOR functions, to protect their limited amount entropy against adversarial interface attacks designed to machine leam the secrets.
- what is known as the "arbiter PUF" is traditionally regarded as the first strong PUF because it can be configured to produce 2^n responses.
- the arbiter PUF is vulnerable to model-building attacks since only a small number of gates define the paths.
- an arbiter PUF is typically configured with as few as 256 logic gates, making it susceptible to machine learning ("ML") attacks.
- the simplest form of a PUF-based authentication protocol is carried out in two phases: enrollment and authentication.
- the process of preparing a hardware token for authentication operations in the field is called enrollment.
- a secure server randomly selects a small subset of challenges that are applied to the PUF to generate a corresponding set of responses.
- the challenge-response pairs ("CRPs") for each token are then recorded by the server in a secure database, which are later used for authenticating the fielded token.
- the number of stored CRPs for each token can be small because the large CRP space, along with the secrecy of the selected subset, makes it very difficult for adversaries to build a clone to impersonate the token.
- Authentication is the process between a prover - e.g., a hardware token or smart card - and a verifier - a secure server or bank - that confirms identities using corroborative evidence of one or both parties.
- this simple form of a PUF-based authentication protocol is susceptible to denial-of-service ("DOS") attacks, whereby an adversary depletes the verifier's CRPs for a token by repeatedly attempting to authenticate.
- even when DOS attacks are not attempted, the stored CRPs can be exhausted in the course of a sequence of valid authentications because the verifier must delete a CRP once it is used (to avoid replay attacks), and the verifier stores only a fixed number of CRPs for each token.
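The two-phase enrollment/authentication flow and the exhaustion problem described above can be sketched as follows. This is a toy model under stated assumptions: the `Verifier` class, the stand-in PUF function, and the 16-bit challenge width are illustrative, not the patent's protocol.

```python
import secrets

class Verifier:
    """Toy verifier for the simple CRP-table protocol."""
    def __init__(self):
        self.crp_db = {}  # token_id -> list of (challenge, response) pairs

    def enroll(self, token_id, puf, num_crps=3):
        # Enrollment: record a small random subset of CRPs in a secure database.
        self.crp_db[token_id] = [(c, puf(c)) for c in
                                 (secrets.randbits(16) for _ in range(num_crps))]

    def authenticate(self, token_id, puf):
        crps = self.crp_db.get(token_id, [])
        if not crps:
            return None  # CRPs exhausted: token can no longer authenticate
        challenge, expected = crps.pop()  # each CRP is used only once (anti-replay)
        return puf(challenge) == expected

# A stand-in "PUF": a fixed secret mapping unique to the token.
secret = secrets.randbits(64)
puf = lambda c: (c * secret) & 0xFFFF

v = Verifier()
v.enroll("token-1", puf, num_crps=3)
print([v.authenticate("token-1", puf) for _ in range(4)])
# → [True, True, True, None]: three valid authentications, then exhaustion.
```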
- Protocols have been proposed to use delay variations in functional units for authentication. However, these protocols make use of the timing values directly, and do not account for path length bias effects.
- a weakness in existing protocols relates to weaknesses in the PUF's entropy source.
- Other protocols are not lightweight; for example, a recently proposed protocol that supports privacy-preserving and mutual authentication makes use of a weak SRAM PUF and requires NVM and several cryptographic functions to be implemented on the token.
- Conventional methods of authentication which use area-heavy cryptographic primitives and nonvolatile memory (“NVM”) are less attractive for evolving embedded applications.
- the invention is directed to an end-to-end, privacy-preserving, mutual PUF-based authentication protocol that provides a truly strong PUF with cryptographic properties.
- the invention provides a PUF-based, mutual, privacy preserving authentication protocol.
- although the protocol is described and implemented using a Hardware-embedded Delay PUF ("HELP"), any Physically Unclonable Function ("PUF") is contemplated.
- the protocol does not require non-volatile memory or cryptographic primitives on the token.
- path delay information is stored on the verifier during enrollment instead of response bitstrings.
- the Hardware-embedded Delay PUF ("HELP") is a strong PUF that leverages path delay variations in a functional unit.
- HELP generates bitstrings from delay variations that exist within existing functional units and provides a large number of CRPs.
- the paths defined by the functional unit have a complex interconnection structure, requiring long runtimes of sophisticated automatic test pattern generation (ATPG) software to determine the test sequences required to test them.
- the difficulty of generating challenges for HELP adds a new dimension to the difficulty of carrying out model-building attacks because the adversary must first expend a great deal of effort to determine the challenges that enable an effective model-building strategy.
- HELP accepts 2-vector sequences as challenges and supports an exponential input challenge space, i.e., with n inputs, the number of challenges is upper bounded at 2^(2n), which indicates that any of the 2^n input vectors can be followed by any of the other 2^n input vectors.
- the 2-vector sequences are constrained to generate either rising transitions or falling transitions along the paths, but not both, which reduces the challenge space below this upper bound.
- if the response space is defined as 2^m, then m needs to be on the order of 64 or larger to meet the conditions of a strong PUF. Although combinational logic circuits can be constructed to meet this condition, the resulting size is too large for resource-constrained devices.
- path timing information, for example digitized representations of measured path delays, is stored in a database on a (secure) server, enabling efficient authentication protocols that provide both privacy and mutual authentication.
- storing path delays provides distinct advantages over storing response bitstrings by enabling multiple response bitstrings to be generated from the same set of path delays.
- a very large (exponential) set of response bitstrings may be generated using a fixed set of stored path delays on the verifier.
- the invention expands the response space of HELP by defining a set of configuration parameters.
- the combination of the 2-vector sequences and these parameters increases the CRP space to a large exponential.
- one of the configuration parameters is referred to as the Path-Select-Mask. It allows the verifier to select a specific subset of the paths, from those tested by the applied 2-vector sequences, to be used in the bitstring generation process. By itself, the Path-Select-Mask adds a combinatorial number of possibilities, on the order of (n choose k), to the size of the response space.
- n and k are typically in the range of 5000 and 2048, respectively, which corresponds to a very large value.
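The scale of these spaces can be checked with a few lines of arithmetic. The input width below is an illustrative assumption; n and k follow the text.

```python
import math

n_inputs = 8                 # illustrative input width (real units are larger)
vectors = 2 ** n_inputs      # 2^n possible input vectors
sequences = vectors ** 2     # 2^(2n) ordered 2-vector sequences

# Path-Select-Mask: choosing k of n tested paths adds C(n, k) possibilities.
n, k = 5000, 2048
masks = math.comb(n, k)
print(sequences)             # 65536 for n_inputs = 8
print(masks.bit_length())    # thousands of bits of added search space
```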
- the protocol is demonstrated in a hardware implementation of a cryptographic primitive, specifically the Advanced Encryption Standard ("AES") algorithm.
- any cryptographic hash function is contemplated, for example, Secure Hash Algorithm 3 (“SHA-3").
- the invention uses an AES data path component referred to as sbox-mixedcol as the source of entropy.
- the sbox-mixedcol is a functional unit of a 32-bit column AES that includes 4 copies of the SBOX and 1 copy of the MIXEDCOL.
- the protocol according to the invention may be demonstrated and implemented using a lighter-weight functional unit, for example, one consisting of a single AES SBOX component. More generally, the invention may be extended to hardware encryption engines as well as other types of data path components.
- data is collected from the sbox-mixedcol functional unit on 45 copies of the Xilinx Zynq 7020 FPGA; however, any number of copies, as well as other hardware such as an ASIC, is contemplated.
- the invention also provides a set of configuration parameters, including Mod.
- the invention also provides a Dual Helper Data ("DHD") algorithm for improving reliability.
- FIG. 1 is a block diagram of a functional unit ("FU") including a clock strobing method for measuring path delays according to an embodiment of the invention.
- FIG. 2 is a graph illustrating random pairings of TV compensated rising and falling PUFNum Differences (PNDc) and PNDc with an applied Modulus (“modPNDc”) according to an embodiment of the invention.
- FIG. 3 is a graph illustrating temperature-voltage compensation ("TVCOMP") of PNDc and temperature-voltage ("TV") corners according to an embodiment of the invention.
- FIG. 4 illustrates a block diagram of a Margin and Dual Helper Data (“DHD”) algorithm according to an embodiment of the invention.
- FIG. 5A illustrates a flow diagram of an enrollment operation of an authentication protocol according to an embodiment of the invention.
- FIG. 5B illustrates a flow diagram of authentication operation of an authentication protocol according to an embodiment of the invention.
- FIG. 6A illustrates a graph of actual inter-chip Hamming distance results using a Mean scaling factor according to an embodiment of the invention.
- FIG. 6B illustrates a graph of actual inter-chip Hamming distance results using a Max. scaling factor according to an embodiment of the invention.
- FIG. 7A illustrates a graph of National Institute of Standards and Technology ("NIST") statistical test results for a Margin of 2 using a Mean scaling factor according to an embodiment of the invention.
- FIG. 7B illustrates a graph of NIST statistical test results for a Margin of 3 using a Mean scaling factor according to an embodiment of the invention.
- FIG. 8A illustrates a graph of true inter-chip Hamming distance results using a Mean scaling factor according to an embodiment of the invention.
- FIG. 8B illustrates a graph of true inter-chip Hamming distance results using a Max. scaling factor according to an embodiment of the invention.
- FIG. 9A illustrates a graph of entropy results using a Mean scaling factor according to an embodiment of the invention.
- FIG. 9B illustrates a graph of entropy results using a Max. scaling factor according to an embodiment of the invention.
- FIG. 10A illustrates a graph of probability of failure results using a Mean scaling factor according to an embodiment of the invention.
- FIG. 10B illustrates a graph of probability of failure results using a Max. scaling factor according to an embodiment of the invention.
- FIG. 11A illustrates a graph of smallest bitstring size results using a Mean scaling factor according to an embodiment of the invention.
- FIG. 11B illustrates a graph of smallest bitstring size using a Max. scaling factor according to an embodiment of the invention.
- FIG. 12 illustrates a table of HELP authentication protocol area and runtime overhead.
- the source of entropy (randomness) for HELP is the manufacturing variations that occur in the delays of paths that define the functional unit. HELP measures path delays using a clock strobing technique as illustrated in FIG. 1.
- the source of entropy is represented by the functional unit, which is an existing on-chip macro that implements, e.g., components of the authentication protocol, i.e., an integer divider or a cryptographic hash function.
- a challenge for HELP consists of a 2-vector sequence and a Path-Select-Mask.
- the 'Launch Row FFs' and 'Capture Row FFs' are also components of the functional unit.
- the only modification required for the integration of HELP into the functional unit involves the use of a second clock, labeled Clk2, which drives the Capture Row FFs, and the addition of the XOR gates on the primary outputs PO[x].
- the 'Launch Row FFs' in FIG. 1 are used to apply the 2-vector sequences to the primary inputs PI[x] of the functional unit, while the 'Capture Row FFs' are used to measure the path delays at the primary outputs PO[x].
- the path delays are measured by applying a series of launch-capture clocking events (called clock strobing) using Clk1 and Clk2 as shown on the left side of FIG. 1.
- the first vector of the sequence represents the initialization vector.
- the application of the second vector generates a set of transitions which are timed by the clock strobing technique.
- the clock strobing technique requires the repeated application of the 2-vector sequence. For each repeated application of this 2-vector test sequence, the phase shift between Clk1 and Clk2 is increased by a small fixed Δt.
- the phase shift value between the two clocks is digitally controlled, and is referred to as the launch-capture interval ("LCI").
- LCI launch-capture interval
- the digital timing values for a large number of paths can be obtained by repeating the clock strobing operation for multiple 2-vector test sequences.
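The strobing loop described above can be sketched in software. The following is an illustrative model only (the function name, the Δt granularity, and the LCI range are assumptions, not the patent's implementation): the launch-capture interval grows in fixed steps until the capture FF latches the transition, and the first successful LCI becomes the digitized timing value for the path.

```python
# Hypothetical sketch of the clock-strobing measurement loop. The "true"
# path delay is unknown to the tester; the loop finds the smallest
# launch-capture interval (LCI) at which the transition is captured.

def measure_pn(true_delay, delta_t=1.0, max_lci=1024):
    """Return the first LCI (in delta_t units) that captures the transition."""
    for lci in range(1, max_lci + 1):
        # Capture succeeds once the phase shift meets or exceeds the path delay.
        captured = lci * delta_t >= true_delay
        if captured:
            return lci
    return None  # path did not produce a transition within the tested range

# A path with delay 7.3 ns, strobed in 1.0 ns steps, digitizes to PN = 8.
pn = measure_pn(7.3)
```

Repeating this loop over many 2-vector sequences yields the set of PNs referred to in the text.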
- the LCI path timing value is referred to as a "PUFNum" or "PN".
- PNDiff: the signed difference of two randomly selected PNs is referred to as a "PNDiff" or "PND".
- HELP constructs PND by pairing each of the rising PNs with a falling PN using two linear-feedback shift registers ("LFSR").
- the LFSRs are initialized with a pair of configuration parameters referred to as "LFSR seeds”.
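The LFSR-driven pairing can be sketched as follows. This is a hedged illustration: the 11-bit Fibonacci LFSR with taps at bits 11 and 9, the helper function names, and the sample PN values are assumptions made for the example, not parameters taken from the patent.

```python
# Illustrative sketch: pair each rising-edge PN with an LFSR-selected
# falling-edge PN and take the signed difference (the PND).

def lfsr11(seed):
    """Yield an endless stream of 11-bit LFSR states (taps 11 and 9)."""
    state = seed & 0x7FF
    while True:
        bit = ((state >> 10) ^ (state >> 8)) & 1
        state = ((state << 1) | bit) & 0x7FF
        yield state

def make_pnds(rise_pns, fall_pns, seed):
    """Pair each rising PN with an LFSR-chosen falling PN; return signed diffs."""
    sel = lfsr11(seed)
    return [r - fall_pns[next(sel) % len(fall_pns)] for r in rise_pns]

rise = [120, 95, 140, 88]
fall = [100, 130, 110, 90]
pnds = make_pnds(rise, fall, seed=0x5A5)
```

Because the seed determines the pairings, different LFSR seeds yield different PND sets from the same measured PNs, which is the parameter diversity the protocol later exploits.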
- the authentication protocol according to the invention requires HELP to generate nonces in addition to the PNs.
- the VHDL module responsible for implementing the PN timing engine generates nonces in parallel with PN generation by leveraging the meta-stability characteristics that exist in a subset of the tested paths. Meta-stability is determined for a path by repeatedly measuring it and then analyzing the variations in the fractional component of the computed average. Those paths that produce two consecutive PN values with nearly equal frequencies are used as a source of true random numbers ("TRNG"). It should be noted that the nonces generated in this fashion pass all of the National Institute of Standards and Technology ("NIST") statistical tests.
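The meta-stability screen can be sketched in software. The tolerance threshold below is illustrative, as is the function name; the idea is only that a path whose repeated measurements split roughly 50/50 between two adjacent PN values has a fractional mean near 0.5.

```python
# Hedged sketch of the meta-stability screen: a path whose PN samples
# split roughly evenly between two adjacent values is a candidate
# true-random bit source. The tolerance is an illustrative parameter.

def is_metastable(samples, tol=0.1):
    """True if the fractional part of the mean is near 0.5, i.e. the two
    adjacent PN values occur with nearly equal frequency."""
    mean = sum(samples) / len(samples)
    frac = mean - int(mean)
    return abs(frac - 0.5) <= tol

stable_path = [57, 57, 57, 57, 57, 57, 57, 58]   # almost always 57
noisy_path = [57, 58, 57, 58, 58, 57, 57, 58]    # ~50/50 split
```

Only paths flagged this way would feed the TRNG; stable paths are used for bitstring generation instead.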
- NIST National Institute of Standards and Technology
- Xilinx includes this phase shift capability even on their lowest cost FPGAs.
- this phase shift capability can be implemented with a small area overhead using a multi-tapped delay chain.
- the reliability of a PUF refers to the number of bit flip errors that occur when the bitstring is regenerated. Ideally, the bitstrings are precisely reproduced during regeneration but this is rarely possible with PUFs.
- the largest source of 'noise' that causes bit flip errors for PUFs is a change in temperature and/or supply voltage (TV noise).
- TVCOMP TV compensation
- zvali represents a standardized PND after subtracting a mean μtoken and dividing by a range Rngtoken, with μtoken and Rngtoken derived from the distribution of all PND obtained during regeneration under potentially adverse environmental conditions, referred to as TV corners.
- the individual zvali are then transformed to a set of PNDc (with 'c' for compensated) using two additional configuration parameters, μref and Rngref (ref is for reference). This linear transformation translates the token's measured PND distribution onto the reference distribution.
- the bitstring generation process uses the signed PNDc as a means of both hardening the algorithm against model building and increasing the diversity in the generated bitstrings.
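The two-step TVCOMP transformation can be sketched directly from the description of Equations (1) and (2). This is a minimal sketch: taking Rngtoken as max minus min of the measured PNDs is an assumption for illustration, and the μref/Rngref values are placeholders.

```python
# Sketch of TVCOMP: standardize each PND against the measured
# distribution (Equation (1)), then rescale to the reference
# parameters (Equation (2)).

def tvcomp(pnds, mu_ref, rng_ref):
    mu_tok = sum(pnds) / len(pnds)
    rng_tok = max(pnds) - min(pnds)
    zvals = [(p - mu_tok) / rng_tok for p in pnds]   # Equation (1)
    return [z * rng_ref + mu_ref for z in zvals]     # Equation (2)

pndc = tvcomp([12.0, -3.0, 5.0, 20.0, -8.0], mu_ref=0.0, rng_ref=100.0)
```

Note that the output distribution depends on μtoken and Rngtoken of the whole measured set, which is the "distribution effect" the security analysis later relies on.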
- a "mod- PNDc” is defined by applying a Modulus to the PNDc.
- the Modulus is a fifth configuration parameter to the HELP algorithm (adding to the μref, Rngref and two LFSR seed parameters).
- the modulus is necessary because the paths in the functional unit vary in length and this path length bias is captured in the PNDc.
- the modulus reduces the bias while fully preserving the within-die delay variations, i.e., the most important source of randomness.
- FIG. 2 shows a sample set of PNDc (18) computed from pseudo-random pairings of PN measured from chip Ci. Each PNDc is measured 16 times under different TV conditions. One curve line connects the data points obtained under enrollment conditions (25°C, 1.00V) while the remaining curve lines connect data points under a set of regeneration TV corners, for example, all combinations of temperatures -40°C, 0°C, 25°C, 85°C, 100°C with supply voltages 0.95V, 1.00V and 1.05V.
- the top of FIG. 2 illustrates the modPNDc values after a Modulus of 20 is applied. The modPNDc is used in the HELP bitstring generation process described below.
- a noise-reduction technique may be used to further reduce the uncompensated TV noise ("UC-TVNoise") that remains after TVCOMP.
- FIG. 3 provides a graph of a PNDc obtained from a set of 45 chips to illustrate the concept.
- the line-connected points in each curve are generated by the same chip and represent the value of the PNDc measured in the 16 TV corner experiments after they have been TVCOMP'ed.
- the UC-TVNoise referred to earlier that remains after TVCOMP is annotated on the bottom-most curve.
- within-die variations (“WID") are represented by the vertical extension of the individual curves, which is also annotated in FIG. 3.
- the magnitude of WID for this PNDc is approx. 11 LCIs.
- a Margin technique is used to improve reliability.
- the Margin technique identifies modPNDc that have the highest probability of introducing bit flip errors.
- the modPNDc data shown along the top of FIG. 2 is replicated and enlarged as shown by "(a)" in FIG. 4.
- the region defined by the Modulus is split into two halves, with the lower half used as the '0' region (between 0 and 9 in "(a)" of FIG. 4) and the upper half as the '1' region.
- Margining Without Margining, bit flips would occur at modPNDc indexes 4, 6, 7, 8, 10 and 14 because some of the values in the groups of PNDc data points from the 16 TV corner experiments cross over the 0-1 lines at 9-10 and 19-0.
- the Margin technique avoids these bit flip errors by creating weak and strong classes for the bits associated with the modPNDc.
- the bit associated with a modPNDc is classified as weak if the modPNDc falls within a margin around the 0-1 boundaries, and is classified as a strong bit otherwise.
- the margin is ideally set to the worst-case UC-TVNoise level for the best results, but can be tuned to attain a specific probability of failure in the authentication protocol discussed further below.
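The Margining classification can be sketched as follows, assuming the default Modulus of 20 and Margin of 2 from the example in FIG. 4 (the function name and the exact boundary convention are illustrative): the bit value comes from which half of the modulus range the modPNDc lands in, and positions within a margin of the 0-1 boundaries (at 0 and Modulus/2) are weak.

```python
# Illustrative strong/weak classification from the Margin technique.

def classify(pndc, modulus=20, margin=2):
    m = pndc % modulus
    half = modulus // 2
    bit = 0 if m < half else 1                       # lower half -> '0' region
    # weak if within 'margin' of a 0-1 boundary (0, half, or modulus)
    weak = (m % half) < margin or (m % half) >= half - margin
    return bit, ('weak' if weak else 'strong')

labels = [classify(v) for v in (4, 9, 13, 19)]
```

With Modulus 20 and Margin 2 this yields four weak regions of size 2 around the boundaries, consistent with the minimum-Modulus bound of 4*Margin + 2 discussed later.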
- a Dual Helper Data (“DHD”) algorithm is proposed as a means of further reducing bit flip errors.
- the helper data (“HelpD”) and response bitstrings (“RespBS”) for the hardware token are shown by “(b)” in FIG. 4, while “(c)” in FIG. 4 shows HelpD and RespBS for the verifier.
- the values are derived using the token and verifier highlighted data points from the modPNDc shown in "(a)” in FIG. 4.
- Authentication in the field makes use of data stored earlier during enrollment in the Verifier Database.
- the following operations are carried out to generate the Token and Verifier StrongBS.
- the token generates helper data ("Token HelpD") using the Margining technique to produce the Token StrongBS, which are both transmitted to the verifier.
- Token HelpD helper data
- the verifier computes helper data ("Verifier HelpD"), and then bitwise AND's it with the received Token HelpD.
- the verifier constructs the Verifier StrongBS using the AND'ed HelpD while simultaneously eliminating strong bits from the Token's StrongBS that correspond to Token HelpD bits that were changed from '1' to '0' during the AND operation (3 bits are eliminated in this example as shown along the bottom of "(c)" in FIG. 4).
- when a bit is classified as strong by one party but weak by the other, the AND operation eliminates it.
- the smaller margins used with the DHD scheme allow the Modulus to be reduced, which in turn, allows better access to within-die variations.
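The DHD combination can be sketched as below. This is a simplified illustration, not the patent's implementation: helper bits are '1' for strong positions, and both sides keep only positions strong in both helper strings, so the two StrongBS end up the same length. The sample bit patterns are made up for the example.

```python
# Sketch of the Dual Helper Data (DHD) combination.

def dhd_combine(token_help, verifier_help, token_bits, verifier_bits):
    anded = [t & v for t, v in zip(token_help, verifier_help)]
    tok_strong = [b for b, h in zip(token_bits, anded) if h]
    ver_strong = [b for b, h in zip(verifier_bits, anded) if h]
    return anded, tok_strong, ver_strong

t_help = [1, 1, 0, 1, 1, 0, 1]
v_help = [1, 0, 1, 1, 1, 0, 1]
t_bits = [0, 1, 1, 0, 1, 0, 1]
v_bits = [0, 0, 1, 0, 1, 1, 1]
anded, tok, ver = dhd_combine(t_help, v_help, t_bits, v_bits)
```

Positions 1 and 2, strong on only one side, are dropped by the AND, which is how the scheme avoids bit flips without error correction.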
- PNs path delay information
- the PNs are stored on the verifier instead of response bitstrings.
- the PNs can each be represented as a 15-bit value (which provides a range of +/-1024 with 4 bits of fixed-point precision).
- the protocol employs several parameters, including a Modulus (also referred to as Mod), a μref and Rngref from Equations (1) and (2), a pair of LFSR Seeds (S), a Margin, and a Path-Select-Mask to allow multiple response bitstrings to be generated from a fixed set of PNs.
- the verifier specifies a set of paths in the Path-Select-Mask and encodes offsets in the unused bits to improve entropy as above.
- a challenge is defined as a 2-vector sequence plus a Path-Select-Mask
- a one-time interface (implemented on the FPGA as a special programming bitstring) is used during enrollment to allow the token to transfer PNs to the verifier.
- the protocol separates token identification (ID Phase) from authentication (Authen Phase) to support the privacy preserving component.
- the protocol does not require any cryptographic primitives nor non-volatile memory (NVM) on the token.
- the enrollment operation is graphically illustrated in FIG 5A.
- automatic test pattern generation (“ATPG") is used to select a set of test vector sequences, ⁇ ck ⁇ , used as a common set of challenges for all tokens in the ID Phase.
- the number of vectors depends on the security requirements regarding privacy.
- the common challenges are transmitted to the token in a secure environment during enrollment and applied to the functional unit's PIs.
- the token-generated PNs are transmitted to the verifier, annotated as {PNi} in FIG. 5A.
- the verifier generates an internal identifier IDi for each token using VerifierGenID() and stores the set {PNi} under IDi in the secure database.
- a similar process is carried out during the Authen Phase of enrollment except that a distinct set of ATPG-generated challenges are selected using SelectATPG(IDi) for each token.
- the number of hazard-free testable paths in typical functional units can be very large, making it possible to create minimally overlapping sets for each token (some overlap is desirable for privacy reasons as discussed below).
- Note that the task of generating 2-vector sequences for all paths is likely to be computationally infeasible for even moderately sized functional units.
- ATPG it is feasible and practical to use ATPG to target random subsets of paths for the enrollment requirements.
- the set of PNs, {PNij}, as generated in the Authen Phase are also stored, along with the challenge vectors that are used, in the secure database under IDi.
- Phase 1 is token identification (“ID Phase)
- Phase 2 is verifier authentication ("Mutual Phase")
- Phase 3 is token authentication (“Authen Phase”).
- ID Phase token identification
- Authen Phase token authentication
- the token initiates the process by transmitting a 'req. to authen.' signal to the verifier.
- the verifier generates a nonce n and transmits it to the token, along with a selected set of challenges {ci}. It should be noted that the transmitted challenges are typically a subset of those used during enrollment.
- the token generates a nonce m and transmits it to the verifier. This prevents the adversary from constructing m as a means of carrying out a systematic attack.
- SelParam constructs the parameters Mod, μref, Rngref, S and Margin from the two nonces.
- the two LFSR Seed parameters 5 can be derived directly from a bit-field in m.
- the remaining parameters are derived using a table lookup operation as a means of constraining them to specific ranges. For example, Mod is lower-bounded by the Margin and is constrained to be an even number less than 30; the other parameters are constrained in a similar fashion.
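A SelParam sketch, under stated assumptions: the bit-field widths, the margin table, and the way the Modulus index is drawn from the nonce are all illustrative choices, not the patent's. What the sketch preserves is the structure: LFSR seeds come directly from nonce bits, and the other parameters come from lookups constrained to legal ranges.

```python
# Hedged sketch of SelParam: derive configuration parameters from the
# XOR of the verifier and token nonces. Table contents are illustrative.

MARGINS = [2, 3]

def sel_param(n, m):
    x = n ^ m
    seed1 = x & 0x7FF                  # low 11 bits -> LFSR seed 1
    seed2 = (x >> 11) & 0x7FF          # next 11 bits -> LFSR seed 2
    margin = MARGINS[(x >> 22) & 0x1]  # table lookup for the Margin
    min_mod = 4 * margin + 2           # four weak regions + two strong bits
    span = (30 - min_mod) // 2
    mod = min_mod + 2 * ((x >> 23) % span)   # even Modulus in [min_mod, 30)
    return {'seed1': seed1, 'seed2': seed2, 'margin': margin, 'mod': mod}

params = sel_param(0x1234ABCD, 0x0F0F0F0F)
```

Because neither party controls the other's nonce, neither can steer the XOR toward a chosen parameter set.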
- bitstring generation (BitGenS) is performed on the token using the Margining process described above and shown graphically by "(b)" in FIG. 4.
- BitGenS returns both a bitstring bss' that is composed of only strong bits under the constraints of the Margin and a helper data string h'. Both bss' and h' are transmitted to the verifier.
- the verifier carries out a search process by processing each of its stored token data sets {PN}i using the same parameters.
- the DHD scheme, denoted BitGenD in FIG. 5B, is used instead.
- BitGenD bitwise-ANDs the token's helper data h' with the helper data derived for each data set (not shown), and uses the result to modify the token's bitstring bss', eliminating bits as needed (see the bottom of "(c)" in FIG. 4).
- the search terminates when a match is found or the database is exhausted.
- if no match is found, authentication terminates with failure at the end of the ID Phase. Therefore, the ID Phase also serves as a gateway that prevents an adversary from depleting a token's authentication information on the verifier in a denial-of-service attack.
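The verifier-side search can be sketched as below. This is a toy model: `toy_bitgen` stands in for the full HELP pipeline (strobing, TVCOMP, modulus, margining) and its thresholds are invented for the example; only the search structure (regenerate, combine helper data, compare, stop on match or exhaustion) follows the text.

```python
# Sketch of the verifier's ID Phase search over stored PN data sets.

def search_database(db, token_bss, token_help, bitgen):
    """db maps token_id -> stored PN list. Returns matching id or None."""
    for token_id, pns in db.items():
        ver_bss, ver_help = bitgen(pns)
        anded = [t & v for t, v in zip(token_help, ver_help)]
        tok = [b for b, h in zip(token_bss, anded) if h]
        ver = [b for b, h in zip(ver_bss, anded) if h]
        if tok and tok == ver:
            return token_id           # match: ID passed to Phases 2 and 3
    return None                       # exhausted: authentication fails

# Toy stand-in for BitGen: sign bit as response, |PND| near 0 as 'weak'.
def toy_bitgen(pns):
    bits = [1 if p > 0 else 0 for p in pns]
    helper = [1 if abs(p) > 2 else 0 for p in pns]
    return bits, helper

db = {'id_a': [5, -7, 3, -4], 'id_b': [-6, 8, -3, 5]}
tok_bits, tok_help = toy_bitgen([5, -7, 1, -4])   # noisy re-measurement of id_a
match = search_database(db, tok_bits, tok_help, toy_bitgen)
```

In the toy run the third PND of the token drifts toward zero, is marked weak, and the DHD AND removes it, so the noisy token still matches its stored entry.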
- the IDi of the matching verifier data set is passed to Phase 2, verifier authentication ("Mutual Phase"), and Phase 3, token authentication ("Authen Phase”).
- in the Mutual Phase, the same process is carried out except that the token and verifier roles are reversed and the search process is omitted. It is also contemplated that the challenges used in the ID Phase can be reused, with only SelParam run using two new nonces (n XOR m).
- the Authen Phase is similar to the ID Phase in that the token is again authenticating to the verifier, but uses a 'token specific' set of challenges (cx). Similar to the Mutual Phase, the search process is omitted. It is also contemplated that the Authen Phase can be omitted in applications that have lower security requirements, for example, RFID and home automation applications.
- token privacy is preserved in the ID Phase because, with high probability, the transmitted information bss' and h' is different from one run of the protocol to the next, given the diversity of the parameter space provided by the Mod, S, μref, Rngref and Margin parameters. This diversity is exponentially increased as discussed above through the use of the Path-Select-Mask. Moreover, by creating overlap in the challenges used by different tokens in the token authentication phase, tracking is prevented in this phase as well.
- HELP uses an error avoidance scheme and therefore, the motivating factor for previously proposed reverse fuzzy extraction schemes - for example, reducing the computing burden associated with error correction on the token - does not exist for HELP.
- the Mod, S, μref, Rngref and Margin collectively represent parameters that can be varied within limits to create distinct bitstrings from a set of measured PNs.
- This feature of the proposed authentication scheme offsets the increased overhead associated with storing multi-bit PNs on the verifier as an alternative to response bitstrings.
- this scheme depends heavily on high statistical quality among the generated StrongBS.
- This section investigates StrongBS statistical quality using the standard metrics, including intra-chip hamming distance ("HDintra"), inter-chip hamming distance ("HDinter") and the NIST statistical test tools, as measures of bitstring reproducibility, uniqueness and randomness, respectively.
- the protocol is provided in a hardware implementation of the Advanced Encryption Standard ("AES") algorithm using an AES data path component referred to as sbox-mixedcol as the source of entropy.
- AES Advanced Encryption Standard
- sbox-mixedcol is a functional unit of a 32-bit column AES that includes 4 copies of the SBOX and 1 copy of the MIXEDCOL.
- Data is collected from the sbox-mixedcol functional unit on 45 copies of the Xilinx Zynq 7020 FPGA; however, any number of copies, as well as other hardware such as an ASIC, is contemplated.
- the implementation of sbox-mixedcol requires approx. 3000 LUTs on the Xilinx Zynq 7020 FPGA and provides approx. 8 million paths.
- the protocol has also been demonstrated using a lighter-weight functional unit consisting of a single AES SBOX component that possesses approx. 600 LUTs, reducing the overall implementation size (HELP + functional unit) from approx. 6000 LUTs to less than 3000 LUTs.
- a set of 4096 PNs are collected from the 45 chips at each of 16 TV corners.
- the enrollment data stored in the verifier database is collected at 25°C, 1.00V (nominal conditions), while regeneration data is collected at all combinations of the extended industrial-grade temperature-voltage specification limits for the parts, -40°C, 0°C, 25°C, 85°C, 100°C and voltages 0.95V, 1.00V and 1.05V
- a set of low-noise, high within-die variation paths are selected using Path-Select-Masks from approx. 600 rising and 600 falling 2-vector test sequences.
- Test data is generated by applying a set of approx. 1200 challenges to test 2048 paths with rising transitions and 2048 paths with falling transitions.
- PNDs are created using LFSR-selected pairings of the 2048 rising and 2048 falling edge PNs.
- Each of the 2048 rising edge PNs can be paired with any of the 2048 falling edge PNs, yielding 4,194,304 possible combinations, however the following results are directed to a subset of 256 of these pairing combinations.
- a 2-bit offset scheme is applied to the PNDc to improve entropy.
- the verifier computes the offsets using stored enrollment data and uses them to shift the individual PNDc upwards by 0, 1/8, 1/4, or 3/8 of the range given by the applied Modulus to better center the distribution over the 0-1 lines.
- a set of Moduli between 10 and 30, in steps of size 2, and Margins of size 2 and 3, are also investigated.
- the minimum value of the Modulus is given by 4*Margin + 2 because four weak regions are required as shown by "(a)" in FIG. 4 and the two strong bit regions must be at least of size 1.
- the smallest Modulus for a Margin of size 3 is 14, so the elements in the histogram for Moduli of 10 and 12 are 0.
- the analysis also investigates two of the scaling factor combinations (Mean and Max.) for the μref and Rngref parameters given by Equations (1) and (2).
- a set of StrongBS are created by AND'ing pairs of Helper Data bitstrings as follows. First, the enrollment modPNDc is used to create a set of 45 Helper Data bitstrings for each of the 45 chips. Second, Helper Data is computed using the modPNDc collected under each regeneration corner for these 45 chips. For each chip, the enrollment Helper Data bitstring is AND'ed with the corresponding regeneration Helper Data bitstrings.
- the 45*15 AND'ed Dual Helper Data bitstrings are used to create a corresponding set of StrongBS using the method shown in "(b)" and "(c)" of FIG. 4. It should be noted that the DHD method creates variable-sized bitstrings. The smallest bitstring produced by one of the chips is used in the HDinterA analysis. The smallest bitstring sizes are analyzed and discussed below.
- HDinterA is computed using the following equation: HDinterA = ( Σi=1..NC−1 Σj=i+1..NC Σk=1..NB (BSi[k] ⊕ BSj[k]) ) / ( (NC(NC−1)/2) × NB ) × 100%, with NC the number of chips and NB the bitstring length (Equation (3)).
- Equation (3) simply sums all the bitwise differences between each of the possible pairings of chip StrongBS, and then converts the sum into a percentage by dividing by the total number of bits that were examined. HDinterA is computed in this fashion for each of the 256 seeds and averaged.
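The averaging described for Equation (3) can be sketched directly (function name and the tiny sample bitstrings are illustrative): sum the bitwise differences over every chip pairing and divide by the total number of bits compared.

```python
# Sketch of the inter-chip hamming distance average (Equation (3)).
from itertools import combinations

def hd_inter(bitstrings):
    """Average inter-chip hamming distance, as a percentage."""
    diffs = total = 0
    for a, b in combinations(bitstrings, 2):
        diffs += sum(x != y for x, y in zip(a, b))
        total += len(a)
    return 100.0 * diffs / total

bss = [[0, 1, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1]]
hd = hd_inter(bss)
```

For truly unique, unbiased bitstrings this value approaches the ideal of 50%.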
- the HDinterA results are shown in FIG. 6A and FIG. 6B for each of the Moduli and Margin combinations using Mean and Max. scaling factors for μref and Rngref.
- the heights of the bars are all very close to the ideal of 50%.
- the StrongBS referenced above are used as input to the NIST statistical test suite.
- the results using Mean scaling and only 1 of the 256 LFSR seed pairs are presented in FIG. 7A and FIG. 7B, for Margins of 2 and 3, respectively (the results for the other configuration parameters are very similar).
- the NIST test criteria classify a test category as passed if at least 42 of the 45 chips pass the test. The figure shows all bars are above the threshold line at 42, and therefore all test categories are passed. Bars of height 0 for NIST Tests 1, 2 and 3 identify Moduli that produced bitstrings with sizes less than the NIST requirement for those tests. The pass percentage when the NIST tests are applied to the bitstrings produced from all combinations of the investigated parameters is approx. 98.8%.
- FIG. 8A and FIG. 8B each illustrate a graph of true inter-chip hamming distance results using a Mean scaling factor and a Max. scaling factor according to the invention. Entropy results using a Mean scaling factor and a Max. scaling factor are shown in FIG. 9A and FIG. 9B.
- FIG. 10A and FIG. 10B each illustrate a graph of probability of failure results using a Mean scaling factor and a Max. scaling factor according to the invention.
- the smallest bitstring size results using a Mean scaling factor and a Max. scaling factor according to the invention are shown in FIG. 11A and FIG. 11B.
- HDinterT is computed as the average percentage across 990 pairings of bitstrings and 256 different pairs of LFSR seeds. However, the full-length bitstrings of length 2048 are used, and for each pairing of bitstrings, the hamming distance is computed using only bits classified as strong in both bitstrings. Under the Mean scaling factor, the HDinterT values vary from 30% to 50%, with the smallest value of 30.2% for Margin 3 and Modulus 30 as shown by FIG. 8A. For the Max. scaling, most of the HDinterT values are between 40% and 50%, with the smallest value of 38.7% as shown by FIG. 8B. These results are also very good and indicate that a 2-bit offset can be used effectively with this range of Moduli.
- entropy is computed using the strong bits from each enrollment-generated bitstring of length 2048 and the following equation: H = Σi=1..NB −( pi·log2(pi) + (1−pi)·log2(1−pi) )
- the frequency pi is computed as the fraction of '1's at each bit position i across the chips' bitstrings.
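The per-position entropy computation matching this description can be sketched as follows (the function name and sample data are illustrative; positions where pi is exactly 0 or 1 contribute zero entropy and are skipped to avoid log2(0)).

```python
# Sketch of the bitstring entropy computation: each bit position
# contributes -(p*log2(p) + (1-p)*log2(1-p)) bits, ideally 1.0 at p = 0.5.
import math

def bitstring_entropy(bitstrings):
    n_chips = len(bitstrings)
    n_bits = len(bitstrings[0])
    h = 0.0
    for i in range(n_bits):
        p = sum(bs[i] for bs in bitstrings) / n_chips
        if 0.0 < p < 1.0:
            h += -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return h  # maximum possible value is n_bits

bss = [[0, 1, 0, 1], [1, 1, 0, 0], [0, 1, 1, 1], [1, 1, 1, 0]]
h = bitstring_entropy(bss)
```

In the sample, position 1 is constant across chips and contributes nothing, while the other three positions are perfectly balanced and contribute 1 bit each.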
- the Probability of Failure is reported as an exponent x from 10^−x, with a value of 6 indicating 1 chance in 1 million.
- the HDintra is computed by pairing the enrollment StrongBS for each chip against each of the 15 regeneration StrongBS under the DHD scheme and then counting the differences (bit flips) across all combinations of the 15 DHD-generated bitstrings. The numbers of bit flips for all chips are summed and divided by the total number of bits inspected. An average HDintra is then computed using this process across a set of 256 LFSR seed pairs, which is then converted into an exponent representing the Probability of Failure. The results show that the Probability of Failure varies between 10^−2 and 10^−4, with the largest (worst case) value at 10^−2.4. Therefore, less than 1% of the bits for any authentication differ between the token and verifier under worst-case environmental conditions.
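The conversion from the average bit-flip fraction to the reported exponent can be sketched in one line (the function name is illustrative): HDintra = 10^−x implies x = −log10(HDintra), so a worst-case exponent of 2.4 corresponds to roughly 0.4% of bits flipping.

```python
# Convert an average intra-chip hamming distance (fraction of bit flips)
# into the Probability of Failure exponent x, where HDintra = 10^-x.
import math

def failure_exponent(hd_intra_fraction):
    return -math.log10(hd_intra_fraction)

x = failure_exponent(0.004)   # 0.4% of bits flip
```

This is consistent with the text's statement that a worst case of 10^−2.4 means less than 1% of bits differ between token and verifier.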
- FIG. 12 gives the resource utilization and runtime overhead associated with the ID Phase and Mutual Phase of the protocol.
- the table shown in FIG. 12 lists the resources in the order in which they are used by the authentication protocol, with '-' indicating repeated use of resources previously listed. The totals at the bottom indicate that area overhead is 6038 LUTs and 1724 FFs while the runtime is approx. 1.25 seconds.
- An alternative, lighter-weight implementation which uses only a single AES SBOX component yields an area overhead of 2909 LUTs and 952 FFs and a runtime of approx. 2.2 seconds.
- the implementation of HELP also requires an 18-bit multiplier and an on-chip BRAM memory of size 7.5 KBytes.
- the Xilinx IP blocks used in the implementation include an MMCM and a dual-channel (64-bit) AXI-GPIO for implementing communication between the processor and programmable logic components of the Zynq 7020 FPGA.
- the runtime is measured using an 8-core 3.4 GHz Intel i7 desktop computer as the verifier.
- the authentication time of 1.25 seconds includes network transmissions between the token and verifier.
- the exhaustive search carried out on the verifier takes approx. 300 microseconds per entry in the database.
- the runtime reported uses a database with only a single entry. Therefore, applications that incorporate a relatively small number of tokens (10K or less) require a search time of approx. 1.5 seconds on average, and a total authentication time of approx. 2.75 seconds.
- the response space refers to the number of bitstrings that each token can generate using the six user-defined parameters described above.
- the security analysis assumes the verifier securely stores the token's timing information that is collected during enrollment, encrypting it if necessary.
- the size of the challenge space is 2(3^n − 2^n) 2-vector sequences, with n the number of functional unit inputs, and the number of response bitstrings is approx. 7 billion excluding the diversity introduced by the Path-Select-Mask.
- the (n XOR m) operation used in the protocol does not allow direct control over these configuration parameters.
- the Path- Select-Mask increases the number of possible response bitstrings exponentially by changing the set of PNs used in the bitstring generation process.
- the PNs selected by the Path-Select- Mask change the characteristics of the PND distribution, which in turn impacts how each PND is transformed through the TVCOMP process (the TVCOMP process was described earlier in reference to Equation (1) and Equation (2)).
- Equation (1) uses the μtoken and Rngtoken of the measured PND distribution to standardize the PNDs before applying the reverse transformation given by Equation (2).
- the first transformation makes the final PNDc values dependent on the other components of the PND distribution. Therefore, machine learning techniques designed to learn the relative path delays as a mechanism to 'break the PUF' need to account for this 'distribution effect'.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Physics & Mathematics (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Mathematical Physics (AREA)
- Storage Device Security (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17738856.8A EP3403209B1 (en) | 2016-01-11 | 2017-01-11 | A privacy-preserving, mutual puf-based authentication protocol |
CA3011279A CA3011279A1 (en) | 2016-01-11 | 2017-01-11 | A privacy-preserving, mutual puf-based authentication protocol |
US16/067,757 US10956557B2 (en) | 2016-01-11 | 2017-01-11 | Privacy-preserving, mutual PUF-based authentication protocol |
KR1020187023050A KR20180102627A (en) | 2016-01-11 | 2017-01-11 | Privacy-preserving, mutual PUF-based authentication protocols |
JP2018554663A JP7003059B2 (en) | 2016-01-11 | 2017-01-11 | Privacy protection mutual PUF-based authentication protocol |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662277276P | 2016-01-11 | 2016-01-11 | |
US62/277,276 | 2016-01-11 | ||
US201662296490P | 2016-02-17 | 2016-02-17 | |
US62/296,490 | 2016-02-17 | ||
US201662344754P | 2016-06-02 | 2016-06-02 | |
US62/344,754 | 2016-06-02 | ||
US201662417611P | 2016-11-04 | 2016-11-04 | |
US62/417,611 | 2016-11-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017123631A1 true WO2017123631A1 (en) | 2017-07-20 |
Family
ID=59311835
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2017/013013 WO2017123631A1 (en) | 2016-01-11 | 2017-01-11 | A privacy-preserving, mutual puf-based authentication protocol |
Country Status (6)
Country | Link |
---|---|
US (1) | US10956557B2 (en) |
EP (1) | EP3403209B1 (en) |
JP (1) | JP7003059B2 (en) |
KR (1) | KR20180102627A (en) |
CA (1) | CA3011279A1 (en) |
WO (1) | WO2017123631A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019140218A1 (en) * | 2018-01-12 | 2019-07-18 | Stc.Unm | An autonomous, self-authenticating and self-contained secure boot-up system and methods |
US10749694B2 (en) | 2018-05-01 | 2020-08-18 | Analog Devices, Inc. | Device authentication based on analog characteristics without error correction |
US11044107B2 (en) | 2018-05-01 | 2021-06-22 | Analog Devices, Inc. | Device authentication based on analog characteristics without error correction |
CN114008974A (en) * | 2019-06-10 | 2022-02-01 | 微软技术许可有限责任公司 | Partial pattern recognition in symbol streams |
US11245680B2 (en) | 2019-03-01 | 2022-02-08 | Analog Devices, Inc. | Garbled circuit for device authentication |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3340216B1 (en) * | 2016-12-23 | 2020-01-29 | Secure-IC SAS | Secret key generation using a high reliability physically unclonable function |
US11522725B2 (en) * | 2017-03-29 | 2022-12-06 | Board Of Regents, The University Of Texas System | Reducing amount of helper data in silicon physical unclonable functions via lossy compression without production-time error characterization |
US10706179B2 (en) * | 2018-01-10 | 2020-07-07 | General Electric Company | Secure provisioning of secrets into MPSoC devices using untrusted third-party systems |
US11082241B2 (en) * | 2018-03-30 | 2021-08-03 | Intel Corporation | Physically unclonable function with feed-forward addressing and variable latency output |
JP2021528925A (en) * | 2018-06-27 | 2021-10-21 | ユーエヌエム レインフォレスト イノベーションズUNM Rainforest Innovations | Correlated robust authentication technology that uses only helper data |
US11093599B2 (en) * | 2018-06-28 | 2021-08-17 | International Business Machines Corporation | Tamper mitigation scheme for locally powered smart devices |
US10754619B2 (en) * | 2018-09-27 | 2020-08-25 | Intel Corporation | Self-calibrated von-neumann extractor |
KR102272750B1 (en) | 2019-01-23 | 2021-07-05 | 한국전자통신연구원 | Apparatus for generating secret information and operating method thereof |
WO2020247059A1 (en) * | 2019-06-07 | 2020-12-10 | Ohio State Innovation Foundation | Systems and methods using hybrid boolean networks as physically unclonable functions |
US11269999B2 (en) * | 2019-07-01 | 2022-03-08 | At&T Intellectual Property I, L.P. | Protecting computing devices from malicious tampering |
KR102123820B1 (en) * | 2019-07-31 | 2020-06-23 | 국민대학교산학협력단 | Apparatus and method for generating computer-executable lightweight random number |
CN110752928B (en) * | 2019-09-06 | 2022-03-01 | 温州大学 | APUF based on confusion incentive design and method for realizing machine learning attack resistance |
US11171793B2 (en) * | 2019-10-01 | 2021-11-09 | Nxp B.V. | Method and system for detecting an attack on a physically unclonable function (PUF) |
US11516028B2 (en) | 2019-12-24 | 2022-11-29 | CERA Licensing Limited | Temperature sensing physical unclonable function (PUF) authentication system |
GB201919297D0 (en) | 2019-12-24 | 2020-02-05 | Aronson Bill | Temperature sensing physical unclonable function (puf) authenication system |
CN113259135B (en) * | 2021-07-06 | 2022-01-21 | 常州市建筑科学研究院集团股份有限公司 | Lightweight blockchain communication authentication device and method for detecting data tamper |
US11917089B2 (en) | 2021-09-28 | 2024-02-27 | Nxp B.V. | Reducing helper data size for physical unclonable function device |
CN115664640B (en) * | 2022-12-23 | 2023-03-21 | 苏州浪潮智能科技有限公司 | Hardware implementation method, system, storage medium and equipment of SHA-3 algorithm |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120183135A1 (en) * | 2011-01-19 | 2012-07-19 | Verayo, Inc. | Reliable puf value generation by pattern matching |
US20140189890A1 (en) * | 2012-12-28 | 2014-07-03 | Patrick Koeberl | Device authentication using a physically unclonable functions based key generation system |
US20150058928A1 (en) * | 2013-08-23 | 2015-02-26 | Qualcomm Incorporated | Applying circuit delay-based physically unclonable functions (pufs) for masking operation of memory-based pufs to resist invasive and clone attacks |
US20150169247A1 (en) * | 2012-05-18 | 2015-06-18 | Cornell University | Methods and systems for providing hardware security functions using flash memories |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20070058581A (en) | 2004-10-04 | 2007-06-08 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Two-way error correction for physical tokens |
WO2009079050A2 (en) * | 2007-09-19 | 2009-06-25 | Verayo, Inc. | Authentication with physical unclonable functions |
JP5423088B2 (en) * | 2009-03-25 | 2014-02-19 | ソニー株式会社 | Integrated circuit, encryption communication device, encryption communication system, information processing method, and encryption communication method |
JP2010266417A (en) | 2009-05-18 | 2010-11-25 | Sony Corp | Semiconductor integrated circuit, information processing apparatus and method, and program |
US7898283B1 (en) * | 2009-08-31 | 2011-03-01 | Farinaz Koushanfar | Lightweight secure physically unclonable functions |
WO2011048126A1 (en) * | 2009-10-21 | 2011-04-28 | Intrinsic Id B.V. | Distribution system and method for distributing digital information |
US8667265B1 (en) * | 2010-07-28 | 2014-03-04 | Sandia Corporation | Hardware device binding and mutual authentication |
US8694778B2 (en) * | 2010-11-19 | 2014-04-08 | Nxp B.V. | Enrollment of physically unclonable functions |
JP2014523192A (en) * | 2011-07-07 | 2014-09-08 | Verayo, Inc. | Security by encryption using fuzzy authentication information in device and server communication
US8590010B2 (en) * | 2011-11-22 | 2013-11-19 | International Business Machines Corporation | Retention based intrinsic fingerprint identification featuring a fuzzy algorithm and a dynamic key |
WO2013155522A1 (en) * | 2012-04-13 | 2013-10-17 | Lewis Innovative Technologies, Inc. | Electronic physical unclonable functions |
GB2507988A (en) * | 2012-11-15 | 2014-05-21 | Univ Belfast | Authentication method using physical unclonable functions |
US9015500B2 (en) * | 2013-01-16 | 2015-04-21 | Qualcomm Incorporated | Method and apparatus for using dynamic voltage and frequency scaling with circuit-delay based integrated circuit identification |
WO2015031683A1 (en) | 2013-08-28 | 2015-03-05 | Stc.Unm | Systems and methods for leveraging path delay variations in a circuit and generating error-tolerant bitstrings |
JP2015065495A (en) * | 2013-09-24 | 2015-04-09 | Renesas Electronics Corporation | Encryption key supply method, semiconductor integrated circuit and encryption key management device
US9489504B2 (en) * | 2013-10-03 | 2016-11-08 | Qualcomm Incorporated | Physically unclonable function pattern matching for device identification |
DE102013227087A1 (en) * | 2013-12-23 | 2015-06-25 | Siemens Aktiengesellschaft | Secured provision of a key |
US9628272B2 (en) * | 2014-01-03 | 2017-04-18 | William Marsh Rice University | PUF authentication and key-exchange by substring matching |
US10216965B2 (en) * | 2014-01-08 | 2019-02-26 | Stc.Unm | Systems and methods for generating physically unclonable functions from non-volatile memory cells |
US10129036B2 (en) * | 2014-09-18 | 2018-11-13 | Intel Corporation | Post-processing mechanism for physically unclonable functions |
JP6608457B2 (en) | 2014-12-15 | 2019-11-20 | Stc.Unm | A method for generating physically unclonable function bitstreams with improved reliability
US9569601B2 (en) * | 2015-05-19 | 2017-02-14 | Anvaya Solutions, Inc. | System and method for authenticating and enabling functioning of a manufactured electronic device |
US20170132434A1 (en) * | 2015-11-06 | 2017-05-11 | Mentor Graphics Corporation | Measure variation tolerant physical unclonable function device |
2017
- 2017-01-11 EP EP17738856.8A patent/EP3403209B1/en active Active
- 2017-01-11 WO PCT/US2017/013013 patent/WO2017123631A1/en active Application Filing
- 2017-01-11 JP JP2018554663A patent/JP7003059B2/en active Active
- 2017-01-11 CA CA3011279A patent/CA3011279A1/en active Pending
- 2017-01-11 KR KR1020187023050A patent/KR20180102627A/en not_active Application Discontinuation
- 2017-01-11 US US16/067,757 patent/US10956557B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3403209A4 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019140218A1 (en) * | 2018-01-12 | 2019-07-18 | Stc.Unm | An autonomous, self-authenticating and self-contained secure boot-up system and methods |
US11880468B2 (en) | 2018-01-12 | 2024-01-23 | Unm Rainforest Innovations | Autonomous, self-authenticating and self-contained secure boot-up system and methods |
US10749694B2 (en) | 2018-05-01 | 2020-08-18 | Analog Devices, Inc. | Device authentication based on analog characteristics without error correction |
US11044107B2 (en) | 2018-05-01 | 2021-06-22 | Analog Devices, Inc. | Device authentication based on analog characteristics without error correction |
US11245680B2 (en) | 2019-03-01 | 2022-02-08 | Analog Devices, Inc. | Garbled circuit for device authentication |
CN114008974A (en) * | 2019-06-10 | 2022-02-01 | 微软技术许可有限责任公司 | Partial pattern recognition in symbol streams |
CN114008974B (en) * | 2019-06-10 | 2023-10-31 | 微软技术许可有限责任公司 | Partial pattern recognition in symbol streams |
Also Published As
Publication number | Publication date |
---|---|
JP7003059B2 (en) | 2022-01-20 |
EP3403209A1 (en) | 2018-11-21 |
CA3011279A1 (en) | 2017-07-20 |
KR20180102627A (en) | 2018-09-17 |
US20190026457A1 (en) | 2019-01-24 |
EP3403209B1 (en) | 2024-04-24 |
EP3403209A4 (en) | 2019-11-27 |
US10956557B2 (en) | 2021-03-23 |
JP2019501609A (en) | 2019-01-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3403209B1 (en) | A privacy-preserving, mutual puf-based authentication protocol | |
US10366253B2 (en) | Reliability enhancement methods for physically unclonable function bitstring generation | |
Majzoobi et al. | Slender PUF protocol: A lightweight, robust, and secure authentication by substring matching | |
Che et al. | PUF-based authentication | |
US9628272B2 (en) | PUF authentication and key-exchange by substring matching | |
TWI498827B (en) | Non-networked RFID-PUF authentication | |
EP2214117B1 (en) | Authentication with physical unclonable functions | |
US11095461B2 (en) | System and methods for entropy and statistical quality metrics in physical unclonable function generated bitstrings | |
Zalivaka et al. | FPGA implementation of modeling attack resistant arbiter PUF with enhanced reliability | |
Sami et al. | POCA: First power-on chip authentication in untrusted foundry and assembly | |
US11411751B2 (en) | Correlation-based robust authentication technique using helper data only | |
Plusquellic et al. | Privacy-preserving authentication protocols for IoT devices using the SiRF PUF | |
Alibrahim | OCCRA: overt-covert challenge-response authentication using device-centric primitives | |
Plusquellic | PUF-based authentication | |
Millwood et al. | A Privacy-Preserving Protocol Level Approach to Prevent Machine Learning Modelling Attacks on PUFs in the Presence of Semi-Honest Verifiers | |
Che | Model Building and Security Analysis of PUF-Based Authentication |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17738856 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2018554663 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 3011279 Country of ref document: CA |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 20187023050 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 1020187023050 Country of ref document: KR |
WWE | Wipo information: entry into national phase |
Ref document number: 2017738856 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2017738856 Country of ref document: EP Effective date: 20180813 |