US20220198008A1 - Protecting computing devices from malicious tampering - Google Patents
- Publication number
- US20220198008A1 (application Ser. No. 17/688,757)
- Authority
- US
- United States
- Prior art keywords
- integrated circuit
- circuit chip
- signature
- controller
- computing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/71—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
- G06F21/73—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information by creating or determining hardware identification, e.g. serial numbers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
Definitions
- the present disclosure relates generally to hardware security, and relates more particularly to devices, non-transitory computer-readable media, and methods for protecting computing devices against malicious tampering.
- a motherboard is the main printed circuit board (PCB) found in computing devices including consumer electronics and data center servers.
- a motherboard typically includes a plurality of integrated circuit chips and capacitors that collectively facilitate communications between a computing device's various electronic components (e.g., central processing unit, memory, input/output devices and other peripherals, etc.).
- Some motherboards may further include a baseboard management controller (BMC), which is a specialized microcontroller (essentially a small computer) that may be used by an administrator to remotely access a malfunctioning computing device.
- a method performed by a processing system of a server may include sending an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit chip, receiving a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge, comparing the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device, and generating an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
- a non-transitory computer-readable medium may store instructions which, when executed by a processing system of a server, cause the processing system to perform operations.
- the operations may include sending an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit chip, receiving a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge, comparing the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device, and generating an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
- a system deployed on an integrated circuit chip of a computing device may include a plurality of sensors to monitor a plurality of physical conditions of the integrated circuit chip and a controller communicatively coupled to the sensors to issue a challenge to the integrated circuit chip and to derive a first signature for the integrated circuit chip from a response of the integrated circuit chip to the challenge.
- FIG. 1 illustrates an example system in which examples of the present disclosure for protecting computing devices against malicious tampering may operate
- FIG. 2 illustrates a flowchart of an example method for protecting computing devices against malicious tampering, in accordance with the present disclosure
- FIG. 3 illustrates a flowchart of another example method for protecting computing devices against malicious tampering, in accordance with the present disclosure.
- FIG. 4 illustrates an example of a computing device, or computing system, specifically programmed to perform the steps, functions, blocks, and/or operations described herein.
- a malicious party may, without the knowledge of a computing device's end user, install hardware elements on the motherboard that may cause the computing device to carry out unwanted operations.
- These unwanted operations may include, for example, preparing the computing device's operating system to accept code (e.g., malware) from an anonymous remote computing device, revealing encryption keys to a remote computing device, blocking security updates to the computing device, opening new network connections to the computing device, altering line by line operation of the computing device, and other operations.
- the hardware elements installed by the malicious third party may be connected to the computing device's baseboard management controller (BMC).
- This may give the malicious third party access to the computing device's most sensitive code, even if the computing device has crashed or is powered off.
- the Linux operating system, which runs on many servers, includes code that authorizes a user by verifying a typed password against a stored, encrypted password.
- a chip maliciously planted in one of the servers may alter part of this code, so that the server will not ask a user for a password.
- Examples of the present disclosure utilize a secure hardware controller, which is installed on a critical integrated circuit chip on a computing device's PCB (e.g., motherboard), to monitor certain physical parameters of the integrated circuit chip.
- the secure hardware controller may collect data about the physical parameters from a plurality of sensors on the integrated circuit chip, and use the data to generate a signature.
- the signature may comprise a physical unclonable function (PUF).
- the secure hardware controller may transmit the signature, using a secure communications protocol, to a remote hardware integrity management center (HIMC) server.
- the HIMC may compare the signature generated by the secure hardware controller to signatures that have been previously stored on the HIMC server (e.g., by the manufacturer of the computing device or a third-party testing and validation entity). When the signature generated by the secure hardware controller does not match the stored signature(s), this may indicate that the integrated circuit chip has been tampered with.
- examples of the present disclosure may rely on the use of PUFs.
- PUFs are used in examples of the present disclosure to verify the integrity (e.g., freedom from tampering) of an integrated circuit chip.
- the PUFs comprise physically-defined, digital “fingerprints” which uniquely identify the critical integrated circuit chips of a computing device. PUFs are easy to evaluate using a physical system, and PUF outputs resemble random functions.
- FIG. 1 illustrates an example system 100 in which examples of the present disclosure for protecting computing devices against malicious tampering may operate.
- the system 100 may include any one or more types of communication networks, such as a traditional circuit switched network (e.g., a public switched telephone network (PSTN)) or a packet network such as an Internet Protocol (IP) network (e.g., an IP Multimedia Subsystem (IMS) network), an asynchronous transfer mode (ATM) network, a wired network, a wireless network, and/or a cellular network (e.g., 2G-5G, a long term evolution (LTE) network, and the like) related to the current disclosure.
- an IP network is broadly defined as a network that uses Internet Protocol to exchange data packets.
- the system 100 may comprise a core network 102 .
- the core network 102 may be in communication with one or more access networks 120 and 122 .
- the core network 102 may combine core network components of a wired or cellular network with components of a triple play service network, where triple-play services include telephone, Internet, and television services to subscribers.
- the core network 102 may functionally comprise a fixed mobile convergence (FMC) network, e.g., an IP Multimedia Subsystem (IMS) network.
- the core network 102 may functionally comprise a telephony network, e.g., an Internet Protocol/Multi-Protocol Label Switching (IP/MPLS) backbone network utilizing Session Initiation Protocol (SIP) for circuit-switched and Voice over Internet Protocol (VoIP) telephony services.
- the core network 102 may further comprise a broadcast television network, e.g., a traditional cable provider network or an Internet Protocol Television (IPTV) network, as well as an Internet Service Provider (ISP) network.
- the core network 102 may include a plurality of television (TV) servers (e.g., a broadcast server, a cable head-end), a plurality of content servers, an advertising server (AS), an interactive TV/video on demand (VoD) server, and so forth.
- the core network 102 may include a hardware integrity management center (HIMC) server 104 .
- the access networks 120 and 122 may comprise Digital Subscriber Line (DSL) networks, public switched telephone network (PSTN) access networks, broadband cable access networks, Local Area Networks (LANs), wireless access networks (e.g., an IEEE 802.11/Wi-Fi network and the like), cellular access networks, 3rd party networks, and the like.
- the operator of core network 102 may provide telecommunication services to subscribers via access networks 120 and 122 .
- the access networks 120 and 122 may comprise different types of access networks, may comprise the same type of access network, or some access networks may be the same type of access network and others may be different types of access networks.
- the core network 102 may be operated by a telecommunication network service provider.
- the core network 102 and the access networks 120 and 122 may be operated by different service providers, the same service provider or a combination thereof, or the access networks 120 and/or 122 may be operated by entities having core businesses that are not related to telecommunications services, e.g., corporate, governmental, or educational institution LANs, and the like.
- the access networks 120 and 122 may be in communication with one or more endpoint devices.
- These endpoint devices may include, for example servers, where each of the servers includes a PCB (e.g., motherboard) housing a plurality of integrated circuit chips 110 1 - 110 n (hereinafter individually referred to as a “chip 110 ” or collectively referred to as “chips 110 ”).
- Access networks 120 and 122 may transmit and receive communications between the chips 110 and the HIMC server 104 , as discussed in further detail below.
- each chip 110 is tested prior to deployment in a computing device (e.g., at time of manufacture, or after manufacture but prior to deployment).
- the chips 110 may be tested by the respective manufacturers or by a third-party testing and validation entity. Testing of a chip 110 may involve operating the chip 110 under different conditions to derive a signature (e.g., a physical unclonable function) for the chip 110 . For instance, when certain parameters of the chip 110 such as temperature, age, frequency, and the like fall outside threshold ranges, this may alter the chip's path delay (which is otherwise substantially constant).
- the combination of parameters that causes the change in delay may function as a unique signature for the chip 110, since no two chips' path delays will be altered in precisely the same way by precisely the same combination of parameters (thanks to process variations during fabrication).
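The enrollment idea above can be sketched in a toy model. Everything here is an illustrative assumption, not taken from the disclosure: process variation is modeled as seeded random offsets, each path gets a nominal delay and a temperature coefficient, and the signature is a hash of pairwise path-delay comparisons under several test temperatures.

```python
import hashlib
import random

def fabricate_chip(seed, n_paths=64):
    """Model per-chip process variation: each path gets a slightly
    randomized nominal delay and temperature coefficient.
    (A hypothetical stand-in for real silicon variation.)"""
    rng = random.Random(seed)
    return [(rng.gauss(1.0, 0.05), rng.gauss(0.001, 0.0005))
            for _ in range(n_paths)]

def path_delay(path, temperature):
    nominal, temp_coeff = path
    return nominal * (1.0 + temp_coeff * (temperature - 25.0))

def derive_signature(chip, temperatures):
    """Build a signature from pairwise path-delay comparisons under
    each test condition, then hash the resulting bitstring."""
    bits = []
    for temp in temperatures:
        for i in range(0, len(chip) - 1, 2):
            faster = path_delay(chip[i], temp) > path_delay(chip[i + 1], temp)
            bits.append("1" if faster else "0")
    return hashlib.sha256("".join(bits).encode()).hexdigest()

conditions = [0.0, 25.0, 70.0]  # test temperatures (degrees C)
chip_a = fabricate_chip(seed=1)
chip_b = fabricate_chip(seed=2)
```

In this sketch the same chip reproduces its signature under the same conditions, while two different chips diverge, which is the property the stored enrollment signatures rely on.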
- the unique signatures 108 for each chip 110 may be stored in the HIMC server 104 .
- the signatures 108 could also be sent (e.g., via email) to a human administrator or end user.
- the HIMC server 104 stores a plurality of signatures for each chip 110 , where each signature represents the chip's response to a different set of conditions.
- each of the chips 110 may be configured as illustrated by the example chip 110 n .
- each chip 110 may include a secure hardware controller 112 and a plurality of sensors 114 1 - 114 m (hereinafter individually referred to as a “sensor 114 ” or collectively referred to as “sensors 114 ”).
- the sensors 114 may include different types of sensors, such as temperature sensors, voltage sensors, current sensors, frequency sensors, and the like. Each of the sensors 114 may thus be designed to collect data regarding a different physical parameter of the chip 110 .
- the sensors 114 may collect the data during testing of the chip 110 , as described above.
- the sensors 114 may also continue to collect the data after deployment of the chip 110 and to send the data to the HIMC server 104 , as described in further detail below.
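The sensor-to-controller data flow described above might look like the following sketch; the sensor types come from the disclosure, but the class shape, stub readout values, and function names are hypothetical.

```python
# Hypothetical sensor abstraction: each sensor 114 reports one physical
# parameter of the chip 110, and the secure hardware controller 112
# aggregates the readings into a condition snapshot.
class Sensor:
    def __init__(self, kind, read_fn):
        self.kind = kind       # e.g., "temperature", "supply_voltage"
        self._read = read_fn   # hardware readout, stubbed here

    def read(self):
        return self._read()

def collect_snapshot(sensors):
    """Gather one reading per sensor, keyed by sensor type."""
    return {s.kind: s.read() for s in sensors}

sensors = [
    Sensor("temperature", lambda: 41.5),     # degrees C (stub value)
    Sensor("supply_voltage", lambda: 1.05),  # volts (stub value)
    Sensor("frequency", lambda: 2.4e9),      # hertz (stub value)
]
snapshot = collect_snapshot(sensors)
```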
- the sensors 114 may send the collected data to the secure hardware controller 112 .
- the secure hardware controller 112 may comprise a microcontroller that communicates with the HIMC server 104 .
- the secure hardware controller 112 may be configured for the specific type of the chip 110 , e.g., such that the secure hardware controller 112 is able to evaluate different valid paths on the chip 110 .
- the secure hardware controller 112 and the sensors 114 may be installed in a secure manner by the manufacturer of the chip 110 or by a testing and validation entity.
- the secure hardware controller 112 may be housed within a small, tamper-resistant enclosure on the chip 110 .
- the chips 110 may be tested to generate unique signatures which are stored in the HIMC server 104 as discussed above. Subsequently, the HIMC server 104 may cooperate with the secure hardware controller 112 on each chip 110 to verify that the chip 110 has not been tampered with. For example, the HIMC server 104 may execute a verification routine 106 at various points during the supply chain, operation, and maintenance of the chip 110 .
- the HIMC server 104 may also execute the verification routine in response to the occurrence of a predefined event (e.g., every time the server of which the chip 110 is part is powered on), periodically (e.g., every x hours or days), on demand (e.g., in response to a request from the secure hardware controller 112 of the chip), or according to any other schedule.
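The three trigger modes above (predefined event, periodic, on demand) can be sketched as a small scheduler; the class, method names, and trigger labels are illustrative assumptions, not from the disclosure.

```python
import time

class VerificationScheduler:
    """Sketch of the trigger logic: run the verification routine on a
    predefined event, on a periodic interval, or on demand."""

    def __init__(self, period_seconds, run_verification):
        self.period = period_seconds
        self.run = run_verification
        self.last_run = 0.0

    def on_power_on(self):
        self._trigger("power_on")       # predefined event

    def on_controller_request(self):
        self._trigger("on_demand")      # request from the controller 112

    def tick(self):
        # periodic check, e.g., every x hours or days
        if time.monotonic() - self.last_run >= self.period:
            self._trigger("periodic")

    def _trigger(self, reason):
        self.last_run = time.monotonic()
        self.run(reason)

runs = []
sched = VerificationScheduler(period_seconds=0, run_verification=runs.append)
sched.on_power_on()
sched.tick()
```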
- the verification routine 106 may cause the HIMC server 104 to send an instruction to the secure hardware controller 112 instructing the secure hardware controller 112 to generate and provide a signature for the chip 110 .
- the secure hardware controller 112 may, in response, generate the signature for the chip 110 based on current data provided by the sensors 114 .
- the secure hardware controller 112 may send the generated signature to the HIMC server 104 , which may compare the generated signature to the stored signature(s) 108 for the chip 110 . If the generated signature matches the stored signature, then it can be assumed that the chip 110 has not been tampered with. If, however, the generated signature does not match the stored signature, then it may be assumed that the chip 110 has been tampered with.
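The HIMC-side comparison might be implemented along these lines; the stored enrollment values are placeholders, and the use of a constant-time comparison is an added hardening assumption rather than a requirement stated in the disclosure.

```python
import hmac

def verify_chip(generated_signature, stored_signatures):
    """Return True if the controller-supplied signature matches any
    stored enrollment signature. hmac.compare_digest performs a
    constant-time comparison so timing does not leak match details."""
    return any(hmac.compare_digest(generated_signature, s)
               for s in stored_signatures)

stored = ["a3f1c2d4", "9b07e611"]  # placeholder enrollment signatures 108
assert verify_chip("a3f1c2d4", stored)      # presumed untampered
assert not verify_chip("deadbeef", stored)  # mismatch: raise an alert
```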
- a secure communications protocol such as two-way transport layer security (TLS) may be used to carry communications from the HIMC server 104 to the secure hardware controller 112 , and vice versa.
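A two-way (mutual) TLS connection as described above requires both sides to present certificates. The sketch below shows a client-side context using Python's standard `ssl` module; the optional file-path parameters are illustrative placeholders, not names from the disclosure.

```python
import ssl

def make_mutual_tls_context(ca_file=None, cert_file=None, key_file=None):
    """Sketch of two-way TLS from the controller side: verify the HIMC
    server's certificate AND present a client certificate of our own."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.verify_mode = ssl.CERT_REQUIRED  # refuse unauthenticated peers
    ctx.check_hostname = True
    if ca_file:
        ctx.load_verify_locations(cafile=ca_file)   # trust anchor
    if cert_file:
        # our certificate and private key, presented to the server
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx

ctx = make_mutual_tls_context()
```

On the server side, a mirror-image context (`ssl.PROTOCOL_TLS_SERVER` with `verify_mode = ssl.CERT_REQUIRED`) would demand the controller's certificate, completing the two-way authentication.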
- any of the HIMC server 104 and/or the secure hardware controller 112 may comprise a computing system or server, such as computing system 400 depicted in FIG. 4 , and may be configured to provide one or more operations or functions in connection with examples of the present disclosure for protecting computing devices against malicious tampering, as described herein.
- any of the HIMC server 104 and/or the secure hardware controller 112 may comprise one or more physical devices, e.g., one or more computing systems or servers, such as computing system 400 depicted in FIG. 4 , and may be configured to provide one or more operations for protecting computing devices against malicious tampering, as described herein.
- the terms “configure,” and “reconfigure” may refer to programming or loading a processing system with computer-readable/computer-executable instructions, code, and/or programs, e.g., in a distributed or non-distributed memory, which when executed by a processor, or processors, of the processing system within a same device or within distributed devices, may cause the processing system to perform various functions.
- Such terms may also encompass providing variables, data values, tables, objects, or other data structures or the like which may cause a processing system executing computer-readable instructions, code, and/or programs to function differently depending upon the values of the variables or other data structures that are provided.
- a “processing system” may comprise a computing device including one or more processors, or cores (e.g., as illustrated in FIG. 4 and discussed below) or multiple computing devices collectively configured to perform various steps, functions, and/or operations in accordance with the present disclosure.
- system 100 has been simplified. Thus, those skilled in the art will realize that the system 100 may be implemented in a different form than that which is illustrated in FIG. 1 , or may be expanded by including additional endpoint devices, access networks, network elements, application servers, etc. without altering the scope of the present disclosure.
- system 100 may be altered to omit various elements, substitute elements for devices that perform the same or similar functions, combine elements that are illustrated as separate devices, and/or implement network elements as functions that are spread across several devices that operate collectively as the respective network elements.
- the system 100 may include other network elements (not shown) such as border elements, routers, switches, policy servers, security devices, gateways, a content distribution network (CDN) and the like.
- portions of the core network 102 and/or access networks 120 and 122 may comprise a content distribution network (CDN) having ingest servers, edge servers, and the like for packet-based streaming of video, audio, or other content.
- access networks 120 and/or 122 may each comprise a plurality of different access networks that may interface with the core network 102 independently or in a chained manner.
- FIG. 2 illustrates a flowchart of an example method 200 for protecting computing devices against malicious tampering, in accordance with the present disclosure.
- steps, functions and/or operations of the method 200 may be performed by a device as illustrated in FIG. 1 , e.g., an HIMC server 104 or any one or more components thereof.
- the steps, functions, or operations of method 200 may be performed by a computing device or system 400 , and/or a processing system 402 as described in connection with FIG. 4 below.
- the computing device 400 may represent at least a portion of HIMC server 104 in accordance with the present disclosure.
- the method 200 is described in greater detail below in connection with an example performed by a processing system, such as processing system 402 .
- the method 200 begins in step 202 and proceeds to step 204 .
- the processing system may send an instruction to a secure hardware controller of an integrated circuit chip requesting a first signature for the integrated circuit chip.
- the integrated circuit chip may be housed on a motherboard of a computing device, such as a server.
- the secure hardware controller may also be housed on the motherboard of the computing device, and may be connected to a plurality of sensors that monitors various conditions of the integrated circuit chip (e.g., temperature, supply voltage, electro-magnetic interference, frequency, etc.).
- the secure hardware controller may derive the first signature by issuing a challenge to the integrated circuit chip, as discussed in further detail below.
- the instruction may be sent from the processing system to the secure hardware controller using a secure communications protocol, such as two-way TLS.
- the instruction may be encoded in a data packet that is encrypted using encryption keys that are generated specifically and uniquely for the connection between the processing system and the secure hardware controller.
- the data packet encoding the instruction may include a message authentication code that guards against loss or alteration of the instruction during transmission.
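A packet with a message authentication code, as described above, might be sealed and checked as follows. This is a sketch: the packet format and function names are hypothetical, and the per-connection key (which would come from the TLS handshake key material) is generated locally here for illustration.

```python
import hashlib
import hmac
import json
import os

def seal_instruction(key, instruction):
    """Encode an instruction with an HMAC tag so loss or alteration
    during transmission is detectable."""
    payload = json.dumps(instruction).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "mac": tag}

def open_instruction(key, packet):
    """Verify the tag before trusting the instruction."""
    payload = packet["payload"].encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, packet["mac"]):
        raise ValueError("instruction lost or altered in transmission")
    return json.loads(payload)

key = os.urandom(32)  # stand-in for a per-connection key
pkt = seal_instruction(key, {"op": "issue_challenge", "chip": "110-n"})
```

Any bit flipped in `pkt["payload"]` in transit would change the expected tag, and `open_instruction` would reject the packet.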
- the instruction is sent in step 204 in response to the occurrence of a predefined event (e.g., the computing device that contains the integrated circuit chip being powered on).
- the instruction is sent in step 204 according to a predefined schedule (e.g., periodically every x hours or days).
- the instruction is sent in step 204 on demand (e.g., in response to a request from the secure hardware controller).
- the instruction sent in step 204 may be sent according to any other schedule.
- the processing system may receive the first signature from the secure hardware controller.
- the first signature may be derived by the secure hardware controller from current conditions of the integrated circuit chip, e.g., as measured and reported to the secure hardware controller by the plurality of sensors.
- the first signature may comprise a PUF.
- the first signature may be sent by the secure hardware controller to the processing system using a secure communications protocol, such as two-way TLS, as discussed above.
- the processing system may compare the first signature to a second signature for the integrated circuit chip that is stored at the HIMC server.
- the second signature may be derived from testing of the integrated circuit chip prior to deployment (e.g., by the manufacturer or by a testing and validation entity).
- the second signature may represent what the processing system expects to see when receiving the first signature from the secure hardware controller.
- In step 210, the processing system may determine whether the first signature and the second signature match.
- If the processing system determines in step 210 that the signatures match, then the method 200 may proceed to step 212, where the processing system may send an alert (e.g., to the secure hardware controller in the form of a message, to a human administrator or end user in the form of an email, or the like) indicating that the integrated circuit chip is believed to be free from tampering. In one example, however, no alert is generated as long as the integrated circuit chip is believed to be free from tampering (e.g., alerts are only generated if tampering is suspected).
- If the processing system determines in step 210 that the first signature and the second signature do not match, then the method 200 may proceed to step 214.
- In step 214, the processing system may send an alert (e.g., to the secure hardware controller in the form of a message, to a human administrator or end user in the form of an email, or the like) indicating that the integrated circuit chip may have been tampered with.
- any alert sent in accordance with step 212 or 214 may be sent by the processing system using a secure communications protocol, such as two-way TLS, as discussed above.
- the method 200 may end.
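The server-side steps of method 200 can be condensed into one sketch. The transport and alerting callables are hypothetical hooks standing in for the secure channel and notification mechanisms described above.

```python
def method_200(send_to_controller, receive_signature, stored_signature, alert):
    """Server-side sketch of method 200: request a signature (step 204),
    receive it (step 206), compare it to the stored signature (steps
    208-210), and alert on a mismatch (step 214)."""
    send_to_controller({"op": "request_signature"})   # step 204
    first_signature = receive_signature()             # step 206
    if first_signature == stored_signature:           # steps 208-210
        return "ok"                                   # step 212 (alert optional)
    alert("integrated circuit chip may have been tampered with")  # step 214
    return "tamper_suspected"

alerts = []
status = method_200(
    send_to_controller=lambda msg: None,   # stub secure channel
    receive_signature=lambda: "sig-A",     # controller's first signature
    stored_signature="sig-B",              # HIMC's second signature
    alert=alerts.append,
)
```

With the mismatched stub signatures above, the sketch takes the step 214 branch and records one alert.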
- FIG. 3 illustrates a flowchart of another example method 300 for protecting computing devices against malicious tampering, in accordance with the present disclosure.
- steps, functions and/or operations of the method 300 may be performed by a device as illustrated in FIG. 1 , e.g., a secure hardware controller 112 or a chip 110 , or any one or more components thereof.
- the steps, functions, or operations of method 300 may be performed by a computing device or system 400 , and/or a processing system 402 as described in connection with FIG. 4 below.
- the computing device 400 may represent at least a portion of secure hardware controller 112 in accordance with the present disclosure.
- the method 300 is described in greater detail below in connection with an example performed by a processing system, such as processing system 402 .
- the method 300 begins in step 302 and proceeds to step 304 .
- the processing system (of a secure hardware controller installed on a motherboard of a computing device, such as a server) may detect a condition under which an integrated circuit chip of the computing device is to be verified (i.e., determined to be free from tampering).
- the integrated circuit chip may also be housed on the motherboard.
- the secure hardware controller may be connected to a plurality of sensors that monitors various conditions of the integrated circuit chip (e.g., temperature, supply voltage, electro-magnetic interference, frequency, etc.).
- the occurrence of a predefined event may trigger a verification routine.
- the verification routine may be performed according to a predefined schedule (e.g., periodically every x hours or days). In another example, the verification routine may be performed according to any other schedule.
- the processing system may send a request to an HIMC server to verify the integrated circuit chip.
- the request may be sent from the processing system to the HIMC server using a secure communications protocol, such as two-way TLS.
- the request may be encoded in a data packet that is encrypted using encryption keys that are generated specifically and uniquely for the connection between the processing system and the HIMC server.
- the data packet encoding the request may include a message authentication code that guards against loss or alteration of the request during transmission.
- the processing system may receive an instruction from the HIMC requesting a first signature for the integrated circuit chip.
- the instruction may be sent from the HIMC server to the processing system using a secure communications protocol, such as two-way TLS, as discussed above.
- the processing system may collect data about the current conditions of the integrated circuit chip from the plurality of sensors.
- the data may include, for example, the current temperature, supply voltage, electro-magnetic interference, frequency, and/or the like of the integrated circuit chip.
- the processing system may collect the data in step 310 by issuing a challenge to the integrated circuit chip, where the challenge may allow the processing system to determine a delay on a specific path of the integrated circuit chip under the current conditions.
- a challenge as issued in step 310 creates two paths through the integrated circuit that are excited simultaneously. The digital response of the integrated circuit chip to the challenge may thus be based on a timing comparison of the delays of the two paths. Path delays in an integrated circuit are statistically distributed due to process variations during fabrication.
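The two-path timing comparison above resembles an arbiter-style PUF, which is commonly described with a linear additive delay model. The sketch below uses that model as an assumption: each challenge bit sets the sign a stage contributes to the delay difference between the two simultaneously excited paths, and the response bit records which path wins the race.

```python
import random

def make_stage_deltas(seed, n_stages=64):
    """Per-chip, per-stage delay differences set by process variation
    (modeled here as seeded random draws; a real chip gets these from
    fabrication)."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(n_stages)]

def response_bit(challenge_bits, stage_deltas):
    """Linear additive delay model: accumulate the signed per-stage
    differences selected by the challenge, then compare the two path
    delays via the sign of the accumulated difference."""
    diff = sum(d if b else -d for b, d in zip(challenge_bits, stage_deltas))
    return 1 if diff > 0 else 0

deltas = make_stage_deltas(seed=42)
challenge = [random.Random(7).randrange(2) for _ in range(64)]
bit = response_bit(challenge, deltas)
```

Because the per-stage differences are statistically distributed, the challenge-response map is hard to predict without the chip, yet the same chip answers the same challenge consistently.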
- the processing system may wait to issue the challenge in step 310 until the processing system can determine that the challenge can be issued without interfering with operation of the motherboard.
- a secure hardware controller as disclosed herein may include intelligence that allows the secure hardware controller to monitor motherboard activity (power on/power off, etc.) and traffic density.
- the processing system may generate the first signature from the data collected in step 310 .
- the first signature may comprise a PUF.
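One plausible way for the controller to fold the challenge response and the sensor data from step 310 into a fixed-length signature is a hash over a canonical encoding. The quantization step and field names below are illustrative assumptions:

```python
import hashlib

def derive_signature(response_bits, sensor_readings):
    """Combine the chip's challenge-response bits with quantized sensor
    readings into a fixed-length hex signature. Rounding the readings
    keeps ordinary sub-quantization sensor noise from changing the
    result. The encoding is a hypothetical sketch."""
    bits = "".join(str(b) for b in response_bits)
    conditions = ",".join(
        f"{name}={round(value, 1)}"
        for name, value in sorted(sensor_readings.items())
    )
    return hashlib.sha256(f"{bits}|{conditions}".encode()).hexdigest()

sig = derive_signature([1, 0, 1, 1], {"temp_c": 41.03, "volts": 1.199})
# The same inputs always yield the same signature...
assert sig == derive_signature([1, 0, 1, 1], {"temp_c": 41.03, "volts": 1.199})
# ...and small, sub-quantization noise does not change it.
assert sig == derive_signature([1, 0, 1, 1], {"temp_c": 41.01, "volts": 1.21})
```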
- the processing system may send the first signature to the HIMC server using a secure communications protocol, such as two-way TLS, as discussed above.
- the processing system may receive an alert from the HIMC indicating the status of the integrated circuit chip.
- the alert may indicate that the integrated circuit chip is believed to be free from tampering or that the integrated circuit chip may have been tampered with. In one example, however, no alert is sent by the HIMC server as long as the integrated circuit chip is believed to be free from tampering (e.g., an alert is only received in step 316 if tampering is suspected).
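The HIMC server's match test that drives these alerts is described as exact; real PUF responses can be slightly noisy, so a deployment might tolerate a small Hamming distance between bit-level signatures. A sketch, where the tolerance parameter is an assumption beyond the text and defaults to the exact match described:

```python
def signatures_match(first_bits, second_bits, max_hamming=0):
    """Compare two equal-length bit sequences. max_hamming=0 reproduces the
    exact match described in the text; a small positive value would
    tolerate the bit noise real PUF responses can exhibit (an assumption,
    not from the disclosure)."""
    if len(first_bits) != len(second_bits):
        return False
    distance = sum(a != b for a, b in zip(first_bits, second_bits))
    return distance <= max_hamming

assert signatures_match([1, 0, 1], [1, 0, 1])
assert not signatures_match([1, 0, 1], [1, 1, 1])             # exact match fails
assert signatures_match([1, 0, 1], [1, 1, 1], max_hamming=1)  # tolerated
```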
- the method 300 may end.
- steps, functions, or operations of the methods 200 and 300 may include a storing, displaying, and/or outputting step as required for a particular application.
- any data, records, fields, and/or intermediate results discussed in the method can be stored, displayed, and/or outputted either on the device executing the method or to another device, as required for a particular application.
- steps, blocks, functions or operations in FIG. 2 or FIG. 3 that recite a determining operation or involve a decision do not necessarily require that both branches of the determining operation be practiced.
- one of the branches of the determining operation can be deemed as an optional step.
- steps, blocks, functions or operations of the above described method can be combined, separated, and/or performed in a different order from that described above, without departing from the examples of the present disclosure.
- FIG. 4 depicts a high-level block diagram of a computing device or processing system specifically programmed to perform the functions described herein.
- the processing system 400 comprises one or more hardware processor elements 402 (e.g., a central processing unit (CPU), a microprocessor, or a multi-core processor), a memory 404 (e.g., random access memory (RAM) and/or read only memory (ROM)), a module 405 for protecting computing devices against malicious tampering, and various input/output devices 406 (e.g., storage devices, including but not limited to, a tape drive, a floppy drive, a hard disk drive or a compact disk drive, a receiver, a transmitter, a speaker, a display, a speech synthesizer, an output port, an input port and a user input device (such as a keyboard, a keypad, a mouse, a microphone and the like)).
- the computing device may employ a plurality of processor elements.
- if the method 200 or 300 as discussed above is implemented in a distributed or parallel manner for a particular illustrative example, i.e., if the steps of the above method 200 or 300, or the entire method 200 or 300, is implemented across multiple or parallel computing devices, e.g., a processing system, then the computing device of this figure is intended to represent each of those multiple computing devices.
- one or more hardware processors can be utilized in supporting a virtualized or shared computing environment.
- the virtualized computing environment may support one or more virtual machines representing computers, servers, or other computing devices.
- hardware components such as hardware processors and computer-readable storage devices may be virtualized or logically represented.
- the hardware processor 402 can also be configured or programmed to cause other devices to perform one or more operations as discussed above. In other words, the hardware processor 402 may serve the function of a central controller directing other devices to perform the one or more operations as discussed above.
- the present disclosure can be implemented in software and/or in a combination of software and hardware, e.g., using application specific integrated circuits (ASIC), a programmable gate array (PGA) including a Field PGA, or a state machine deployed on a hardware device, a computing device or any other hardware equivalents, e.g., computer readable instructions pertaining to the method discussed above can be used to configure a hardware processor to perform the steps, functions and/or operations of the above disclosed method 200 or 300 .
- instructions and data for the present module or process 405 for protecting computing devices against malicious tampering can be loaded into memory 404 and executed by hardware processor element 402 to implement the steps, functions, or operations as discussed above in connection with the illustrative method 200 or 300 .
- when a hardware processor executes instructions to perform "operations," this could include the hardware processor performing the operations directly and/or facilitating, directing, or cooperating with another hardware device or component (e.g., a co-processor and the like) to perform the operations.
- the processor executing the computer readable or software instructions relating to the above described method can be perceived as a programmed processor or a specialized processor.
- the present module 405 for protecting computing devices against malicious tampering (including associated data structures) of the present disclosure can be stored on a tangible or physical (broadly non-transitory) computer-readable storage device or medium, e.g., volatile memory, non-volatile memory, ROM memory, RAM memory, magnetic or optical drive, device or diskette, and the like.
- a “tangible” computer-readable storage device or medium comprises a physical device, a hardware device, or a device that is discernible by the touch. More specifically, the computer-readable storage device may comprise any physical devices that provide the ability to store information such as data and/or instructions to be accessed by a processor or a computing device such as a computer or an application server.
Abstract
In one example, a method performed by a processing system of a server includes sending an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit, receiving a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge, comparing the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device, and generating an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/459,043, filed on Jul. 1, 2019, now U.S. Pat. No. 11,269,999, which is herein incorporated by reference in its entirety.
- The present disclosure relates generally to hardware security, and relates more particularly to devices, non-transitory computer-readable media, and methods for protecting computing devices against malicious tampering.
- A motherboard is the main printed circuit board (PCB) found in computing devices including consumer electronics and data center servers. A motherboard typically includes a plurality of integrated circuit chips and capacitors that collectively facilitate communications between a computing device's various electronic components (e.g., central processing unit, memory, input/output devices and other peripherals, etc.). Some motherboards (including motherboards for servers, switches, memory devices, and the like) may further include a baseboard management controller (BMC), which is a type of superchip or small computer that may be used by an administrator to remotely access a malfunctioning computing device.
- In one example, the present disclosure discloses a device, computer-readable medium, and method for protecting computing devices against malicious tampering. For example, a method performed by a processing system of a server may include sending an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit, receiving a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge, comparing the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device, and generating an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
- In another example, a non-transitory computer-readable medium may store instructions which, when executed by a processing system of a server, cause the processing system to perform operations. The operations may include sending an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit, receiving a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge, comparing the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device, and generating an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
- In another example, a system deployed on an integrated circuit chip of a computing device may include a plurality of sensors to monitor a plurality of physical conditions of the integrated circuit chip and a controller communicatively coupled to the sensors to issue a challenge to the integrated circuit chip and to derive a first signature for the integrated circuit chip from a response of the integrated circuit chip to the challenge.
- The teachings of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
- FIG. 1 illustrates an example system in which examples of the present disclosure for protecting computing devices against malicious tampering may operate;
- FIG. 2 illustrates a flowchart of an example method for protecting computing devices against malicious tampering, in accordance with the present disclosure;
- FIG. 3 illustrates a flowchart of another example method for protecting computing devices against malicious tampering, in accordance with the present disclosure; and
- FIG. 4 illustrates an example of a computing device, or computing system, specifically programmed to perform the steps, functions, blocks, and/or operations described herein.
- To facilitate understanding, similar reference numerals have been used, where possible, to designate elements that are common to the figures.
- The present disclosure broadly discloses methods, computer-readable media, and devices for protecting computing devices against malicious tampering. As discussed above, a motherboard typically includes a plurality of integrated circuit chips and capacitors that collectively facilitate communications between a computing device's various electronic components (e.g., central processing unit, memory, input/output devices and other peripherals, etc.). In some cases, a malicious party may, without the knowledge of a computing device's end user, install hardware elements on the motherboard that may cause the computing device to carry out unwanted operations. These unwanted operations may include, for example, preparing the computing device's operating system to accept code (e.g., malware) from an anonymous remote computing device, revealing encryption keys to a remote computing device, blocking security updates to the computing device, opening new network connections to the computing device, altering line by line operation of the computing device, and other operations.
- The hardware elements installed by the malicious third party may be connected to the computing device's baseboard management controller (BMC). This may give the malicious third party access to the computing device's most sensitive code, even if the computing device has crashed or is powered off. For instance, the Linux operating system, which runs in many servers, includes code that authorizes a user by verifying a typed password against a stored, encrypted password. A chip maliciously planted in one of the servers may alter part of this code, so that the server will not ask a user for a password.
- Examples of the present disclosure utilize a secure hardware controller, which is installed on a critical integrated circuit chip on a computing device's PCB (e.g., motherboard), to monitor certain physical parameters of the integrated circuit chip. The secure hardware controller may collect data about the physical parameters from a plurality of sensors on the integrated circuit chip, and use the data to generate a signature. The signature may comprise a physical unclonable function (PUF).
- The secure hardware controller may transmit the signature, using a secure communications protocol, to a remote hardware integrity management center (HIMC) server. The HIMC may compare the signature generated by the secure hardware controller to signatures that have been previously stored on the HIMC server (e.g., by the manufacturer of the computing device or a third-party testing and validation entity). When the signature generated by the secure hardware controller does not match the stored signature(s), this may indicate that the integrated circuit chip has been tampered with.
- As discussed above, examples of the present disclosure may rely on the use of PUFs. Although conventionally used for cryptographic applications, PUFs are used in examples of the present disclosure to verify the integrity (e.g., freedom from tampering) of an integrated circuit chip. The PUFs comprise physically-defined, digital "fingerprints" which uniquely identify the critical integrated circuit chips of a computing device. PUFs are easy to evaluate using a physical system, and PUF outputs resemble random functions.
- To further aid in understanding the present disclosure,
FIG. 1 illustrates an example system 100 in which examples of the present disclosure for protecting computing devices against malicious tampering may operate. The system 100 may include any one or more types of communication networks, such as a traditional circuit switched network (e.g., a public switched telephone network (PSTN)) or a packet network such as an Internet Protocol (IP) network (e.g., an IP Multimedia Subsystem (IMS) network), an asynchronous transfer mode (ATM) network, a wired network, a wireless network, and/or a cellular network (e.g., 2G-5G, a long term evolution (LTE) network, and the like) related to the current disclosure. It should be noted that an IP network is broadly defined as a network that uses Internet Protocol to exchange data packets. Additional example IP networks include Voice over IP (VoIP) networks, Service over IP (SoIP) networks, and the like. - In one example, the
system 100 may comprise a core network 102. The core network 102 may be in communication with one or more access networks 120 and 122. In one example, the core network 102 may combine core network components of a wired or cellular network with components of a triple play service network, where triple-play services include telephone services, Internet services and television services to subscribers. For example, the core network 102 may functionally comprise a fixed mobile convergence (FMC) network, e.g., an IP Multimedia Subsystem (IMS) network. In addition, the core network 102 may functionally comprise a telephony network, e.g., an Internet Protocol/Multi-Protocol Label Switching (IP/MPLS) backbone network utilizing Session Initiation Protocol (SIP) for circuit-switched and Voice over Internet Protocol (VoIP) telephony services. The core network 102 may further comprise a broadcast television network, e.g., a traditional cable provider network or an Internet Protocol Television (IPTV) network, as well as an Internet Service Provider (ISP) network. In one example, the core network 102 may include a plurality of television (TV) servers (e.g., a broadcast server, a cable head-end), a plurality of content servers, an advertising server (AS), an interactive TV/video on demand (VoD) server, and so forth. As further illustrated in FIG. 1, the core network 102 may include a hardware integrity management center (HIMC) server 104. For ease of illustration, various additional elements of network 102 are omitted from FIG. 1. - In one example, the
access networks 120 and 122 may comprise different types of access networks. For example, the operator of the core network 102 may provide telecommunication services to subscribers via access networks 120 and 122. In one example, the core network 102 may be operated by a telecommunication network service provider. The core network 102 and the access networks 120 and 122 may be operated by the same or different service providers. In another example, the access networks 120 and/or 122 may be operated by entities having core businesses that are not related to telecommunications services, e.g., corporate, governmental, or educational institution LANs, and the like. - In one example, the
access networks 120 and 122 may be in communication with one or more computing devices, e.g., servers, whose motherboards include one or more integrated circuit chips 110 (hereinafter individually referred to as a "chip 110" or collectively referred to as "chips 110"). Access networks 120 and 122 may transmit and receive communications between these computing devices and the HIMC server 104, as discussed in further detail below. - In one example, each chip 110 is tested prior to deployment in a computing device (e.g., at time of manufacture, or after manufacture but prior to deployment). As discussed above, the chips 110 may be tested by the respective manufacturers or by a third-party testing and validation entity. Testing of a chip 110 may involve operating the chip 110 under different conditions to derive a signature (e.g., a physical unclonable function) for the chip 110. For instance, when certain parameters of the chip 110 such as temperature, age, frequency, and the like fall outside threshold ranges, this may alter the chip's path delay (which is otherwise substantially constant). The combination of parameters that causes the change in delay may function as a unique signature for the chip 110, since no two chips' path delays will be altered in precisely the same way by precisely the same combination of parameters (thanks to process variations during fabrication). The
unique signatures 108 for each chip 110 may be stored in the HIMC server 104. The signatures 108 could also be sent (e.g., via email) to a human administrator or end user. In one example, the HIMC server 104 stores a plurality of signatures for each chip 110, where each signature represents the chip's response to a different set of conditions. - In one example, each of the chips 110 may be configured as illustrated by the
example chip 110 n. For example, each chip 110 may include a secure hardware controller 112 and a plurality of sensors 114 1-114 m (hereinafter individually referred to as a "sensor 114" or collectively referred to as "sensors 114"). The sensors 114 may include different types of sensors, such as temperature sensors, voltage sensors, current sensors, frequency sensors, and the like. Each of the sensors 114 may thus be designed to collect data regarding a different physical parameter of the chip 110. The sensors 114 may collect the data during testing of the chip 110, as described above. The sensors 114 may also continue to collect the data after deployment of the chip 110 and to send the data to the HIMC server 104, as described in further detail below. - The sensors 114 may send the collected data to the
secure hardware controller 112. The secure hardware controller 112 may comprise a microcontroller that communicates with the HIMC server 104. In a further example, the secure hardware controller 112 may be configured for the specific type of the chip 110, e.g., such that the secure hardware controller 112 is able to evaluate different valid paths on the chip 110. In one example, the secure hardware controller 112 and the sensors 114 may be installed in a secure manner by the manufacturer of the chip 110 or by a testing and validation entity. In a further example, the secure hardware controller 112 may be housed within a small, tamper-resistant enclosure on the chip 110. - In operation, the chips 110 may be tested to generate unique signatures which are stored in the
HIMC server 104 as discussed above. Subsequently, the HIMC server 104 may cooperate with the secure hardware controller 112 on each chip 110 to verify that the chip 110 has not been tampered with. For example, the HIMC server 104 may execute a verification routine 106 at various points during the supply chain, operation, and maintenance of the chip 110. The HIMC server 104 may also execute the verification routine in response to the occurrence of a predefined event (e.g., every time the server of which the chip 110 is part is powered on), periodically (e.g., every x hours or days), on demand (e.g., in response to a request from the secure hardware controller 112 of the chip), or according to any other schedule. - The
verification routine 106 may cause the HIMC server 104 to send an instruction to the secure hardware controller 112 instructing the secure hardware controller 112 to generate and provide a signature for the chip 110. The secure hardware controller 112 may, in response, generate the signature for the chip 110 based on current data provided by the sensors 114. The secure hardware controller 112 may send the generated signature to the HIMC server 104, which may compare the generated signature to the stored signature(s) 108 for the chip 110. If the generated signature matches the stored signature, then it can be assumed that the chip 110 has not been tampered with. If, however, the generated signature does not match the stored signature, then it may be assumed that the chip 110 has been tampered with. - A secure communications protocol, such as two-way transport layer security (TLS), may be used to carry communications from the
HIMC server 104 to the secure hardware controller 112, and vice versa. In accordance with the present disclosure, any of the HIMC server 104 and/or the secure hardware controller 112 may comprise a computing system or server, such as computing system 400 depicted in FIG. 4, and may be configured to provide one or more operations or functions in connection with examples of the present disclosure for protecting computing devices against malicious tampering, as described herein. - In accordance with the present disclosure, any of the
HIMC server 104 and/or the secure hardware controller 112 may comprise one or more physical devices, e.g., one or more computing systems or servers, such as computing system 400 depicted in FIG. 4, and may be configured to provide one or more operations for protecting computing devices against malicious tampering, as described herein. It should be noted that as used herein, the terms "configure," and "reconfigure" may refer to programming or loading a processing system with computer-readable/computer-executable instructions, code, and/or programs, e.g., in a distributed or non-distributed memory, which when executed by a processor, or processors, of the processing system within a same device or within distributed devices, may cause the processing system to perform various functions. Such terms may also encompass providing variables, data values, tables, objects, or other data structures or the like which may cause a processing system executing computer-readable instructions, code, and/or programs to function differently depending upon the values of the variables or other data structures that are provided. As referred to herein, a "processing system" may comprise a computing device including one or more processors, or cores (e.g., as illustrated in FIG. 4 and discussed below) or multiple computing devices collectively configured to perform various steps, functions, and/or operations in accordance with the present disclosure. - It should be noted that the
system 100 has been simplified. Thus, those skilled in the art will realize that the system 100 may be implemented in a different form than that which is illustrated in FIG. 1, or may be expanded by including additional endpoint devices, access networks, network elements, application servers, etc., without altering the scope of the present disclosure. In addition, system 100 may be altered to omit various elements, substitute elements for devices that perform the same or similar functions, combine elements that are illustrated as separate devices, and/or implement network elements as functions that are spread across several devices that operate collectively as the respective network elements. For example, the system 100 may include other network elements (not shown) such as border elements, routers, switches, policy servers, security devices, gateways, a content distribution network (CDN) and the like. In another example, the access networks 120 and/or 122 may each comprise a plurality of different access networks that may interface with the core network 102 independently or in a chained manner. Thus, these and other modifications are all contemplated within the scope of the present disclosure.
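The HIMC-side bookkeeping described above — a plurality of enrolled signatures 108 per chip 110, checked against whatever the secure hardware controller 112 reports — might be sketched as follows. The class and method names are illustrative assumptions, not part of the disclosure:

```python
class SignatureStore:
    """Hypothetical HIMC-side store: each chip may be enrolled with several
    signatures, one per set of test conditions measured before the chip
    is deployed."""

    def __init__(self):
        self._store = {}  # chip_id -> {condition_label: signature}

    def enroll(self, chip_id, condition_label, signature):
        """Record a pre-deployment signature for one set of conditions."""
        self._store.setdefault(chip_id, {})[condition_label] = signature

    def verify(self, chip_id, reported_signature):
        """A chip passes if the reported signature matches any enrolled one;
        an unknown chip or an unmatched signature suggests tampering."""
        return reported_signature in self._store.get(chip_id, {}).values()

store = SignatureStore()
store.enroll("chip-110n", "nominal", "sig-nominal")
store.enroll("chip-110n", "high-temp", "sig-hot")
assert store.verify("chip-110n", "sig-hot")
assert not store.verify("chip-110n", "sig-unknown")
```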
FIG. 2 illustrates a flowchart of an example method 200 for protecting computing devices against malicious tampering, in accordance with the present disclosure. In one example, steps, functions and/or operations of the method 200 may be performed by a device as illustrated in FIG. 1, e.g., an HIMC server 104 or any one or more components thereof. In one example, the steps, functions, or operations of method 200 may be performed by a computing device or system 400, and/or a processing system 402 as described in connection with FIG. 4 below. For instance, the computing device 400 may represent at least a portion of HIMC server 104 in accordance with the present disclosure. For illustrative purposes, the method 200 is described in greater detail below in connection with an example performed by a processing system, such as processing system 402. The method 200 begins in step 202 and proceeds to step 204. - At step 204, the processing system (of an HIMC server in communication with a plurality of secure hardware controllers) may send an instruction to a secure hardware controller of an integrated circuit chip requesting a first signature for the integrated circuit chip. The integrated circuit chip may be housed on a motherboard of a computing device, such as a server. The secure hardware controller may also be housed on the motherboard of the computing device, and may be connected to a plurality of sensors that monitors various conditions of the integrated circuit chip (e.g., temperature, supply voltage, electro-magnetic interference, frequency, etc.). The secure hardware controller may derive the first signature by issuing a challenge to the integrated circuit chip, as discussed in further detail below. In one example, the instruction may be sent from the processing system to the secure hardware controller using a secure communications protocol, such as two-way TLS. For instance, the instruction may be encoded in a data packet that is encrypted using encryption keys that are generated specifically and uniquely for the connection between the processing system and the secure hardware controller. Furthermore, the data packet encoding the instruction may include a message authentication code that guards against loss or alteration of the instruction during transmission. - In one example, the instruction is sent in step 204 in response to the occurrence of a predefined event (e.g., the computing device that contains the integrated circuit chip being powered on). In another example, the instruction is sent in step 204 according to a predefined schedule (e.g., periodically every x hours or days). In another example, the instruction is sent in step 204 on demand (e.g., in response to a request from the secure hardware controller). In another example, the instruction sent in step 204 may be sent according to any other schedule. - At step 206, the processing system may receive the first signature from the secure hardware controller. The first signature may be derived by the secure hardware controller from current conditions of the integrated circuit chip, e.g., as measured and reported to the secure hardware controller by the plurality of sensors. For example, the first signature may comprise a PUF. In one example, the first signature may be sent by the secure hardware controller to the processing system using a secure communications protocol, such as two-way TLS, as discussed above. - At
step 208, the processing system may compare the first signature to a second signature for the integrated circuit chip that is stored at the HIMC server. As discussed above, the second signature may be derived from testing of the integrated circuit chip prior to deployment (e.g., by the manufacturer or by a testing and validation entity). Thus, the second signature may represent what the processing system expects to see when receiving the first signature from the secure hardware controller. - In
step 210, the processing system may determine whether the first signature and the second signature match. - If the processing system determines in
step 210 that the first signature and the second signature match, then themethod 200 may optionally proceed to step 212. At optional step 212 (illustrated in phantom), the processing system may send an alert (e.g., to the secure hardware controller in the form of a message, to a human administrator or end user in the form of an email, or the like) indicating that the integrated circuit chip is believed to be free from tampering. In one example, however, no alert is generated as long as the integrated circuit chip is believed to be free from tampering (e.g., alerts are only generated if tampering is suspected). - If, however, the processing system determines in
step 210 that the first signature and the second signature do not match, then themethod 200 may proceed to step 214. Atstep 214, the processing system may send an alert (e.g., to the secure hardware controller in the form of a message, to a human administrator or end user in the form of an email, or the like) indicating that the integrated circuit chip may have been tampered with. - In one example, any alert sent in accordance with
step - At
step 216, the method 200 may end. -
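The server-side flow of steps 206 through 214 can be sketched in code. This is a minimal illustration only; the class and method names (`HIMCServer`, `verify_chip`) are invented for the sketch and do not appear in the disclosure:

```python
# Minimal sketch of the HIMC server's verification flow (steps 206-214).
# All names are illustrative; the disclosure does not define an API.

class HIMCServer:
    def __init__(self, expected_signatures):
        # Second signatures, derived from pre-deployment testing,
        # stored at the server and keyed by chip identifier (step 208).
        self.expected_signatures = expected_signatures
        self.alerts = []

    def verify_chip(self, chip_id, first_signature):
        """Compare the reported first signature to the stored second
        signature (step 210) and alert on a mismatch (step 214)."""
        second_signature = self.expected_signatures[chip_id]
        if first_signature == second_signature:
            # Optional step 212: no alert is required when the
            # signatures match and the chip appears untampered.
            return True
        self.alerts.append(
            f"chip {chip_id}: signature mismatch, possible tampering")
        return False
```

For instance, `HIMCServer({'chip-0': (1, 0, 1)}).verify_chip('chip-0', (1, 0, 1))` returns `True`, while a mismatched first signature returns `False` and records an alert.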
FIG. 3 illustrates a flowchart of another example method 300 for protecting computing devices against malicious tampering, in accordance with the present disclosure. In one example, steps, functions and/or operations of the method 300 may be performed by a device as illustrated in FIG. 1, e.g., a secure hardware controller 112 or a chip 110, or any one or more components thereof. In one example, the steps, functions, or operations of method 300 may be performed by a computing device or system 400, and/or a processing system 402 as described in connection with FIG. 4 below. For instance, the computing device 400 may represent at least a portion of secure hardware controller 112 in accordance with the present disclosure. For illustrative purposes, the method 300 is described in greater detail below in connection with an example performed by a processing system, such as processing system 402. The method 300 begins in step 302 and proceeds to step 304. - At optional step 304 (illustrated in phantom), the processing system (of a secure hardware controller installed on a motherboard of a computing device, such as a server) may detect a condition under which an integrated circuit chip of the computing device is to be verified (i.e., determined to be free from tampering). The integrated circuit chip may also be housed on the motherboard. The secure hardware controller may be connected to a plurality of sensors that monitor various conditions of the integrated circuit chip (e.g., temperature, supply voltage, electro-magnetic interference, frequency, etc.).
- As discussed above, in one example, the occurrence of a predefined event (e.g., the computing device that contains the integrated circuit chip being powered on) may trigger a verification routine. In another example, the verification routine may be performed according to a predefined schedule (e.g., periodically every x hours or days). In another example, the verification routine may be performed according to any other schedule.
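The trigger logic of optional step 304 can be condensed into a small predicate. The function below is a hypothetical sketch; its names and the 24-hour default period are assumptions, since the disclosure leaves the schedule open:

```python
def verification_due(power_on_event, last_run_s, now_s, period_hours=24):
    """Return True when the verification routine should run: on a
    predefined event (e.g., the device being powered on), or when the
    periodic schedule has elapsed since the last verification."""
    if power_on_event:
        return True
    if last_run_s is None:  # never verified since deployment
        return True
    return (now_s - last_run_s) >= period_hours * 3600
```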
- At optional step 306 (illustrated in phantom), the processing system may send a request to an HIMC server to verify the integrated circuit chip. In one example, the request may be sent from the processing system to the HIMC server using a secure communications protocol, such as two-way TLS. For instance, the request may be encoded in a data packet that is encrypted using encryption keys that are generated specifically and uniquely for the connection between the processing system and the HIMC server. Furthermore, the data packet encoding the request may include a message authentication code that guards against loss or alteration of the request during transmission.
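Separately from the TLS channel itself, the message authentication code mentioned above can be illustrated with a keyed HMAC. This is a generic sketch, not the disclosure's specific construction; the key and payload are invented for the example:

```python
import hashlib
import hmac

TAG_LEN = 32  # SHA-256 digest size in bytes

def seal(key: bytes, payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can detect alteration
    of the request during transmission."""
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def open_sealed(key: bytes, packet: bytes) -> bytes:
    """Recompute and check the tag; raise if the packet was modified."""
    payload, tag = packet[:-TAG_LEN], packet[-TAG_LEN:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message authentication failed")
    return payload
```

`hmac.compare_digest` performs a constant-time comparison, which avoids leaking tag information through timing.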
- At
step 308, the processing system may receive an instruction from the HIMC requesting a first signature for the integrated circuit chip. In one example, the instruction may be sent from the HIMC server to the processing system using a secure communications protocol, such as two-way TLS, as discussed above. - At
step 310, the processing system may collect data about the current conditions of the integrated circuit chip from the plurality of sensors. The data may include, for example, the current temperature, supply voltage, electro-magnetic interference, frequency, and/or the like of the integrated circuit chip. Thus, the processing system may collect the data in step 310 by issuing a challenge to the integrated circuit chip, where the challenge may allow the processing system to determine a delay on a specific path of the integrated circuit chip under the current conditions. In one example, a challenge as issued in step 310 creates two paths through the integrated circuit that are excited simultaneously. The digital response of the integrated circuit chip to the challenge may thus be based on a timing comparison of the delays of the two paths. Path delays in an integrated circuit are statistically distributed due to process variations during fabrication. - In one example, the processing system may wait to issue the challenge in
step 310 until the processing system can determine that the challenge can be issued without interfering with operation of the motherboard. For instance, a secure hardware controller as disclosed herein may include intelligence that allows the secure hardware controller to monitor motherboard activity (power on/power off, etc.) and traffic density. - At
step 312, the processing system may generate the first signature from the data collected instep 310. For example, the first signature may comprise a PUF. - At
step 314, the processing system may send the first signature to the HIMC server using a secure communications protocol, such as two-way TLS, as discussed above. - At optional step 316 (illustrated in phantom), the processing system may receive an alert from the HIMC indicating the status of the integrated circuit chip. For instance, the alert may indicate that the integrated circuit chip is believed to be free from tampering or that the integrated circuit chip may have been tampered with. In one example, however, no alert is sent by the HIMC server as long as the integrated circuit chip is believed to be free from tampering (e.g., an alert is only received in
step 316 if tampering is suspected). - At
step 318, the method 300 may end. - It should be noted that the
steps of the methods illustrated in FIG. 2 or FIG. 3 that recite a determining operation or involve a decision do not necessarily require that both branches of the determining operation be practiced. In other words, one of the branches of the determining operation can be deemed as an optional step. Furthermore, steps, blocks, functions or operations of the above described method can be combined, separated, and/or performed in a different order from that described above, without departing from the examples of the present disclosure. -
FIG. 4 depicts a high-level block diagram of a computing device or processing system specifically programmed to perform the functions described herein. As depicted in FIG. 4, the processing system 400 comprises one or more hardware processor elements 402 (e.g., a central processing unit (CPU), a microprocessor, or a multi-core processor), a memory 404 (e.g., random access memory (RAM) and/or read only memory (ROM)), a module 405 for protecting computing devices against malicious tampering, and various input/output devices 406 (e.g., storage devices, including but not limited to, a tape drive, a floppy drive, a hard disk drive or a compact disk drive, a receiver, a transmitter, a speaker, a display, a speech synthesizer, an output port, an input port and a user input device (such as a keyboard, a keypad, a mouse, a microphone and the like)). Although only one processor element is shown, it should be noted that the computing device may employ a plurality of processor elements. Furthermore, although only one computing device is shown in the figure, if the method discussed above is implemented in a distributed or parallel manner, i.e., if the steps of the above method or the entire method are implemented across multiple or parallel computing devices, then the computing device of the figure is intended to represent each of those multiple computing devices. - Furthermore, one or more hardware processors can be utilized in supporting a virtualized or shared computing environment. The virtualized computing environment may support one or more virtual machines representing computers, servers, or other computing devices. In such virtual machines, hardware components such as hardware processors and computer-readable storage devices may be virtualized or logically represented. The
hardware processor 402 can also be configured or programmed to cause other devices to perform one or more operations as discussed above. In other words, the hardware processor 402 may serve the function of a central controller directing other devices to perform the one or more operations as discussed above. - It should be noted that the present disclosure can be implemented in software and/or in a combination of software and hardware, e.g., using application specific integrated circuits (ASIC), a programmable gate array (PGA) including a Field PGA, or a state machine deployed on a hardware device, a computing device or any other hardware equivalents, e.g., computer readable instructions pertaining to the method discussed above can be used to configure a hardware processor to perform the steps, functions and/or operations of the above disclosed
method. In one example, instructions and data for the present module or process 405 for protecting computing devices against malicious tampering (e.g., a software program comprising computer-executable instructions) can be loaded into memory 404 and executed by hardware processor element 402 to implement the steps, functions, or operations as discussed above in connection with the illustrative method. - The processor executing the computer readable or software instructions relating to the above described method can be perceived as a programmed processor or a specialized processor. As such, the
present module 405 for protecting computing devices against malicious tampering (including associated data structures) of the present disclosure can be stored on a tangible or physical (broadly non-transitory) computer-readable storage device or medium, e.g., volatile memory, non-volatile memory, ROM memory, RAM memory, magnetic or optical drive, device or diskette, and the like. Furthermore, a “tangible” computer-readable storage device or medium comprises a physical device, a hardware device, or a device that is discernible by the touch. More specifically, the computer-readable storage device may comprise any physical devices that provide the ability to store information such as data and/or instructions to be accessed by a processor or a computing device such as a computer or an application server. - While various examples have been described above, it should be understood that they have been presented by way of illustration only, and not a limitation. Thus, the breadth and scope of any aspect of the present disclosure should not be limited by any of the above-described examples, but should be defined only in accordance with the following claims and their equivalents.
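The two-path timing comparison described in connection with step 310 resembles an arbiter-style PUF, and can be simulated to show how fabrication variation yields a chip-unique signature. The model below is a simplification for illustration only; the stage structure and delay distributions are assumptions, not details from the disclosure:

```python
import random

def make_chip(n_stages, seed):
    """Model one chip's path delays; the per-chip randomness stands in
    for process variation during fabrication."""
    rng = random.Random(seed)
    # Each stage contributes four delays: (top, bottom) x (straight, crossed).
    return [tuple(rng.uniform(0.9, 1.1) for _ in range(4))
            for _ in range(n_stages)]

def respond(chip, challenge):
    """Excite both paths simultaneously; the response bit records which
    path's accumulated delay is smaller (the timing comparison)."""
    top = bottom = 0.0
    for (t_straight, b_straight, t_cross, b_cross), bit in zip(chip, challenge):
        if bit == 0:                       # straight stage
            top, bottom = top + t_straight, bottom + b_straight
        else:                              # crossed stage: the paths swap
            top, bottom = bottom + t_cross, top + b_cross
    return 1 if top < bottom else 0

def signature(chip, challenges):
    """A signature is the chip's response to a fixed set of challenges."""
    return tuple(respond(chip, c) for c in challenges)
```

Because the delays are fixed per chip, repeating the same challenges reproduces the same signature, which is what the HIMC server compares against its stored copy.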
Claims (20)
1. A method comprising:
sending, by a processing system of a server, an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit chip;
receiving, by the processing system, a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge;
comparing, by the processing system, the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device; and
generating, by the processing system, an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
2. The method of claim 1 , wherein the instruction is sent in response to the remote computing device being powered on.
3. The method of claim 1 , wherein the instruction is sent periodically.
4. The method of claim 1 , wherein each of the first signature and the second signature comprises a physical unclonable function.
5. The method of claim 4 , wherein the challenge comprises simultaneously exciting two paths through the integrated circuit chip and comparing respective delays on the two paths.
6. The method of claim 1 , wherein the first signature is sent using a secure communications protocol.
7. The method of claim 1 , further comprising:
sending, by the processing system, the alert to a human administrator.
8. The method of claim 1 , wherein the controller is in communication with a plurality of sensors on the integrated circuit chip, and the plurality of sensors comprises different types of sensors configured to monitor different physical conditions of the integrated circuit chip during the challenge.
9. The method of claim 8 , wherein at least one sensor of the plurality of sensors measures a temperature of the integrated circuit chip.
10. The method of claim 8 , wherein at least one sensor of the plurality of sensors measures a supply voltage of the integrated circuit chip.
11. The method of claim 8 , wherein at least one sensor of the plurality of sensors measures an electro-magnetic interference of the integrated circuit chip.
12. A non-transitory computer-readable medium storing instructions which, when executed by a processing system of a server, cause the processing system to perform operations, the operations comprising:
sending an instruction to a controller installed on an integrated circuit chip of a remote computing device, wherein the instruction requests that the controller issue a challenge to the integrated circuit chip;
receiving a first signature of the integrated circuit chip from the controller, wherein the first signature is derived by the controller from a response of the integrated circuit chip to the challenge;
comparing the first signature to a second signature that is stored on the server, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the remote computing device; and
generating an alert when the first signature fails to match the second signature, wherein the alert indicates that the integrated circuit chip may have been tampered with.
13. The non-transitory computer-readable medium of claim 12 , wherein each of the first signature and the second signature comprises a physical unclonable function.
14. The non-transitory computer-readable medium of claim 13 , wherein the challenge comprises simultaneously exciting two paths through the integrated circuit chip and comparing respective delays on the two paths.
15. A system deployed on an integrated circuit chip of a computing device, comprising:
a plurality of sensors to monitor a plurality of physical conditions of the integrated circuit chip; and
a controller communicatively coupled to the plurality of sensors to issue a challenge to the integrated circuit chip and to derive a first signature for the integrated circuit chip from a response of the integrated circuit chip to the challenge, wherein the controller is further communicatively coupled to a remote server that stores a second signature for the integrated circuit chip, wherein the second signature was derived through testing of the integrated circuit chip prior to the integrated circuit chip being deployed in the computing device.
16. The system of claim 15 , wherein each of the first signature and the second signature comprises a physical unclonable function.
17. The system of claim 15 , wherein the controller is housed in a tamper-resistant enclosure.
18. The system of claim 15 , wherein at least one sensor of the plurality of sensors measures a temperature of the integrated circuit chip.
19. The system of claim 15 , wherein at least one sensor of the plurality of sensors measures a supply voltage of the integrated circuit chip.
20. The system of claim 15 , wherein at least one sensor of the plurality of sensors measures an electro-magnetic interference of the integrated circuit chip.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/688,757 US20220198008A1 (en) | 2019-07-01 | 2022-03-07 | Protecting computing devices from malicious tampering |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/459,043 US11269999B2 (en) | 2019-07-01 | 2019-07-01 | Protecting computing devices from malicious tampering |
US17/688,757 US20220198008A1 (en) | 2019-07-01 | 2022-03-07 | Protecting computing devices from malicious tampering |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/459,043 Continuation US11269999B2 (en) | 2019-07-01 | 2019-07-01 | Protecting computing devices from malicious tampering |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220198008A1 true US20220198008A1 (en) | 2022-06-23 |
Family
ID=74066507
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/459,043 Active 2040-05-07 US11269999B2 (en) | 2019-07-01 | 2019-07-01 | Protecting computing devices from malicious tampering |
US17/688,757 Abandoned US20220198008A1 (en) | 2019-07-01 | 2022-03-07 | Protecting computing devices from malicious tampering |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/459,043 Active 2040-05-07 US11269999B2 (en) | 2019-07-01 | 2019-07-01 | Protecting computing devices from malicious tampering |
Country Status (1)
Country | Link |
---|---|
US (2) | US11269999B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11733767B2 (en) * | 2021-06-25 | 2023-08-22 | Qualcomm Incorporated | Power management for multiple-chiplet systems |
US11593490B2 (en) * | 2021-07-28 | 2023-02-28 | Dell Products, L.P. | System and method for maintaining trusted execution in an untrusted computing environment using a secure communication channel |
Citations (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5254942A (en) * | 1991-04-25 | 1993-10-19 | Daniel D'Souza | Single chip IC tester architecture |
US20030204743A1 (en) * | 2002-04-16 | 2003-10-30 | Srinivas Devadas | Authentication of integrated circuits |
US20040030912A1 (en) * | 2001-05-09 | 2004-02-12 | Merkle James A. | Systems and methods for the prevention of unauthorized use and manipulation of digital content |
US20040181303A1 (en) * | 2002-12-02 | 2004-09-16 | Silverbrook Research Pty Ltd | Relatively unique ID in integrated circuit |
US20060273438A1 (en) * | 2005-06-03 | 2006-12-07 | International Business Machines Corporation | Stacked chip security |
US20070169183A1 (en) * | 1998-10-13 | 2007-07-19 | Nds Limited | Remote administration of smart cards for secure access systems |
US20080034334A1 (en) * | 2004-02-17 | 2008-02-07 | Oussama Laouamri | Integrated Circuit Chip with Communication Means Enabling Remote Control of Testing Means of Ip Cores of the Integrated Circuit |
US20090320110A1 (en) * | 2008-06-23 | 2009-12-24 | Nicolson Kenneth Alexander | Secure boot with optional components method |
US20100031065A1 (en) * | 2006-11-06 | 2010-02-04 | Yuichi Futa | Information security apparatus |
US20100073147A1 (en) * | 2006-12-06 | 2010-03-25 | Koninklijke Philips Electronics N.V. | Controlling data access to and from an rfid device |
US20100127822A1 (en) * | 2008-11-21 | 2010-05-27 | Verayo, Inc. | Non-networked rfid-puf authentication |
US20100241864A1 (en) * | 2008-11-21 | 2010-09-23 | Dafca, Inc. | Authenticating an integrated circuit based on stored information |
US20120131340A1 (en) * | 2010-11-19 | 2012-05-24 | Philippe Teuwen | Enrollment of Physically Unclonable Functions |
US20120158339A1 (en) * | 2010-12-21 | 2012-06-21 | Stmicroelectronics Pvt. Ltd. | Calibration arrangement |
US20120233685A1 (en) * | 2011-03-09 | 2012-09-13 | Qualcomm Incorporated | Method for authentication of a remote station using a secure element |
US20130047209A1 (en) * | 2010-03-24 | 2013-02-21 | National Institute Of Advanced Industrial Science And Technology | Authentication processing method and apparatus |
US20130082733A1 (en) * | 2010-06-07 | 2013-04-04 | Mitsubishi Electric Corporation | Signal processing system |
US20130133031A1 (en) * | 2011-11-22 | 2013-05-23 | International Business Machines Corporation | Retention Based Intrinsic Fingerprint Identification Featuring A Fuzzy Algorithm and a Dynamic Key |
US20130141137A1 (en) * | 2011-06-01 | 2013-06-06 | ISC8 Inc. | Stacked Physically Uncloneable Function Sense and Respond Module |
US20130145467A1 (en) * | 2002-12-12 | 2013-06-06 | Victor J. Yodaiken | Systems and methods for detecting a security breach in a computer system |
US20130198873A1 (en) * | 2012-01-27 | 2013-08-01 | International Business Machines Corporation | Chip authentication using scan chains |
US20130202107A1 (en) * | 2010-01-18 | 2013-08-08 | Institut Telecom-Telecom Paris Tech | Integrated Silicon Circuit Comprising a Physicallly Non-Reproducible Function, and Method and System for Testing Such a Circuit |
US20130307578A1 (en) * | 2012-05-15 | 2013-11-21 | Nxp B.V. | Tamper resistant ic |
US20140025944A1 (en) * | 2012-07-19 | 2014-01-23 | Atmel Corporation | Secure Storage and Signature |
US20140108786A1 (en) * | 2011-03-11 | 2014-04-17 | Emsycon Gmbh | Tamper-protected hardware and method for using same |
US20140103344A1 (en) * | 2012-03-12 | 2014-04-17 | Mohammad Tehranipoor | Detection of recovered integrated circuits |
US20140156998A1 (en) * | 2012-11-30 | 2014-06-05 | Certicom Corp. | Challenge-Response Authentication Using a Masked Response Value |
US20140327468A1 (en) * | 2013-05-03 | 2014-11-06 | International Business Machines Corporation | Physical unclonable function generation and management |
US20140340112A1 (en) * | 2013-03-15 | 2014-11-20 | University Of Connecticut | Methods and systems for hardware piracy prevention |
US20150007337A1 (en) * | 2013-07-01 | 2015-01-01 | Christian Krutzik | Solid State Drive Physical Uncloneable Function Erase Verification Device and Method |
US20150046715A1 (en) * | 2013-08-06 | 2015-02-12 | Ologn Technologies Ag | Systems, Methods and Apparatuses for Prevention of Unauthorized Cloning of a Device |
US20150143545A1 (en) * | 2012-05-25 | 2015-05-21 | Rainer Falk | Function for the Challenge Derivation for Protecting Components in a Challenge-Response Authentication Protocol |
US9071638B1 (en) * | 2004-04-01 | 2015-06-30 | Fireeye, Inc. | System and method for malware containment |
US20150192637A1 (en) * | 2012-07-17 | 2015-07-09 | Siemens Aktiengesellschaft | Use of a (Digital) PUF for Implementing Physical Degradation/Tamper Recognition for a Digital IC |
US20150229477A1 (en) * | 2014-02-10 | 2015-08-13 | Ims Health Incorporated | System and method for remote access, remote digital signature |
US9129536B2 (en) * | 2012-08-31 | 2015-09-08 | Freescale Semiconductor, Inc. | Circuit for secure provisioning in an untrusted environment |
US20160047855A1 (en) * | 2014-08-15 | 2016-02-18 | Case Western Reserve University | Pcb authentication and counterfeit detection |
US20160173289A1 (en) * | 2014-12-15 | 2016-06-16 | Honeywell International Inc. | Physically uncloneable function device using mram |
US20160357176A1 (en) * | 2015-06-02 | 2016-12-08 | Rockwell Automation Technologies, Inc. | Security System for Industrial Control Infrastructure |
US20170005811A1 (en) * | 2015-06-30 | 2017-01-05 | Maxim Integrated Products, Inc. | Systems and methods for authentication based on physically unclonable functions |
US20170091438A1 (en) * | 2015-09-25 | 2017-03-30 | Intel Corporation | Secure authentication protocol systems and methods |
US20170155389A1 (en) * | 2015-12-01 | 2017-06-01 | Semiconductor Manufacturing International (Beijing) Corpor | Physically unclonable product and fabrication method thereof |
US9740888B1 (en) * | 2014-02-07 | 2017-08-22 | Seagate Technology Llc | Tamper evident detection |
US20170329954A1 (en) * | 2016-05-13 | 2017-11-16 | Regents Of The University Of Minnesota | Robust device authentication |
US20170366553A1 (en) * | 2016-06-16 | 2017-12-21 | Ca, Inc. | Restricting access to content based on a posterior probability that a terminal signature was received from a previously unseen computer terminal |
US20180013779A1 (en) * | 2016-07-06 | 2018-01-11 | Power Fingerprinting Inc. | Methods and apparatuses for integrity validation of remote devices using side-channel information in a power signature analysis |
US20180019872A1 (en) * | 2016-06-03 | 2018-01-18 | Chronicled, Inc. | Open registry for internet of things including sealed materials |
US9876645B1 (en) * | 2015-02-17 | 2018-01-23 | Amazon Technologies, Inc. | Tamper detection for hardware devices |
US9892293B1 (en) * | 2016-12-16 | 2018-02-13 | Square, Inc. | Tamper detection system |
US9934411B2 (en) * | 2015-07-13 | 2018-04-03 | Texas Instruments Incorporated | Apparatus for physically unclonable function (PUF) for a memory array |
US20180102907A1 (en) * | 2016-10-07 | 2018-04-12 | Taiwan Semiconductor Manufacturing Co., Ltd. | Sram-based authentication circuit |
US9948470B2 (en) * | 2013-08-23 | 2018-04-17 | Qualcomm Incorporated | Applying circuit delay-based physically unclonable functions (PUFs) for masking operation of memory-based PUFs to resist invasive and clone attacks |
US20180167391A1 (en) * | 2016-12-14 | 2018-06-14 | The Boeing Company | Authenticating an aircraft data exchange using detected differences of onboard electronics |
US10002362B1 (en) * | 2016-12-21 | 2018-06-19 | Merck Patent Gmbh | Composite security marking |
US20180183613A1 (en) * | 2015-07-01 | 2018-06-28 | Secure-Ic Sas | Embedded test circuit for physically unclonable function |
US10019571B2 (en) * | 2016-03-13 | 2018-07-10 | Winbond Electronics Corporation | Protection from side-channel attacks by varying clock delays |
US20180225459A1 (en) * | 2015-04-14 | 2018-08-09 | Capital One Services, Llc | System and methods for secure firmware validation |
US10107855B1 (en) * | 2014-11-07 | 2018-10-23 | Xilinx, Inc. | Electromagnetic verification of integrated circuits |
US10127409B1 (en) * | 2016-12-16 | 2018-11-13 | Square, Inc. | Tamper detection system |
US10132858B2 (en) * | 2013-07-29 | 2018-11-20 | Nxp B.V. | PUF method using and circuit having an array of bipolar transistors |
US20190028284A1 (en) * | 2017-07-18 | 2019-01-24 | Square, Inc. | Devices with modifiable physically unclonable functions |
US20190028283A1 (en) * | 2017-07-18 | 2019-01-24 | Square, Inc | Device security with physically unclonable functions |
US20190026457A1 (en) * | 2016-01-11 | 2019-01-24 | Stc.Unm | A privacy-preserving, mutual puf-based authentication protocol |
US20190334730A1 (en) * | 2018-04-30 | 2019-10-31 | Merck Patent Gmbh | Composite security marking and methods and apparatuses for providing and reading same |
US20190378575A1 (en) * | 2018-06-08 | 2019-12-12 | Taiwan Semiconductor Manufacturing Co., Ltd. | Method and apparatus for puf generator characterization |
US20190384915A1 (en) * | 2017-12-22 | 2019-12-19 | The Boeing Company | Countermeasures to frequency alteration attacks on ring oscillator based physical unclonable functions |
US10554405B1 (en) * | 2018-12-20 | 2020-02-04 | Merck Patent Gmbh | Methods and systems for preparing and performing an object authentication |
US10587421B2 (en) * | 2017-01-12 | 2020-03-10 | Honeywell International Inc. | Techniques for genuine device assurance by establishing identity and trust using certificates |
US20200089866A1 (en) * | 2015-12-02 | 2020-03-19 | Power Fingerprinting Inc. | Methods and apparatuses for validating supply chain for electronic devices using side-channel information in a signature analysis |
US20200099541A1 (en) * | 2017-02-10 | 2020-03-26 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods of verifying that a first device and a second device are physically interconnected |
US20200344077A1 (en) * | 2019-04-24 | 2020-10-29 | International Business Machines Corporation | On-chipset certification to prevent spy chip |
US20200351108A1 (en) * | 2018-01-19 | 2020-11-05 | Renesas Electronics Corporation | Semiconductor device, update data-providing method, update data-receiving method, and program |
US10877531B2 (en) * | 2015-08-03 | 2020-12-29 | Texas Instruments Incorporated | Methods and apparatus to create a physically unclonable function |
US20210148977A1 (en) * | 2019-11-14 | 2021-05-20 | University Of Florida Research Foundation, Inc. | Side-channel signature based pcb authentication using jtag architecture and a challenge-response mechanism |
US20210258174A1 (en) * | 2018-10-17 | 2021-08-19 | Nokia Solutions And Networks Oy | Secure cryptoprocessor |
US20220147974A1 (en) * | 2018-03-13 | 2022-05-12 | Ethernom, Inc. | Secure tamper resistant smart card |
- 2019
  - 2019-07-01 US US16/459,043 patent/US11269999B2/en active Active
- 2022
  - 2022-03-07 US US17/688,757 patent/US20220198008A1/en not_active Abandoned
Patent Citations (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5254942A (en) * | 1991-04-25 | 1993-10-19 | Daniel D'Souza | Single chip IC tester architecture |
US20070169183A1 (en) * | 1998-10-13 | 2007-07-19 | Nds Limited | Remote administration of smart cards for secure access systems |
US20040030912A1 (en) * | 2001-05-09 | 2004-02-12 | Merkle James A. | Systems and methods for the prevention of unauthorized use and manipulation of digital content |
US20030204743A1 (en) * | 2002-04-16 | 2003-10-30 | Srinivas Devadas | Authentication of integrated circuits |
US20040181303A1 (en) * | 2002-12-02 | 2004-09-16 | Silverbrook Research Pty Ltd | Relatively unique ID in integrated circuit |
US20130145467A1 (en) * | 2002-12-12 | 2013-06-06 | Victor J. Yodaiken | Systems and methods for detecting a security breach in a computer system |
US20080034334A1 (en) * | 2004-02-17 | 2008-02-07 | Oussama Laouamri | Integrated Circuit Chip with Communication Means Enabling Remote Control of Testing Means of Ip Cores of the Integrated Circuit |
US9071638B1 (en) * | 2004-04-01 | 2015-06-30 | Fireeye, Inc. | System and method for malware containment |
US20060273438A1 (en) * | 2005-06-03 | 2006-12-07 | International Business Machines Corporation | Stacked chip security |
US20100031065A1 (en) * | 2006-11-06 | 2010-02-04 | Yuichi Futa | Information security apparatus |
US20100073147A1 (en) * | 2006-12-06 | 2010-03-25 | Koninklijke Philips Electronics N.V. | Controlling data access to and from an rfid device |
US20090320110A1 (en) * | 2008-06-23 | 2009-12-24 | Nicolson Kenneth Alexander | Secure boot with optional components method |
US20100127822A1 (en) * | 2008-11-21 | 2010-05-27 | Verayo, Inc. | Non-networked rfid-puf authentication |
US20100241864A1 (en) * | 2008-11-21 | 2010-09-23 | Dafca, Inc. | Authenticating an integrated circuit based on stored information |
US20130202107A1 (en) * | 2010-01-18 | 2013-08-08 | Institut Telecom-Telecom Paris Tech | Integrated Silicon Circuit Comprising a Physicallly Non-Reproducible Function, and Method and System for Testing Such a Circuit |
US20130047209A1 (en) * | 2010-03-24 | 2013-02-21 | National Institute Of Advanced Industrial Science And Technology | Authentication processing method and apparatus |
US20130082733A1 (en) * | 2010-06-07 | 2013-04-04 | Mitsubishi Electric Corporation | Signal processing system |
US20120131340A1 (en) * | 2010-11-19 | 2012-05-24 | Philippe Teuwen | Enrollment of Physically Unclonable Functions |
US20120158339A1 (en) * | 2010-12-21 | 2012-06-21 | Stmicroelectronics Pvt. Ltd. | Calibration arrangement |
US20120233685A1 (en) * | 2011-03-09 | 2012-09-13 | Qualcomm Incorporated | Method for authentication of a remote station using a secure element |
US20160359636A1 (en) * | 2011-03-11 | 2016-12-08 | Emsycon Gmbh | Tamper-protected hardware and method for using same |
US20140108786A1 (en) * | 2011-03-11 | 2014-04-17 | Emsycon Gmbh | Tamper-protected hardware and method for using same |
US20130141137A1 (en) * | 2011-06-01 | 2013-06-06 | ISC8 Inc. | Stacked Physically Uncloneable Function Sense and Respond Module |
US20130133031A1 (en) * | 2011-11-22 | 2013-05-23 | International Business Machines Corporation | Retention Based Intrinsic Fingerprint Identification Featuring A Fuzzy Algorithm and a Dynamic Key |
US20130198873A1 (en) * | 2012-01-27 | 2013-08-01 | International Business Machines Corporation | Chip authentication using scan chains |
US20140103344A1 (en) * | 2012-03-12 | 2014-04-17 | Mohammad Tehranipoor | Detection of recovered integrated circuits |
US9509306B2 (en) * | 2012-05-15 | 2016-11-29 | Nxp B.V. | Tamper resistant IC |
US20130307578A1 (en) * | 2012-05-15 | 2013-11-21 | Nxp B.V. | Tamper resistant ic |
US20150143545A1 (en) * | 2012-05-25 | 2015-05-21 | Rainer Falk | Function for the Challenge Derivation for Protecting Components in a Challenge-Response Authentication Protocol |
US20150192637A1 (en) * | 2012-07-17 | 2015-07-09 | Siemens Aktiengesellschaft | Use of a (Digital) PUF for Implementing Physical Degradation/Tamper Recognition for a Digital IC |
US20140025944A1 (en) * | 2012-07-19 | 2014-01-23 | Atmel Corporation | Secure Storage and Signature |
US9129536B2 (en) * | 2012-08-31 | 2015-09-08 | Freescale Semiconductor, Inc. | Circuit for secure provisioning in an untrusted environment |
US20140156998A1 (en) * | 2012-11-30 | 2014-06-05 | Certicom Corp. | Challenge-Response Authentication Using a Masked Response Value |
US20140340112A1 (en) * | 2013-03-15 | 2014-11-20 | University Of Connecticut | Methods and systems for hardware piracy prevention |
US20140327468A1 (en) * | 2013-05-03 | 2014-11-06 | International Business Machines Corporation | Physical unclonable function generation and management |
US20150007337A1 (en) * | 2013-07-01 | 2015-01-01 | Christian Krutzik | Solid State Drive Physical Uncloneable Function Erase Verification Device and Method |
US10132858B2 (en) * | 2013-07-29 | 2018-11-20 | Nxp B.V. | PUF method using and circuit having an array of bipolar transistors |
US20150046715A1 (en) * | 2013-08-06 | 2015-02-12 | Ologn Technologies Ag | Systems, Methods and Apparatuses for Prevention of Unauthorized Cloning of a Device |
US9948470B2 (en) * | 2013-08-23 | 2018-04-17 | Qualcomm Incorporated | Applying circuit delay-based physically unclonable functions (PUFs) for masking operation of memory-based PUFs to resist invasive and clone attacks |
US9740888B1 (en) * | 2014-02-07 | 2017-08-22 | Seagate Technology Llc | Tamper evident detection |
US20150229477A1 (en) * | 2014-02-10 | 2015-08-13 | Ims Health Incorporated | System and method for remote access, remote digital signature |
US20160047855A1 (en) * | 2014-08-15 | 2016-02-18 | Case Western Reserve University | Pcb authentication and counterfeit detection |
US10107855B1 (en) * | 2014-11-07 | 2018-10-23 | Xilinx, Inc. | Electromagnetic verification of integrated circuits |
US20160173289A1 (en) * | 2014-12-15 | 2016-06-16 | Honeywell International Inc. | Physically uncloneable function device using mram |
US9876645B1 (en) * | 2015-02-17 | 2018-01-23 | Amazon Technologies, Inc. | Tamper detection for hardware devices |
US20180159690A1 (en) * | 2015-02-17 | 2018-06-07 | Amazon Technologies, Inc. | Tamper detection for hardware devices |
US20180225459A1 (en) * | 2015-04-14 | 2018-08-09 | Capital One Services, Llc | System and methods for secure firmware validation |
US20160357176A1 (en) * | 2015-06-02 | 2016-12-08 | Rockwell Automation Technologies, Inc. | Security System for Industrial Control Infrastructure |
US20170005811A1 (en) * | 2015-06-30 | 2017-01-05 | Maxim Integrated Products, Inc. | Systems and methods for authentication based on physically unclonable functions |
US20180183613A1 (en) * | 2015-07-01 | 2018-06-28 | Secure-Ic Sas | Embedded test circuit for physically unclonable function |
US9934411B2 (en) * | 2015-07-13 | 2018-04-03 | Texas Instruments Incorporated | Apparatus for physically unclonable function (PUF) for a memory array |
US10877531B2 (en) * | 2015-08-03 | 2020-12-29 | Texas Instruments Incorporated | Methods and apparatus to create a physically unclonable function |
US20170091438A1 (en) * | 2015-09-25 | 2017-03-30 | Intel Corporation | Secure authentication protocol systems and methods |
US20170155389A1 (en) * | 2015-12-01 | 2017-06-01 | Semiconductor Manufacturing International (Beijing) Corporation | Physically unclonable product and fabrication method thereof |
US20200089866A1 (en) * | 2015-12-02 | 2020-03-19 | Power Fingerprinting Inc. | Methods and apparatuses for validating supply chain for electronic devices using side-channel information in a signature analysis |
US20190026457A1 (en) * | 2016-01-11 | 2019-01-24 | Stc.Unm | A privacy-preserving, mutual puf-based authentication protocol |
US10019571B2 (en) * | 2016-03-13 | 2018-07-10 | Winbond Electronics Corporation | Protection from side-channel attacks by varying clock delays |
US20170329954A1 (en) * | 2016-05-13 | 2017-11-16 | Regents Of The University Of Minnesota | Robust device authentication |
US20180019872A1 (en) * | 2016-06-03 | 2018-01-18 | Chronicled, Inc. | Open registry for internet of things including sealed materials |
US20170366553A1 (en) * | 2016-06-16 | 2017-12-21 | Ca, Inc. | Restricting access to content based on a posterior probability that a terminal signature was received from a previously unseen computer terminal |
US20180013779A1 (en) * | 2016-07-06 | 2018-01-11 | Power Fingerprinting Inc. | Methods and apparatuses for integrity validation of remote devices using side-channel information in a power signature analysis |
US20180102907A1 (en) * | 2016-10-07 | 2018-04-12 | Taiwan Semiconductor Manufacturing Co., Ltd. | Sram-based authentication circuit |
US20180167391A1 (en) * | 2016-12-14 | 2018-06-14 | The Boeing Company | Authenticating an aircraft data exchange using detected differences of onboard electronics |
US9892293B1 (en) * | 2016-12-16 | 2018-02-13 | Square, Inc. | Tamper detection system |
US10127409B1 (en) * | 2016-12-16 | 2018-11-13 | Square, Inc. | Tamper detection system |
US10002362B1 (en) * | 2016-12-21 | 2018-06-19 | Merck Patent Gmbh | Composite security marking |
US10587421B2 (en) * | 2017-01-12 | 2020-03-10 | Honeywell International Inc. | Techniques for genuine device assurance by establishing identity and trust using certificates |
US20200099541A1 (en) * | 2017-02-10 | 2020-03-26 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods of verifying that a first device and a second device are physically interconnected |
US20190028284A1 (en) * | 2017-07-18 | 2019-01-24 | Square, Inc. | Devices with modifiable physically unclonable functions |
US20190028283A1 (en) * | 2017-07-18 | 2019-01-24 | Square, Inc. | Device security with physically unclonable functions |
US20190384915A1 (en) * | 2017-12-22 | 2019-12-19 | The Boeing Company | Countermeasures to frequency alteration attacks on ring oscillator based physical unclonable functions |
US20200351108A1 (en) * | 2018-01-19 | 2020-11-05 | Renesas Electronics Corporation | Semiconductor device, update data-providing method, update data-receiving method, and program |
US20220147974A1 (en) * | 2018-03-13 | 2022-05-12 | Ethernom, Inc. | Secure tamper resistant smart card |
US20190334730A1 (en) * | 2018-04-30 | 2019-10-31 | Merck Patent Gmbh | Composite security marking and methods and apparatuses for providing and reading same |
US20190378575A1 (en) * | 2018-06-08 | 2019-12-12 | Taiwan Semiconductor Manufacturing Co., Ltd. | Method and apparatus for puf generator characterization |
US20210258174A1 (en) * | 2018-10-17 | 2021-08-19 | Nokia Solutions And Networks Oy | Secure cryptoprocessor |
US10554405B1 (en) * | 2018-12-20 | 2020-02-04 | Merck Patent Gmbh | Methods and systems for preparing and performing an object authentication |
US20200344077A1 (en) * | 2019-04-24 | 2020-10-29 | International Business Machines Corporation | On-chipset certification to prevent spy chip |
US20210148977A1 (en) * | 2019-11-14 | 2021-05-20 | University Of Florida Research Foundation, Inc. | Side-channel signature based pcb authentication using jtag architecture and a challenge-response mechanism |
Also Published As
Publication number | Publication date |
---|---|
US20210004462A1 (en) | 2021-01-07 |
US11269999B2 (en) | 2022-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11153101B2 (en) | Scalable certificate management system architectures | |
US11258769B2 (en) | Provisioning network keys to devices to allow them to provide their identity | |
US10425282B2 (en) | Verifying a network configuration | |
US10999072B2 (en) | Trusted remote proving method, apparatus and system | |
US20180278584A1 (en) | Authenticating Access Configuration for Application Programming Interfaces | |
US20220198008A1 (en) | Protecting computing devices from malicious tampering | |
US11108803B2 (en) | Determining security vulnerabilities in application programming interfaces | |
US10608996B2 (en) | Trust status of a communication session | |
US10715547B2 (en) | Detecting “man-in-the-middle” attacks | |
TW201939922A (en) | Policy Deployment Method, Apparatus, System and Computing System of Trusted Server | |
US11252193B2 (en) | Attestation service for enforcing payload security policies in a data center | |
CN111866044A (en) | Data acquisition method, device, equipment and computer readable storage medium | |
CN112134692B (en) | Remote certification mode negotiation method and device | |
CN109587134B (en) | Method, apparatus, device and medium for secure authentication of interface bus | |
Tseng et al. | A comprehensive 3-dimensional security analysis of a controller in software-defined networking | |
US10979297B1 (en) | Network inventory reporting device | |
US20210195418A1 (en) | A technique for authenticating data transmitted over a cellular network | |
US20230153429A1 (en) | Method and Device for Identifying Malicious Services in a Network | |
US20240106842A1 (en) | Implementing network security rules in home routers | |
US20220131856A1 (en) | Remote Attestation Method and Apparatus | |
Kohnhäuser | Advanced Remote Attestation Protocols for Embedded Systems | |
CN116074021A (en) | Access method, device, equipment and storage medium of zero trust network | |
JP2013003968A (en) | Log management device, log management method and log management program | |
Di Sarno et al. | D5.1.4 - Resilient SIEM Framework Architecture, Services and Protocols | |
JP2020028068A (en) | Communication system and communication method |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SORYAL, JOSEPH;REEL/FRAME:059189/0795 Effective date: 20190701
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION