US20070220603A1 - Data Processing Method and Device - Google Patents


Info

Publication number
US20070220603A1
US20070220603A1 (application US11/660,218)
Authority
US
United States
Prior art keywords
action
step
method according
verification
data
Legal status
Abandoned
Application number
US11/660,218
Inventor
Francis Chamberot
Current Assignee
Idemia France SAS
Original Assignee
Idemia France SAS
Priority to FR0408928 (FR2874440B1)
Application filed by Idemia France SAS filed Critical Idemia France SAS
Priority to PCT/FR2005/002083 (WO2006021686A2)
Assigned to OBERTHUR CARD SYSTEMS SA reassignment OBERTHUR CARD SYSTEMS SA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHAMBEROT, FRANCIS
Publication of US20070220603A1
Assigned to OBERTHUR TECHNOLOGIES reassignment OBERTHUR TECHNOLOGIES CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: OBERTHUR CARD SYSTEMS SA


Classifications

    • G PHYSICS — G06 COMPUTING; CALCULATING; COUNTING — G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55 Detecting local intrusion or implementing counter-measures
    • G06F21/552 Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting

Abstract

The invention concerns a data processing method comprising a step (E308) of verifying a criterion indicative of the normal running of the method and a processing step (E320) performed in the case of negative verification. The processing step (E320) is separated from the verification step (E308) by an intermediate step (E312, E314) of non-null duration. The intermediate step (E312, E314) and/or the processing step (E320) includes at least one action (E314) performed in the case of positive verification. The invention also concerns a corresponding device.

Description

    DATA PROCESSING METHOD AND DEVICE
  • The present invention concerns a method of processing data, used for example in a microcircuit card.
  • In certain contexts, one seeks to render secure the operation of data processing apparatus. This is in particular the case in the field of electronic payment ("monetics"), in which an electronic entity (for example a microcircuit card) carries information representing a pecuniary value and which can therefore be modified only in accordance with a particular protocol. It may equally be a question of an electronic entity for identifying its carrier, in which case operation must be rendered secure to prevent any falsification or abusive use.
  • One such electronic entity is for example a bank card, a telephone SIM card (the acronym SIM stemming from the English Subscriber Identity Module), an electronic passport, a secure module of the HSM type (from the English Hardware Security Module) such as a PCMCIA card of the IBM4758 type, without these examples being limiting.
  • In order to make operation more secure, one seeks to be protected against the various types of attack that may be envisaged. One large category of attacks to be combated consists of attacks known as fault generation attacks, during which malicious persons seek to cause the data processing apparatus to depart from its normal, and thus secure, operation.
  • To parry this kind of attack, the data processing methods commonly used provide steps for verification of the normal running of the method, with the aim of detecting anomalies one possible origin whereof is a fault generation attack. If an anomaly is detected (i.e. if normal running is not verified), the anomaly is processed immediately, and this is generally called security processing. This type of processing consists in fact in a countermeasure intended to combat the attack, for example by prohibiting all subsequent operation of the data processing apparatus.
  • As indicated, the processing of the anomaly is usually thought of as following on immediately from detection, since the fact of continuing the processing in the presence of an anomaly clearly entails the risk of further degrading the operation of the data processing apparatus and therefore its security.
  • However, the inventor has noted that this ordinary thinking gives the attacker information as to the moment at which the anomaly is detected. In fact, the time of detection of the anomaly is in itself difficult to access from outside. It is nevertheless thought that the attacker, by observing and analyzing the electrical consumption (or the electromagnetic radiation) of the apparatus, can obtain access to the time of implementation of the processing of the anomaly, for example in the case where this processing consists in an action on an external device. Since according to the ordinary thinking this processing follows on immediately from the detection of the anomaly, the attacker could deduce relatively easily from this the time of detection of the anomaly.
  • Accordingly, because of the proximity of the detection of the anomaly and of the processing thereof in the usual systems, the attacker has access to additional information on the operation of the data processing apparatus, which of course compromises making the method secure.
  • In order in particular to avoid this problem, and consequently to improve further the security of the data processing methods, the invention proposes a data processing method comprising a step of verification of a criterion indicative of the normal running of the method and a processing step effected in the case of negative verification, wherein the processing step is separated from the verification step by an intermediate step of non-null duration.
  • A first action being effected in the case of positive verification, the intermediate step entails effecting at least one second action having at least one first characteristic in common with the first action.
  • An attacker seeking to understand the operation of the method will therefore have difficulties in distinguishing normal operation (positive verification) from operation in the case of an anomaly (i.e. negative verification).
  • The second action is different from the first action, for example. Thus the second action may comprise fewer risks, or even no risk, to the security of the system.
  • If a third action is effected in the case of positive verification, the processing step may entail effecting at least one fourth action having at least one second characteristic in common with the third action.
  • Thus an attacker will not be able to distinguish between the modes of operation. He will therefore not be able to prevent the processing of the anomaly.
  • If the method is implemented in electronic apparatus, the first or second common characteristic is for example the electrical consumption or the electromagnetic radiation of the apparatus generated by the first, respectively the third, action and by the second, respectively the fourth, action. Thus an attacker will not be able to distinguish the normal mode of operation from the abnormal mode of operation by observation of the electrical and/or electromagnetic behavior of the electronic apparatus.
  • The first or second common characteristic may equally be the number of instructions used in the first, respectively the third, action and in the second, respectively the fourth, action, which makes it impossible to distinguish between the modes of operation by the duration of said actions.
  • The first or second common characteristic may further be the type of instruction used by the first, respectively the third, action and by the second, respectively the fourth, action, which ensures great similarity in the electrical and/or electromagnetic signature of said actions.
  • The first or second common characteristic may also be the type of data processed by the first, respectively the third, action and by the second, respectively the fourth, action, which also ensures such similarity.
  • If the first, respectively the third, action entails access to a first area of a memory, the second, respectively the fourth, action may entail access to a second area of said memory different from the first area. The processing of the data therefore appears similar in both the modes of operation mentioned above although it is in fact effected in different contexts.
  • According to another possibility, the first or second common characteristic is communication with an external device, which may be for example a cryptoprocessor, a memory (e.g. a rewritable semiconductor memory), or a user terminal. Communication with such external devices is in fact observed by attackers and this common characteristic is therefore particularly likely to lead them into error.
  • The first action may entail a secure step, for example a cryptographic algorithm, which is thus protected against fault generation attacks.
  • The processing step entails for example writing blocking data into a physical memory.
  • According to a particularly interesting possibility, the writing of blocking data may be effected in accordance with a chronology identical to that of writing data into the physical memory during normal running of the method.
  • In one embodiment, this data represents a pecuniary value. Thus an attacker will not be able to distinguish a priori between blocking of the apparatus and an operation on the value that it represents.
  • The criterion is negative for example if an erroneous signature is provided or if an anomaly is detected.
  • The criterion may also be negative if an attack is detected. As has been stated, the invention is of particular interest in this context.
  • The criterion may equally be negative if a functional error is detected. Such processing in the case of functional error is unusual but proves interesting for enhancing the security of the system, in particular because such functional errors are very rare outside attack situations and thus reflect the probable presence of an attack.
  • According to one possible feature, the intermediate step entails at least one instruction determined during the execution of the method, for example randomly. The understanding of the operation of the system by the attacker is further complicated by this.
  • According to one possible embodiment, a microcircuit card comprises a microprocessor and the method is executed by the microprocessor.
  • The invention also proposes a data processing device comprising means for verification of a criterion indicative of the normal operation of the device and processing means used in the case of negative verification, characterized by separation means for separating the operation of the verification means from the operation of the processing means by a non-null duration.
  • According to one implementation possibility, first action means are used in the case of positive verification and the separation means have at least one first characteristic in common with the first action means.
  • According to another implementation possibility, second action means are used in the case of positive verification and the processing means have at least one second characteristic in common with the second action means.
  • This device is for example a microcircuit card.
  • The invention further proposes, in a manner that is novel in itself, a method of processing data comprising a step of verification of a criterion indicative of the normal running of the method and a processing step effected in the case of negative verification, characterized in that, a first action being effected in the case of positive verification, the processing step entails effecting at least one second action having a characteristic in common with the first action.
  • In this context, there may be provision for the second action to take place with a chronology identical to the first action in the case of normal running of the method.
  • Here the first action is for example writing data into a physical memory and the second action is for example writing blocking data into that physical memory.
  • This method may moreover have the characteristics associated with the method proposed hereinabove and the advantages that flow therefrom. Moreover, a device which comprises means for implementing the various steps of this method is proposed in the same line of thinking.
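By way of illustration only, the general pattern claimed above (verification, then an intermediate action mirroring the normal path, then deferred processing) might be sketched as follows. This is not code from the patent; all names and the checksum scheme are hypothetical:

```python
# Illustrative sketch of the claimed pattern (hypothetical names):
# processing of a failed verification is deferred behind an
# intermediate action that mirrors the action of the normal path.

BAIT_AREA = [0x00] * 4                   # dummy data, no functional role

def lock_card():
    return "LOCKED"                      # stand-in for writing blocking data

def process(data, expected_checksum):
    anomaly = sum(data) % 256 != expected_checksum  # verification step (cf. E308)

    if anomaly:
        work_buffer = list(BAIT_AREA)    # intermediate step: the same kind of
    else:                                # memory access as the normal path, so
        work_buffer = list(data)         # both branches look alike (cf. E312/E314)

    if anomaly:
        return lock_card()               # deferred processing step (cf. E320)
    return work_buffer
```

An observer timing the two branches sees the same sequence of operations in both cases; only the final, deferred step differs.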
  • Other characteristics and advantages of the present invention will become more apparent on reading the following description, given with reference to the appended drawings, in which:
  • FIG. 1 represents diagrammatically the main elements of one possible embodiment of a microcircuit card;
  • FIG. 2 represents the general physical appearance of the microcircuit card from FIG. 1;
  • FIG. 3 represents a method implemented in accordance with a first embodiment of the invention;
  • FIG. 4 represents a method implemented in accordance with a second embodiment of the invention;
  • FIG. 5 represents a method implemented in accordance with a third embodiment of the invention.
  • The microcircuit card 10 the principal elements whereof are represented in FIG. 1 includes a microprocessor 2 connected on the one hand to a random access memory (or RAM from the English Random Access Memory) 4 and on the other hand to a rewritable semiconductor memory 6, for example a read-only memory that can be erased and programmed electrically (or EEPROM from the English Electrically Erasable Programmable Read Only Memory). Alternatively, the rewritable semiconductor memory 6 could be a flash memory.
  • The memories 4, 6 are each connected to the microprocessor 2 by a bus in FIG. 1; alternatively, there could be one common bus.
  • The microcircuit card 10 also includes an interface 8 for communication with a user terminal that here takes the form of contacts one of which provides a bidirectional link with the microprocessor 2, for example. The interface 8 thus enables bidirectional communication to be set up between the microprocessor 2 and the user terminal into which the microcircuit card 10 will be inserted.
  • Thus, on insertion of the microcircuit card 10 into a user terminal, the microprocessor 2 executes a method of operation of the microcircuit card 10 in accordance with a set of instructions stored for example in a read-only memory (or ROM from the English Read-Only Memory)—not shown—or in the rewritable memory 6, which defines a computer program. This method generally includes the exchange of data with the user terminal via the interface 8 and the processing of data within the microcircuit card 10, and precisely within the microprocessor 2, possibly using data stored in the rewritable memory 6 and data stored temporarily in the random access memory 4.
  • Examples of such methods that use the invention are given hereinafter with reference to FIGS. 3 to 5.
  • FIG. 2 represents the general physical appearance of the microcircuit card 10 produced with the general shape of a rectangular parallelepiped of very small thickness.
  • The communication interface 8 provided with the contacts already mentioned is clearly apparent on the face of the microcircuit card 10 visible in FIG. 2, in the form of a rectangle inscribed within the upper face of the microcircuit card 10.
  • FIG. 3 represents a method of reading in the rewritable memory 6 such as may be used by the microcircuit card 10 shown in FIG. 1, for example thanks to the execution of a computer program within the microprocessor 2. This method is given as a first example of use of the invention.
  • Such a method is used for example if the data written in the rewritable memory (or EEPROM) 6 must be used by the microprocessor 2 during a data processing operation; the data written in rewritable memory 6 is read beforehand in order to be transferred into the random access memory 4 in which it can be easily manipulated.
  • As represented in the step E302 of FIG. 3, the method receives as input the address ADR at which it must effect the read operation in the rewritable memory (or EEPROM) 6.
  • In the step E304, the method initializes to the value 0 an error flag S.
  • The variables being correctly initialized, there follow verification steps dedicated to ensuring safe operation of the system as is generally required in the secure contexts of use of microcircuit cards.
  • Thus the step E306 proceeds to the verification of a checksum. Naturally, other verifications are possible but have not been represented in FIG. 3 to clarify the explanation of the invention.
  • The step E306 of verification of a checksum consists in verifying that the data written in rewritable memory 6 is consistent with the checksum associated with that data.
  • There is then an alternative in the step E308: if the checksum is not erroneous, i.e. the verification of the checksum is positive, normal operation continues with the step E314 described hereinafter; if on the other hand an error is detected in the checksum, i.e. the step of verification of the checksum gives a negative result, there follows the step E310.
  • It may be noted that the presence of an erroneous checksum, although it indicates in all cases abnormal operation of the microcircuit card, may have diverse origins: it may be a question of a functional error (for example an error in the content of the rewritable memory 6), but it may equally be a question of the trace of a fault generation attack.
  • In the step E310, the error flag S is updated to the value 1 to indicate that an anomaly has been detected.
  • There then follows the step E312 in which the read address ADR received as input to the method (see the step E302 described hereinabove) is replaced by the address of a “bait-area” situated in the rewritable memory 6. The bait-area is a memory area a priori different from the address received in the step E302; it is for example an address at which no reading should normally be effected in normal operation (i.e. during the normal and anomaly-free operation of the microcircuit card).
  • By way of example, these bait-areas may include data determined randomly and/or data of the same structure as the data that was initially to be read at the address ADR received in the step E302.
  • Note that the step E312 that has just been described does not constitute a step of processing of the anomaly detected in the step E308: for example, it is not a question of the transmission of a code relating to the detected anomaly, the display of a message relating to that anomaly, or a countermeasure when it is considered that the anomaly stems from a fault generation attack.
  • After the step E312 (effected like the step E310 only in the case of abnormal operation of the microcircuit card, i.e. in the case of negative verification of normal operation in the step E306), there follows the step E314 which, as indicated hereinabove, forms part of the normal operation of the microcircuit card (in the case of positive verification of the checksum as indicated with regard to the step E308).
  • The step E314 consists in transferring the data from the rewritable memory 6 to the random access memory 4 and therefore constitutes in this regard the core of the FIG. 3 method, following the initialization and verification steps.
  • Precisely, the step E314 reads in the rewritable memory 6 at the address specified by the variable ADR previously mentioned and writes the data it has read in random access memory 4 in an area usually referred to as buffer memory that is generally used to manipulate data.
  • It may be noted that, if the checksum has been verified positively in the preceding steps, the variable ADR actually points to the address in rewritable memory 6 received as input, i.e. to the data that must actually be read. On the other hand, if an erroneous checksum has been detected (i.e. the verification was negative in the preceding steps), the variable ADR points to the bait-area defined in the step E312 with the result that the step E314 will in fact transfer data from the bait-area into the buffer memory area in random access memory 4, and not effect the reading operation required by the address received in the step E302.
  • Accordingly, if an operating anomaly is detected by the negative verification of the step E306, no access is effected to the rewritable memory 6 at the address provided in normal operation, at which is generally stored data that is relatively sensitive from the security point of view. This gives protection against an attacker finding out this sensitive data if the origin of the detected anomaly is a fault generation attack.
  • Furthermore, the step E314 being effected in normal operation as after detection of an operating anomaly (although with different data), it is virtually impossible for an attacker to detect a departure from normal operation, for example by current measurements.
  • After the step E314, the method proceeds to the step E316 in which the error flag S is tested.
  • If the error flag S has the value 1, which corresponds to the situation in which the step E310 has been effected (i.e. the case of detection of an anomaly by negative verification of the checksum in the step E306), there follows the step E320 in which the anomaly previously detected is processed, for example by writing blocking (or lock) data in the rewritable memory 6.
  • Writing a lock in the rewritable memory 6 consists in writing certain data into that memory 6 that will prevent any subsequent use of the microcircuit card 10. For example, if the microcircuit card 10 is subsequently inserted into another user terminal, the microprocessor 2 of the microcircuit card 10 will detect the presence of the blocking data in the rewritable memory 6 and will not carry out any processing or exchange of information with the user terminal.
  • Writing a lock in the rewritable memory 6 constitutes a particularly effective countermeasure against a fault generation attack. This type of processing of the detected anomaly is therefore of particular interest if this anomaly stems from a fault generation attack or in situations where security must be of such a level that any operating anomaly must entail the blocking of the microcircuit card.
  • The countermeasure may equally consist in deleting confidential data, for example secret keys, from the rewritable memory 6.
  • Alternatively, the step of processing the anomaly could consist in updating a flag (stored in random access memory 4 or in rewritable memory 6) representative of the anomaly detected during the FIG. 3 method. This solution does not lead immediately to the blocking of the microcircuit card, but keeps a trace of the presence of an operating anomaly in order to analyze the problem encountered and where appropriate then to proceed to blocking the card (for example if other elements corroborate the hypothesis of a fault generation attack).
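The two processing variants just described (writing a lock versus merely keeping a trace of the anomaly) might be sketched as follows; the marker value and key names are hypothetical, not taken from the patent:

```python
# Sketch of the E320 countermeasure variants (hypothetical names):
# writing lock data that blocks any later session, or merely
# recording a trace of the anomaly for later analysis.

LOCK_MARKER = 0xDEAD

def write_lock(eeprom):
    eeprom["lock"] = LOCK_MARKER              # blocking data in rewritable memory
    return eeprom

def card_boots(eeprom):
    # on insertion into a terminal, refuse all processing if locked
    return eeprom.get("lock") != LOCK_MARKER

def record_anomaly(eeprom):
    eeprom["anomaly_flag"] = True             # softer variant: keep a trace only
    return eeprom
```

The soft variant leaves the card usable while preserving evidence that can later corroborate (or not) the hypothesis of a fault generation attack.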
  • If the step E316 previously described indicates that the error flag S does not have the value 1 (i.e. that the verification of the step E306 was positive), the method continues its normal operation with the step E318, in which the address of the buffer memory area in random access memory 4 previously mentioned is sent to the output in order to enable use of the data read in the rewritable memory 6 in subsequent operation of the microcircuit card.
  • For example, if the method is a Read Record routine as defined by the ISO 7816 standard, the content of the buffer memory area is sent in the subsequent steps.
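The complete FIG. 3 flow might be condensed into the following sketch. The addresses and data are invented for illustration; only the step structure follows the patent:

```python
# Sketch of the FIG. 3 read method (hypothetical addresses and data).
# On a bad checksum the read address is silently replaced by a bait
# address (E312); the transfer E314 then runs identically in both
# cases, and the lock is only written later (E320).

EEPROM = {0x10: [0xAA, 0xBB], 0x70: [0x00, 0x00]}  # 0x70 is the bait area
BAIT_ADDRESS = 0x70

def read_record(adr, checksum_ok):
    s = 0                        # error flag (E304)
    if not checksum_ok:          # negative verification (E306/E308)
        s = 1                    # E310
        adr = BAIT_ADDRESS       # E312: point at the bait area
    buffer = list(EEPROM[adr])   # E314: transfer, same code on both paths
    if s == 1:                   # E316: test the error flag
        return "LOCKED"          # E320: deferred processing (write lock)
    return buffer                # E318: return the buffer content
```

Note that the sensitive address is never dereferenced on the anomalous path, and the externally observable behavior of E314 is the same in both cases.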
  • FIG. 4 shows a method of reading in rewritable memory 6 carried out in accordance with a second embodiment of the invention.
  • This method is for example implemented by the execution of the instructions of a computer program by the microprocessor 2.
  • As for the method of FIG. 3, the method described here receives as input, in the step E402, the address ADR at which it must read data in the rewritable memory 6.
  • There then follows in the step E404 a verification of the type of file read. The step E404 may for example consist in reading in the rewritable memory 6 the header of the file designated by the address ADR and verifying that the data of that header corresponds to data indicative of the type that must be read using the method described here.
  • If the file type designated by the header does not correspond to the type of file that must be read, there is a departure from the normal running of the method in the step E406 to go to the step E430 described hereinafter. If on the other hand the file type is correct, the step E406 leads to the step E408 of normal operation as now described.
  • In the step E408, it is verified that reading in the rewritable memory 6 is authorized by comparing the necessary access rights specified in the header of the file to those presented by the user, this information being accessible in random access memory 4.
  • If reading in rewritable memory 6 is not authorized according to the data read in random access memory, the verification of the possibility of access to that memory 6 is negative and there then follows the step E432 described hereinafter.
  • On the other hand, if it is determined in the step E410 that access to the rewritable memory 6 is authorized, there follows the step E412 for the continuation of normal operation of the microcircuit card.
  • The steps E404 to E410 therefore proceed to verifications of the normal operation of the microcircuit card. Verifications other than those given here by way of example could naturally be carried out.
  • The step E412 already mentioned, which follows if the various verifications relating to normal operation have been positive, consists in reading the data stored at the address ADR in the rewritable memory 6 in order to store it in random access memory 4 for subsequent use during data processing effected by the microprocessor 2.
  • This step therefore brings about repeated access in read mode to the rewritable memory 6 and repeated access in write mode to the random access memory 4.
  • Once the transfer of data from the rewritable memory 6 to the random access memory 4 is finished (either by reading a fixed number of bytes in rewritable memory 6 or by reading the precise number of bytes to read received for example as input in the step E402), there follows a step E414 of verifying the accuracy of the read data. To do this, for example, the data in the rewritable memory 6 is read again and that data is compared to the corresponding data previously stored in the random access memory 4 in the step E412.
  • If an error is detected during the comparison, the step E434 described hereinafter follows on from the step E416.
  • On the other hand, if all the data has been read correctly (i.e. the second reading in the rewritable memory 6 generates data identical to that read during the step E412), the step E416 leads to the continuation of normal operation in the step E418 now described.
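The accuracy check of the steps E412 to E416 might be sketched as a double read with a byte-by-byte comparison; the function name and memory model are hypothetical:

```python
# Sketch of the E412/E414/E416 accuracy check (hypothetical names):
# the EEPROM data is read twice and the two readings compared.

def transfer_and_verify(eeprom, adr, length):
    ram = [eeprom[adr + i] for i in range(length)]     # E412: first read, into RAM
    reread = [eeprom[adr + i] for i in range(length)]  # E414: second read
    if ram != reread:                                  # E416: comparison
        return None, "READ_ERROR"                      # anomalous path (leads to E434)
    return ram, None                                   # normal path (continues to E418)
```

A transient fault injected during either read would make the two readings differ and so be detected here.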
  • The step E418 consists in returning as output the address at which the data read in the rewritable memory 6 is stored in random access memory in order for the latter data to be available for the continuation of the normal operation diagrammatically represented by the step E420 in FIG. 4. When it is a question of a Read Record command defined by the ISO 7816 standard, this step E420 consists for example in sending the data that has been read.
  • In the situation where an erroneous file type has been detected in the step E406, there follows in the manner previously indicated the step E430 of updating an error report E so that the latter indicates an error stemming from the file type.
  • The step E430 is followed by the step E450 in which a bait-area in random access memory 4 is read. The bait-area is for example a dedicated area, with no other use, that contains data dedicated to this use and that is different here for example from the access rights relating to the authorization to read in rewritable memory 6 mentioned in relation to the step E408.
  • There then follows the step E452 in which the data read in the step E450 (i.e. the data read in the bait-area which can therefore be referred to as “bait-data”) is compared to the value 0.
  • It may be noted that the steps E450 and E452 have no functional role in the operation of the card, i.e. neither the data that is processed therein nor the result of the comparison that is effected has any impact on the other portions of the method.
  • Nevertheless, for an external observer such as an attacker measuring the current consumption of the microcircuit card, the execution of these two steps cannot be distinguished from the operations carried out during the normal operation step E408 described hereinabove. In fact, the operations of the steps E408 and E410, which consist in reading data in random access memory and comparing it, have a signature similar to that of the reading in random access memory and the comparison with the value 0 effected in the steps E450 and E452.
  • Thus, at this stage of the method, it is impossible for an attacker to determine by means of observations conducted from the outside that an anomaly has been detected in the step E406.
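The decoy steps E450/E452 and the genuine rights check they mimic might be sketched as follows; all names and values are hypothetical:

```python
# Sketch of the decoy steps E450/E452 (hypothetical names): reading a
# RAM bait area and comparing it to 0 mimics the operations of the
# genuine rights check E408/E410, but its result is discarded.

RAM_BAIT = [0x5A]          # dedicated bait data, no functional role

def decoy_rights_check():
    bait = RAM_BAIT[0]     # E450: read in the RAM bait area
    _ = (bait == 0)        # E452: comparison whose result is unused
    return None            # nothing in the method depends on these two steps

def real_rights_check(required, presented):
    return presented >= required   # E408/E410: a read and a comparison
```

Both functions perform a RAM read followed by a comparison, which is what makes their external signatures similar; only the genuine check influences the rest of the method.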
  • The step E452 is followed by the step E454 that will be described hereinafter.
  • In the same line of thinking as that which has just been described with regard to the detection of an erroneous file type, there follows, as already stated, in the case of prohibition of access to rewritable memory (steps E408 and E410) a step E432 in which the error report E is updated to indicate that the error stems from a prohibition of reading the rewritable memory 6.
  • The step E432 is also followed by the step E454 already mentioned and that will now be described.
  • In the step E454, the method reads in a bait-area of the rewritable memory 6 and writes in a bait-area of the random access memory 4.
  • As indicated hereinabove, the bait-areas in the memories 4, 6 are areas of those memories in which data is written that has no particular function in normal operation. Access in read or write mode to these bait-areas nevertheless makes it possible to simulate, for an external observer of the running of the method, the steps carried out during normal operation, for example in the step E412, without impacting the data used elsewhere by the method or the security thereof.
  • The step E454 is followed by a step E456 of reading bait-areas in the random access memory 4 and in the rewritable memory 6. As indicated hereinabove, this access in read mode has no functional role in the normal operation of the program (i.e. the data read in the bait-areas is not used in other portions of the method). However, for an attacker who is attempting to discover the internal working of the method by means of observations (for example of the electrical consumption of the microprocessor or the memories 4, 6), the steps E454 and E456 generate signatures that are respectively similar to the signatures generated by the steps E412 and E414 during normal operation.
  • Thus, at this stage of the method, it is impossible for an attacker, by external observation of the operation of the microcircuit card, to determine if the method is effecting the steps E412 and E414 of normal operation or the steps E454 and E456 subsequent to detection of an operating anomaly (whether that be an anomaly caused by an erroneous file type or a departure from normal operation through prohibition of access to the rewritable memory 6).
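By way of illustration, the bait-area technique of the steps E454 and E456 may be sketched in the following simplified model; the memory layout, the `mem_read`/`mem_write` helpers and the observer trace are assumptions introduced for the sketch, not elements of the patented method.

```python
# Illustrative model of the bait-area technique (steps E412/E414 vs. E454/E456).
# The memory layout and the helper names are assumptions for the sketch.

REWRITABLE = {"data": 0x42, "bait": 0x00}   # rewritable memory 6
RAM = {"data": None, "bait": None}          # random access memory 4

trace = []  # stands in for what an external observer sees (one entry per access)

def mem_read(memory, area):
    trace.append("read")       # same observable operation whichever area is used
    return memory[area]

def mem_write(memory, area, value):
    trace.append("write")
    memory[area] = value

def process(anomaly_detected):
    # Both branches issue the same sequence of read/write operations,
    # so the externally observable trace is identical.
    area = "bait" if anomaly_detected else "data"
    value = mem_read(REWRITABLE, area)       # step E412 or E454
    mem_write(RAM, area, value)              # step E412 or E454
    return mem_read(RAM, area)               # step E414 or E456

process(False)
normal = list(trace)
trace.clear()
process(True)
assert trace == normal  # indistinguishable access pattern
```

Whichever branch runs, the observer records the same read/write/read pattern; only the addresses differ, and the bait values play no further functional role.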
  • Like the steps E450 and E452, the steps E454 and E456 are not steps of processing the detected anomaly, since these steps work on bait-data, without seeking to remedy a functional error or to implement a countermeasure if the anomaly stems from a fault generation attack.
  • The step E456 is followed by the step E458 described hereinafter.
  • In the case, mentioned hereinabove, in which the verification by the step E414 of the accuracy of the data read in the step E412 is negative, the step E416 is followed by the step E434 of updating the error report E in order for the latter to indicate that the error detected stems from an error in reading the rewritable memory 6.
  • The step E434 is then followed by the step E458 already mentioned and now described.
  • The step E458 consists in sending the error report as determined during a preceding step (i.e. during one of the steps E430, E432 and E434). The error code may be sent to another method (or another part-method) implemented in the microcircuit card. For example, if the method represented in FIG. 4 represents a subroutine executed if the main program managing the operation of the microcircuit card requires the rewritable memory 6 to be read, the step E458 may consist in returning the value of the error report to the main program.
  • Alternatively, the error report may be sent in the step E458 to the user terminal via the interface 8.
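The handling of the error report E (steps E430 to E434 and E458) amounts to returning a status code to the caller. A minimal sketch, in which the error codes and the function name are invented for illustration:

```python
# Hypothetical error codes for the report E; the actual values are not
# specified in the description.
ERR_NONE, ERR_FILE_TYPE, ERR_READ_PROHIBITED, ERR_READ_FAILED = range(4)

def read_rewritable_memory(file_type_ok, read_allowed, read_ok):
    # Steps E430 / E432 / E434 set the report; step E458 returns it
    # to the main program (or sends it to the user terminal).
    if not file_type_ok:
        return ERR_FILE_TYPE        # set in step E430
    if not read_allowed:
        return ERR_READ_PROHIBITED  # set in step E432
    if not read_ok:
        return ERR_READ_FAILED      # set in step E434
    return ERR_NONE
```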
  • The method represented in FIG. 4 therefore has two main branches:
  • a first branch which corresponds to the normal running of the method (steps E402 to E420);
  • a second branch made up of steps at least some of which are effected after an anomaly has been detected (steps E450 to E458).
  • As has been seen, many steps of the first branch are simulated, in the case of detection of an anomaly, by a corresponding step of the second branch. For this purpose, the corresponding step of the second branch uses an instruction of the same type as the corresponding step of the first branch and, where necessary, makes a call to a device identical to that used in the corresponding step of the first branch, so as to generate for an attacker the same signature, for example in terms of electrical consumption or electromagnetic radiation.
  • FIG. 5 represents a method used in an electronic purse type microcircuit card (sometimes designated by the English word purse) according to the teachings of a third embodiment of the invention.
  • This method is implemented for example by the execution of a program consisting of instructions in the microprocessor 2 of the microcircuit card.
  • FIG. 5 represents the main steps of the method used to credit the electronic purse, i.e. to modify the data stored in the microcircuit card that represents the value of the electronic purse in the sense of an increase in that value.
  • This method therefore begins with the step E502, in which the microprocessor 2 of the microcircuit card 10 receives from the user (via the interface 8) a credit command code C, the amount M to be credited and a signature S.
  • As will be seen hereinafter, the signature S ensures that the user is actually authorized to effect this credit; in fact, without this precaution, anyone could command an increase in the value of the electronic purse, which is obviously unacceptable.
  • The steps effected during normal operation of the electronic purse crediting method are now described.
  • In the step E504, which follows the step E502 in normal operation, the microprocessor commands the reading in the rewritable memory 6 of a key K by specifying the storage address of this key in the rewritable memory 6. The key K stored in the rewritable memory 6 is secret and is therefore not accessible from the outside.
  • The microprocessor 2 then proceeds to the step E506 of sending the secret key K read in the rewritable memory 6 and the command code C to a cryptoprocessor (not shown in FIG. 1). The cryptoprocessor effects a cryptographic calculation on the basis of the data received (secret key K, command code C) using conventional cryptographic algorithms, for example the DES algorithm. More precisely, an algorithm is applied here to the command code C using the secret key K in order to obtain a calculated signature S1.
  • The cryptoprocessor then returns to the microprocessor 2 the calculated signature S1 (step E508).
  • If the user who commands the execution of the method is effectively authorized to effect the credit, he also knows the secret key K and can therefore determine the signature S in exactly the same way as the calculation that has just been effected to calculate the signature S1.
  • This is why, in the step E510, the signature S received from the user is compared to the signature S1 calculated on the basis of the secret key K stored in the microcircuit card, which makes it possible to determine if writing the credit is authorized.
  • Accordingly, if the comparison between S and S1 is positive in the step E510, there follows the step E512 in which the amount M to be credited, or alternatively the value of the electronic purse resulting from that credit, is written in the rewritable memory 6.
  • On the other hand, if it is determined in the step E510 that the signature S received from the user does not correspond to the calculated signature S1, crediting the electronic purse cannot be authorized and there then follows the step E514 in which the microprocessor 2 sends the user terminal an error report.
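The normal credit flow (steps E502 to E514) may be sketched as follows. The description names the DES algorithm; to keep the sketch self-contained, a generic keyed MAC (HMAC-SHA256 from the Python standard library) stands in for the DES-based signature, and the key value and function names are assumptions.

```python
import hashlib
import hmac

SECRET_KEY = b"card-secret-key"  # key K, held in the rewritable memory 6

def compute_signature(key, command_code):
    # Stand-in for the cryptoprocessor's DES-based calculation (steps
    # E506/E508): any keyed MAC gives the same structure.
    return hmac.new(key, command_code, hashlib.sha256).digest()

def credit(command_code, amount, signature, purse):
    key = SECRET_KEY                              # step E504: read key K
    s1 = compute_signature(key, command_code)     # steps E506/E508
    if hmac.compare_digest(signature, s1):        # step E510: compare S and S1
        purse["value"] += amount                  # step E512: write the credit
        return "ok"
    return "error"                                # step E514: error report

purse = {"value": 10}
good = compute_signature(SECRET_KEY, b"CREDIT")   # an authorized user knows K
assert credit(b"CREDIT", 5, good, purse) == "ok" and purse["value"] == 15
assert credit(b"CREDIT", 5, b"forged", purse) == "error" and purse["value"] == 15
```

Only a holder of the secret key K can produce a signature S equal to the card's own S1, which is what authorizes the write of step E512.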
  • There have just been described the main steps of a method of crediting an electronic purse. Naturally, to ensure secure working of this method, these main steps are separated by steps for verification of the normal running of the method, which by various tests detect functional errors on the one hand and attacks on the other hand, for example fault generation attacks.
  • In the case of detection of an attack, the microprocessor 2 carries out steps different from those of normal operation, as now described. According to a variant that may be compatible with what is described hereinafter, the method may also use these different steps (or other steps different from normal operation) if a functional error is detected (rather than an attack).
  • As represented in dashed line in FIG. 5, if an attack is detected between the step E502 of receiving data and the step E504 of reading the secret key K in the rewritable memory 6, there follows the step E520 in which data is read in the rewritable memory 6 at an address K′ that constitutes a bait-area corresponding to the area containing the secret key K read in the step E504 during normal operation.
  • Accordingly, if an attacker generates a fault attack that is detected by the microprocessor 2, there follows the step E520 whereof the electrical or electromagnetic signature (as observed by the attacker) is similar to that of the step E504 effected in normal operation. The attacker therefore thinks that his attack has not been detected and that the microprocessor 2 is actually reading the secret key K in the rewritable memory 6.
  • After the step E520, there follows the step E522. This step E522 also follows if an attack is detected by steps of verification of the normal running of the method situated between the steps E504 and E506 previously described.
  • The step E522 consists in sending to the cryptoprocessor (previously mentioned with regard to the steps E506 and E508) data C′ and K′ that constitute bait-data. Thus the step E522 has no particular functional role, but simulates the step E506 for an attacker who is observing the operation of the microcircuit card 10 by simply studying the electrical consumption of or the electromagnetic radiation generated by the card.
  • If the step E522 is preceded by the step E520, the bait-data K′ may be the data read during the step E520. Alternatively, it may be predetermined data, and preferably data unrelated to the secure operation of the microcircuit card 10.
  • As previously, since the signature of the step E522 is similar to that of the step E506 effected in normal operation, an attacker is unaware when the step E522 is executed that his attack has in fact been detected.
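The point of the step E522 is that the bait call has the same shape as the real call of the step E506: same routine, same argument types, only bait operands. A sketch using a generic MAC (HMAC-SHA256) as a stand-in for the DES-based calculation; the bait values and helper names are assumptions:

```python
import hashlib
import hmac

def compute_signature(key, command_code):
    # The same call is made in both cases: this is what gives the steps
    # E506 and E522 similar electrical/electromagnetic signatures.
    return hmac.new(key, command_code, hashlib.sha256).digest()

BAIT_KEY = b"bait-area-K-prime"   # K': bait data, e.g. read in the step E520
BAIT_CMD = b"bait-C-prime"        # C': bait command code

def crypto_step(attack_detected, real_key, real_cmd):
    if attack_detected:
        # Step E522: functionally inert, operates only on bait-data.
        return compute_signature(BAIT_KEY, BAIT_CMD)
    # Step E506: the real calculation on the secret key K and command code C.
    return compute_signature(real_key, real_cmd)

s_real = crypto_step(False, b"K", b"CREDIT")
s_bait = crypto_step(True, b"K", b"CREDIT")
assert s_real != s_bait and len(s_real) == len(s_bait)
```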
  • The step E522 is followed by the step E524 described hereinafter.
  • As before, the steps E506 and E508 may be separated by steps of verification of the normal operation of the method adapted to detect an attack, as represented in dashed line in FIG. 5. If an attack is detected, the method also continues at the step E524.
  • The step E524 consists in receiving a signature calculated by the cryptoprocessor. As previously for the steps E520 and E522, the step E524 simulates a step of normal operation (here the step E508) in order to prevent an attacker from detecting that his preceding attack has been detected.
  • If the step E522 has previously been effected, the signature S1′ is for example the signature calculated by the cryptoprocessor on the basis of the data C′ and K′ transmitted during the step E522. In this case, the calculated signature S1′ has no functional role since it is based on bait-data.
  • When the method reaches the step E524 following the detection of an attack between the steps E506 and E508, the step E524 may for example consist in actually receiving only a portion of the signature calculated by the cryptoprocessor on the basis of the data C and K sent in the step E506. The rest of the signature S1′ is for example forced to a predetermined value, such that the result obtained (signature S1′) does not correspond to the signature S1, in order to preserve the security of the operation of the microcircuit card.
  • The step E524 is followed by the step E526, in which there is effected for example a comparison of the value S1′ that has just been determined with itself. As previously, this step has no functional role (since the result of the comparison of a number with itself is obviously known in advance), but simulates the step E510 for an attacker who is observing the electrical and/or electromagnetic behavior of the microcircuit card 10. In fact, since the steps E526 and E510 use the same instruction, they have very similar electrical (and electromagnetic) signatures.
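The self-comparison of the step E526 may be sketched as follows; the dispatch on an `attack_detected` flag is an assumption of the sketch. The point is that both branches execute the same comparison instruction, while the bait branch produces a result that is known in advance and therefore carries no information.

```python
import hmac

def compare_step(attack_detected, s_received, s1):
    # Step E510 in normal operation; step E526 after detection of an attack.
    # Both execute the same comparison instruction, but the bait variant
    # compares S1' with itself, so its outcome is fixed in advance.
    if attack_detected:
        return hmac.compare_digest(s1, s1)          # E526: always True, pure bait
    return hmac.compare_digest(s_received, s1)      # E510: the real decision

assert compare_step(True, b"ignored", b"S1'") is True
```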
  • Thus, if an attack has been detected between the steps E502 and E508 of normal operation, normal operation has been interrupted (by the switch to the bait-steps E520 to E526) without this change being detectable by an attacker, however.
  • The step E526 is followed by the step E528 in which blocking (or lock) data is written in the rewritable memory 6.
  • As already mentioned with regard to the first embodiment described with reference to FIG. 3, writing blocking data in the rewritable memory 6 prevents all subsequent use of the microcircuit card 10. It is therefore a particularly severe and effective countermeasure in response to the detection of an attack.
  • Note further that the step E528 is executed at a time when normal operation would have executed the step E512 of writing the amount in rewritable memory 6. Thus the step E528 is initially confused by the attacker with the step E512 of normal operation that has the same electrical and/or electromagnetic signature, in particular because the steps E512 and E528 correspond to instructions of the same type that both effect a communication from the microprocessor 2 to the rewritable memory 6.
  • As clearly visible in FIG. 5, the step E512 and the preceding steps of normal operation each have a counterpart step, with a similar signature, executed in the case of detection of an attack. In particular, the step E512 of writing the amount in the rewritable memory 6, which constitutes an important step in the execution of the method, is associated with the step E528 of writing blocking data, which precisely constitutes the countermeasure in the case of detection of an attack against this method.
  • Thus not only is an attacker unable to determine the detection of his attack because of the steps E520 to E526 simulating normal operation, but the countermeasure (here the writing of blocking data) is also applied with a chronology such that it is confused with a step of normal operation having a similar electrical and/or electromagnetic signature.
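The chronology argument may be sketched as follows: whichever branch runs, a single write to the rewritable memory is issued at the same point in the sequence, so an observer counting memory operations sees the same pattern. The data layout and helper names are assumptions of the sketch.

```python
# Illustrative model of the matched chronology of steps E512 and E528.
REWRITABLE = {"purse": 10, "lock": False}   # rewritable memory 6
trace = []  # one entry per externally observable memory operation

def nvm_write(area, value):
    trace.append("nvm_write")               # same observable operation either way
    REWRITABLE[area] = value

def final_step(attack_detected, amount):
    if attack_detected:
        nvm_write("lock", True)             # E528: write blocking data
    else:
        nvm_write("purse", REWRITABLE["purse"] + amount)  # E512: write credit

final_step(False, 5)
normal = list(trace)
trace.clear()
final_step(True, 5)
assert trace == normal                      # same chronology, similar signature
assert REWRITABLE["lock"] is True           # card blocked by the countermeasure
```

The blocking write thus hides behind the expected write of step E512 until it is too late for the attacker to interrupt it.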
  • The examples that have just been described are merely possible embodiments of the invention.

Claims (29)

1. Data processing method comprising:
a step of verification of a criterion indicative of the normal running of the method; and
a processing step (E528) effected in the case of negative verification,
characterized in that, a first action (E512) being effected in the case of positive verification, the processing step entails effecting at least one second action (E528) having a first characteristic in common with the first action (E512).
2. Data processing method comprising:
a step (E308; E406, E410, E416) of verification of a criterion indicative of the normal running of the method; and
a processing step (E320; E458; E528) effected in the case of negative verification,
wherein the processing step (E320; E458; E528) is separated from the verification step (E308; E406, E410, E416) by an intermediate step (E312, E314; E450, E452, E454, E456; E520, E522, E524, E526) of non-null duration, characterized in that, a first action (E314; E408, E412, E414; E504, E506, E508, E510) being effected in the case of positive verification, the intermediate step entails effecting at least one second action (E314; E450, E454, E456; E520, E522, E524, E528) having at least one first characteristic in common with the first action.
3. Method according to claim 1, wherein the second action is different from the first action.
4. Method according to claim 2, wherein, a third action (E418; E512) being effected in the case of positive verification, the processing step entails effecting at least one fourth action (E458; E528) having at least one second characteristic in common with the third action (E418; E512).
5. Method according to claim 1, wherein, the method being implemented in electronic apparatus (10), the first or second common characteristic is the electrical consumption or the electromagnetic radiation of the apparatus generated by the first, respectively the third, action and by the second, respectively the fourth, action.
6. Method according to claim 1, wherein the first or second common characteristic is the number of instructions used in the first, respectively the third, action and in the second, respectively the fourth, action.
7. Method according to claim 1, wherein the first or second common characteristic is the type of instruction used by the first, respectively the third, action and by the second, respectively the fourth, action.
8. Method according to claim 1 wherein the first or second common characteristic is the type of data processed by the first, respectively the third, action and by the second, respectively the fourth, action.
9. Method according to claim 1, wherein the first, respectively the third, action entailing access to a first area of a memory, the second, respectively the fourth, action entails access to a second area of said memory different from the first area.
10. Method according to claim 1, wherein the first or second common characteristic is communication with an external device.
11. Method according to claim 10, wherein the external device is a cryptoprocessor.
12. Method according to claim 10, wherein the external device is a memory (2, 6).
13. Method according to claim 10, wherein the external device is a rewritable semiconductor memory (6).
14. Method according to claim 10, wherein the external device is a user terminal.
15. Method according to claim 1, wherein the first action entails a secure step.
16. Method according to claim 15, wherein the secure step entails a cryptographic algorithm.
17. Method according to claim 1, wherein the processing step (E320, E528) entails writing blocking data into a physical memory (6).
18. Method according to claim 17, wherein the writing (E528) of blocking data is effected in accordance with a chronology identical to writing (E512) data into the physical memory (6) in the case of normal running of the method.
19. Method according to claim 18, wherein the data represents a pecuniary value.
20. Method according to claim 1, wherein the criterion is negative if an erroneous signature is provided.
21. Method according to claim 1, wherein the criterion is negative if an anomaly is detected.
22. Method according to claim 1, wherein the criterion is negative if an attack is detected.
23. Method according to claim 1, wherein the criterion is negative if a functional error is detected.
24. Method according to claim 2, wherein the intermediate step entails at least one instruction determined during the execution of the method.
25. Method according to claim 1, executed by a microprocessor (2) of a microcircuit card (10).
26. Data processing device comprising:
means for verification of a criterion indicative of the normal operation of the device;
processing means used in the case of negative verification; and
separation means for separating the operation of the verification means from the operation of the processing means by a non-null duration,
characterized in that, first action means being used in the case of positive verification, the separation means have at least one first characteristic in common with the first action means.
27. Device according to claim 26, wherein, second action means being used in the case of positive verification, the processing means have at least one second characteristic in common with the second action means.
28. Data processing device comprising:
means for verification of a criterion indicative of the normal operation of the device; and
processing means used in the case of negative verification,
characterized in that, first action means being used in the case of positive verification, the processing means have at least one first characteristic in common with the first action means.
29. Device according to claim 26, the device being a microcircuit card.
US11/660,218 2004-08-17 2005-08-12 Data Processing Method and Device Abandoned US20070220603A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
FR0408928A FR2874440B1 (en) 2004-08-17 2004-08-17 Method and data processing device
FR0408928 2004-08-17
PCT/FR2005/002083 WO2006021686A2 (en) 2004-08-17 2005-08-12 Data processing method and device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/FR2005/002083 A-371-Of-International WO2006021686A2 (en) 2004-08-17 2005-08-12 Data processing method and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/845,914 Division US9454663B2 (en) 2004-08-17 2013-03-18 Data processing method and device

Publications (1)

Publication Number Publication Date
US20070220603A1 true US20070220603A1 (en) 2007-09-20

Family

ID=34948280

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/660,218 Abandoned US20070220603A1 (en) 2004-08-17 2005-08-12 Data Processing Method and Device
US13/845,914 Active US9454663B2 (en) 2004-08-17 2013-03-18 Data processing method and device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/845,914 Active US9454663B2 (en) 2004-08-17 2013-03-18 Data processing method and device

Country Status (7)

Country Link
US (2) US20070220603A1 (en)
EP (2) EP1779284B1 (en)
JP (2) JP4790717B2 (en)
CA (1) CA2575143C (en)
ES (2) ES2473326T3 (en)
FR (1) FR2874440B1 (en)
WO (1) WO2006021686A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060112436A1 (en) * 2004-11-19 2006-05-25 Proton World International N.V. Protection of a microcontroller
US20080018927A1 (en) * 2006-07-21 2008-01-24 Research In Motion Limited Method and system for providing a honeypot mode for an electronic device
US20100299511A1 (en) * 2007-11-26 2010-11-25 Herve Pelletier Method of Masking the End-of-Life Transition of an Electronic Device, and a Device Including a Corresponding Control Module
US20110010775A1 (en) * 2007-01-05 2011-01-13 Proton World International N.V. Protection of information contained in an electronic circuit
US20110007567A1 (en) * 2007-01-05 2011-01-13 Jean-Louis Modave Temporary locking of an electronic circuit
US20110122694A1 (en) * 2007-01-05 2011-05-26 Proton World International N.V. Limitation of the access to a resource of an electronic circuit

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2935823B1 (en) * 2008-09-11 2010-10-01 Oberthur Technologies Method and device for protecting a microcircuit against attacks.
FR3075430A1 (en) * 2017-12-20 2019-06-21 Oberthur Technologies Data processing method and associated electronic device

Citations (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4719626A (en) * 1983-12-30 1988-01-12 Fujitsu Limited Diagnostic method and apparatus for channel control apparatus
US5367149A (en) * 1992-08-27 1994-11-22 Mitsubishi Denki Kabushiki Kaisha IC card and method of checking personal identification number of the same
US6000004A (en) * 1996-10-23 1999-12-07 Sharp Kabushiki Kaisha Nonvolatile semiconductor memory device with write protect data settings for disabling erase from and write into a block, and erase and re-erase settings for enabling write into and erase from a block
US6223290B1 (en) * 1998-05-07 2001-04-24 Intel Corporation Method and apparatus for preventing the fraudulent use of a cellular telephone
US20010010602A1 (en) * 2000-02-02 2001-08-02 Fujitsu Limited Method detecting a fault of a magnetic recording head, and a magnetic recording device
US20020018384A1 (en) * 2000-04-21 2002-02-14 Ken Sumitani Semiconductor storage device, control device, and electronic apparatus
US20020106084A1 (en) * 2000-06-12 2002-08-08 Hiroo Azuma Encryption method and apparatus
US20020108036A1 (en) * 2000-07-24 2002-08-08 Takumi Okaue Data processing system, data processing method, data processing apparatus, license system, and program providing medium
US20020151992A1 (en) * 1999-02-01 2002-10-17 Hoffberg Steven M. Media recording device with packet data interface
US6510521B1 (en) * 1996-02-09 2003-01-21 Intel Corporation Methods and apparatus for preventing unauthorized write access to a protected non-volatile storage
US20030028784A1 (en) * 2001-08-03 2003-02-06 Nec Corporation User authentication method and user authentication device
US20030065828A1 (en) * 2001-08-31 2003-04-03 Autodesk Canada Inc. Processing data
US6567539B1 (en) * 1998-02-12 2003-05-20 Bull Cp8 Method for producing an image using a portable object
US20030097344A1 (en) * 1994-01-11 2003-05-22 David Chaum Multi-purpose transaction card system
US20030110390A1 (en) * 2000-05-22 2003-06-12 Christian May Secure data processing unit, and an associated method
US20030112665A1 (en) * 2001-12-17 2003-06-19 Nec Electronics Corporation Semiconductor memory device, data processor, and method of determining frequency
US20030188117A1 (en) * 2001-03-15 2003-10-02 Kenji Yoshino Data access management system and management method using access control tickert
US20030222797A1 (en) * 2002-04-12 2003-12-04 Yuichi Futa Positional information storage system and method , semiconductor memory, and program
US20040003321A1 (en) * 2002-06-27 2004-01-01 Glew Andrew F. Initialization of protected system
US20040030905A1 (en) * 2000-02-18 2004-02-12 Chow Stanley T. Encoding method and system resistant to power analysis
US20040061500A1 (en) * 2002-09-30 2004-04-01 Continental Teves, Inc. Offset calibration of a semi-relative steering wheel angle sensor
US20040088064A1 (en) * 2002-10-28 2004-05-06 Satoshi Endo Backup system for multi-source audio apparatus
US6738749B1 (en) * 1998-09-09 2004-05-18 Ncr Corporation Methods and apparatus for creating and storing secure customer receipts on smart cards
US20040103177A1 (en) * 2002-11-13 2004-05-27 Waeil Ben Ismail Software upgrade over a USB connection
US20040133794A1 (en) * 2001-03-28 2004-07-08 Kocher Paul C. Self-protecting digital content
US20040136421A1 (en) * 2003-01-10 2004-07-15 Robinson Michael A. Loss of signal detection and programmable behavior after error detection
US20040145339A1 (en) * 2001-04-02 2004-07-29 Paul Dischamp Methods for protecting a smart card
US20040153626A1 (en) * 2003-01-29 2004-08-05 Kabushiki Kaisha Toshiba Semiconductor device and a method for checking state transition thereof
US20040151026A1 (en) * 2003-01-30 2004-08-05 Micron Technology, Inc. Chip protection register unlocking
US20040158728A1 (en) * 2003-02-06 2004-08-12 Seo-Kyu Kim Smart cards having protection circuits therein that inhibit power analysis attacks and methods of operating same
US20040172576A1 (en) * 2001-09-28 2004-09-02 Takeo Yoshii Data writing apparatus, data writing method, and program
US20040172538A1 (en) * 2002-12-18 2004-09-02 International Business Machines Corporation Information processing with data storage
US20040199743A1 (en) * 2000-10-19 2004-10-07 Loaiza Juan R. Data block location verification
US20040204800A1 (en) * 2003-04-11 2004-10-14 Denso Corporation Electronic control unit for a vehicle
US20040215909A1 (en) * 2003-04-23 2004-10-28 Renesas Technology Corp. Nonvolatile memory device and data processing system
US20040225776A1 (en) * 2001-03-12 2004-11-11 Motorola, Inc. Method of regulating usage and/or concession eligibility via distributed list management in a smart card system
US6820047B1 (en) * 1999-11-05 2004-11-16 Kabushiki Kaisha Toshiba Method and system for simulating an operation of a memory
US20040236961A1 (en) * 1997-07-15 2004-11-25 Walmsley Simon Robert Integrated circuit incorporating protection from power supply attacks
US20040243766A1 (en) * 2003-05-30 2004-12-02 Lovelace John V. Writing cached data to system management memory
US20040240097A1 (en) * 2003-04-28 2004-12-02 Hewlett-Packard Development Company, L.P. Method and apparatus for use in data transfer
US20040260593A1 (en) * 2003-05-20 2004-12-23 Klaus Abraham-Fuchs System and user interface supporting workflow operation improvement
US20040268312A1 (en) * 2003-05-30 2004-12-30 International Business Machines Corporation Application development support, component invocation monitoring, and data processing
US20040264023A1 (en) * 2003-05-30 2004-12-30 International Business Machines Corp. System, method and computer program product for tape failure detection
US20050021990A1 (en) * 2001-09-04 2005-01-27 Pierre-Yvan Liardet Method for making secure a secret quantity
US20050021427A1 (en) * 2003-07-22 2005-01-27 Norio Takahashi System and method for processing account data
US20050073885A1 (en) * 2002-11-18 2005-04-07 Matsushita Electric Industrial Co., Ltd. Semiconductor memory device
US20050073902A1 (en) * 2003-10-02 2005-04-07 Broadcom Corporation Phase controlled high speed interfaces
US20050154672A1 (en) * 2004-01-13 2005-07-14 Griffin Daniel C. Performance optimized smartcard transaction management
US20050157563A1 (en) * 2004-01-19 2005-07-21 Comax Semiconductor Inc. Memory Device and mobile communication device using a specific access procedure
US20050183072A1 (en) * 1999-07-29 2005-08-18 Intertrust Technologies Corporation Software self-defense systems and methods
US20050271202A1 (en) * 2004-06-08 2005-12-08 Hrl Laboratories, Llc Cryptographic architecture with random instruction masking to thwart differential power analysis
US20060020810A1 (en) * 2004-07-24 2006-01-26 International Business Machines Corporation System and method for software load authentication
US20060031676A1 (en) * 2004-08-05 2006-02-09 Luc Vantalon Methods and apparatuses for configuring products
US7036739B1 (en) * 1999-10-23 2006-05-02 Ultracard, Inc. Data storage device apparatus and method for using same
US7043615B1 (en) * 2000-06-02 2006-05-09 Renesas Technology Corp. Nonvolatile semiconductor memory and method of managing information in information distribution system
US7058819B2 (en) * 2000-07-24 2006-06-06 Sony Corporation Data processing system, data processing method, and program providing medium
US7057937B1 (en) * 1992-03-17 2006-06-06 Renesas Technology Corp. Data processing apparatus having a flash memory built-in which is rewritable by use of external device
US7073073B1 (en) * 1999-07-06 2006-07-04 Sony Corporation Data providing system, device, and method
US7149946B2 (en) * 2003-06-13 2006-12-12 Microsoft Corporation Systems and methods for enhanced stored data verification utilizing pageable pool memory
US7249108B1 (en) * 1997-07-15 2007-07-24 Silverbrook Research Pty Ltd Validation protocol and system
US7349884B1 (en) * 2001-03-29 2008-03-25 Gsc Enterprises, Inc. Method and apparatus for electronic commerce services at a point of sale

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5274817A (en) * 1991-12-23 1993-12-28 Caterpillar Inc. Method for executing subroutine calls
US5483518A (en) * 1992-06-17 1996-01-09 Texas Instruments Incorporated Addressable shadow port and protocol for serial bus networks
FR2773250B1 (en) 1997-12-31 2000-03-10 Grp Des Cartes Bancaires Method and PINs processing device
US6141771A (en) * 1998-02-06 2000-10-31 International Business Machines Corporation Method and system for providing a trusted machine state
US7599491B2 (en) * 1999-01-11 2009-10-06 Certicom Corp. Method for strengthening the implementation of ECDSA against power analysis
US20040141439A1 (en) * 2000-03-28 2004-07-22 Takayuki Suzuki Decoder
US6745370B1 (en) * 2000-07-14 2004-06-01 Heuristics Physics Laboratories, Inc. Method for selecting an optimal level of redundancy in the design of memories
DE10131575A1 (en) * 2001-07-02 2003-01-16 Bosch Gmbh Robert Method for protecting a microcomputer system from manipulation of data stored in a memory device of the microcomputer system data
JP3993063B2 (en) * 2001-10-15 2007-10-17 三菱電機株式会社 Encrypted communication device
FR2832824A1 (en) * 2001-11-28 2003-05-30 St Microelectronics Sa Integrated circuit card operation blocking method e.g. for smart card, involves executing blocking program including sequence of instructions to proceed with loop operation of blocking program, when jump table is generated
US7284111B1 (en) * 2002-04-17 2007-10-16 Dinochip, Inc. Integrated multidimensional sorter
JP2004102825A (en) * 2002-09-11 2004-04-02 Renesas Technology Corp Cache memory controller
US9002724B2 (en) * 2003-02-28 2015-04-07 Panasonic Corporation Incentive provision system
US7216270B1 (en) * 2004-05-14 2007-05-08 National Semiconductor Corporation System and method for providing testing and failure analysis of integrated circuit memory devices
US7483533B2 (en) * 2004-08-05 2009-01-27 King Fahd University Of Petroleum Elliptic polynomial cryptography with multi x-coordinates embedding

Patent Citations (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4719626A (en) * 1983-12-30 1988-01-12 Fujitsu Limited Diagnostic method and apparatus for channel control apparatus
US7057937B1 (en) * 1992-03-17 2006-06-06 Renesas Technology Corp. Data processing apparatus having a flash memory built-in which is rewritable by use of external device
US5367149A (en) * 1992-08-27 1994-11-22 Mitsubishi Denki Kabushiki Kaisha IC card and method of checking personal identification number of the same
US20030097344A1 (en) * 1994-01-11 2003-05-22 David Chaum Multi-purpose transaction card system
US6510521B1 (en) * 1996-02-09 2003-01-21 Intel Corporation Methods and apparatus for preventing unauthorized write access to a protected non-volatile storage
US6000004A (en) * 1996-10-23 1999-12-07 Sharp Kabushiki Kaisha Nonvolatile semiconductor memory device with write protect data settings for disabling erase from and write into a block, and erase and re-erase settings for enabling write into and erase from a block
US20040236961A1 (en) * 1997-07-15 2004-11-25 Walmsley Simon Robert Integrated circuit incorporating protection from power supply attacks
US7249108B1 (en) * 1997-07-15 2007-07-24 Silverbrook Research Pty Ltd Validation protocol and system
US6567539B1 (en) * 1998-02-12 2003-05-20 Bull Cp8 Method for producing an image using a portable object
US6223290B1 (en) * 1998-05-07 2001-04-24 Intel Corporation Method and apparatus for preventing the fraudulent use of a cellular telephone
US6738749B1 (en) * 1998-09-09 2004-05-18 Ncr Corporation Methods and apparatus for creating and storing secure customer receipts on smart cards
US20020151992A1 (en) * 1999-02-01 2002-10-17 Hoffberg Steven M. Media recording device with packet data interface
US7073073B1 (en) * 1999-07-06 2006-07-04 Sony Corporation Data providing system, device, and method
US7430670B1 (en) * 1999-07-29 2008-09-30 Intertrust Technologies Corp. Software self-defense systems and methods
US20050183072A1 (en) * 1999-07-29 2005-08-18 Intertrust Technologies Corporation Software self-defense systems and methods
US7036739B1 (en) * 1999-10-23 2006-05-02 Ultracard, Inc. Data storage device apparatus and method for using same
US6820047B1 (en) * 1999-11-05 2004-11-16 Kabushiki Kaisha Toshiba Method and system for simulating an operation of a memory
US20010010602A1 (en) * 2000-02-02 2001-08-02 Fujitsu Limited Method detecting a fault of a magnetic recording head, and a magnetic recording device
US20040030905A1 (en) * 2000-02-18 2004-02-12 Chow Stanley T. Encoding method and system resistant to power analysis
US20040078588A1 (en) * 2000-02-18 2004-04-22 Chow Stanley T Method and apparatus for balanced electronic operations
US20020018384A1 (en) * 2000-04-21 2002-02-14 Ken Sumitani Semiconductor storage device, control device, and electronic apparatus
US20030110390A1 (en) * 2000-05-22 2003-06-12 Christian May Secure data processing unit, and an associated method
US7043615B1 (en) * 2000-06-02 2006-05-09 Renesas Technology Corp. Nonvolatile semiconductor memory and method of managing information in information distribution system
US20020106084A1 (en) * 2000-06-12 2002-08-08 Hiroo Azuma Encryption method and apparatus
US7058819B2 (en) * 2000-07-24 2006-06-06 Sony Corporation Data processing system, data processing method, and program providing medium
US20020108036A1 (en) * 2000-07-24 2002-08-08 Takumi Okaue Data processing system, data processing method, data processing apparatus, license system, and program providing medium
US20040199743A1 (en) * 2000-10-19 2004-10-07 Loaiza Juan R. Data block location verification
US20040225776A1 (en) * 2001-03-12 2004-11-11 Motorola, Inc. Method of regulating usage and/or concession eligibility via distributed list management in a smart card system
US20030188117A1 (en) * 2001-03-15 2003-10-02 Kenji Yoshino Data access management system and management method using access control ticket
US20040133794A1 (en) * 2001-03-28 2004-07-08 Kocher Paul C. Self-protecting digital content
US7349884B1 (en) * 2001-03-29 2008-03-25 Gsc Enterprises, Inc. Method and apparatus for electronic commerce services at a point of sale
US20040145339A1 (en) * 2001-04-02 2004-07-29 Paul Dischamp Methods for protecting a smart card
US20030028784A1 (en) * 2001-08-03 2003-02-06 Nec Corporation User authentication method and user authentication device
US20030065828A1 (en) * 2001-08-31 2003-04-03 Autodesk Canada Inc. Processing data
US20050021990A1 (en) * 2001-09-04 2005-01-27 Pierre-Yvan Liardet Method for making secure a secret quantity
US20040172576A1 (en) * 2001-09-28 2004-09-02 Takeo Yoshii Data writing apparatus, data writing method, and program
US20030112665A1 (en) * 2001-12-17 2003-06-19 Nec Electronics Corporation Semiconductor memory device, data processor, and method of determining frequency
US20030222797A1 (en) * 2002-04-12 2003-12-04 Yuichi Futa Positional information storage system and method , semiconductor memory, and program
US20040003321A1 (en) * 2002-06-27 2004-01-01 Glew Andrew F. Initialization of protected system
US20040061500A1 (en) * 2002-09-30 2004-04-01 Continental Teves, Inc. Offset calibration of a semi-relative steering wheel angle sensor
US20040088064A1 (en) * 2002-10-28 2004-05-06 Satoshi Endo Backup system for multi-source audio apparatus
US20040103177A1 (en) * 2002-11-13 2004-05-27 Waeil Ben Ismail Software upgrade over a USB connection
US20050073885A1 (en) * 2002-11-18 2005-04-07 Matsushita Electric Industrial Co., Ltd. Semiconductor memory device
US20040172538A1 (en) * 2002-12-18 2004-09-02 International Business Machines Corporation Information processing with data storage
US20040136421A1 (en) * 2003-01-10 2004-07-15 Robinson Michael A. Loss of signal detection and programmable behavior after error detection
US20040153626A1 (en) * 2003-01-29 2004-08-05 Kabushiki Kaisha Toshiba Semiconductor device and a method for checking state transition thereof
US20040151026A1 (en) * 2003-01-30 2004-08-05 Micron Technology, Inc. Chip protection register unlocking
US20040158728A1 (en) * 2003-02-06 2004-08-12 Seo-Kyu Kim Smart cards having protection circuits therein that inhibit power analysis attacks and methods of operating same
US20040204800A1 (en) * 2003-04-11 2004-10-14 Denso Corporation Electronic control unit for a vehicle
US20040215909A1 (en) * 2003-04-23 2004-10-28 Renesas Technology Corp. Nonvolatile memory device and data processing system
US20040240097A1 (en) * 2003-04-28 2004-12-02 Hewlett-Packard Development Company, L.P. Method and apparatus for use in data transfer
US20040260593A1 (en) * 2003-05-20 2004-12-23 Klaus Abraham-Fuchs System and user interface supporting workflow operation improvement
US20040243766A1 (en) * 2003-05-30 2004-12-02 Lovelace John V. Writing cached data to system management memory
US20040268312A1 (en) * 2003-05-30 2004-12-30 International Business Machines Corporation Application development support, component invocation monitoring, and data processing
US20040264023A1 (en) * 2003-05-30 2004-12-30 International Business Machines Corp. System, method and computer program product for tape failure detection
US7149946B2 (en) * 2003-06-13 2006-12-12 Microsoft Corporation Systems and methods for enhanced stored data verification utilizing pageable pool memory
US20050021427A1 (en) * 2003-07-22 2005-01-27 Norio Takahashi System and method for processing account data
US20050073902A1 (en) * 2003-10-02 2005-04-07 Broadcom Corporation Phase controlled high speed interfaces
US20050154672A1 (en) * 2004-01-13 2005-07-14 Griffin Daniel C. Performance optimized smartcard transaction management
US20050157563A1 (en) * 2004-01-19 2005-07-21 Comax Semiconductor Inc. Memory Device and mobile communication device using a specific access procedure
US20050271202A1 (en) * 2004-06-08 2005-12-08 Hrl Laboratories, Llc Cryptographic architecture with random instruction masking to thwart differential power analysis
US20060020810A1 (en) * 2004-07-24 2006-01-26 International Business Machines Corporation System and method for software load authentication
US20060031676A1 (en) * 2004-08-05 2006-02-09 Luc Vantalon Methods and apparatuses for configuring products

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060112436A1 (en) * 2004-11-19 2006-05-25 Proton World International N.V. Protection of a microcontroller
US7516902B2 (en) * 2004-11-19 2009-04-14 Proton World International N.V. Protection of a microcontroller
US20080018927A1 (en) * 2006-07-21 2008-01-24 Research In Motion Limited Method and system for providing a honeypot mode for an electronic device
US8479288B2 (en) * 2006-07-21 2013-07-02 Research In Motion Limited Method and system for providing a honeypot mode for an electronic device
US20110010775A1 (en) * 2007-01-05 2011-01-13 Proton World International N.V. Protection of information contained in an electronic circuit
US20110007567A1 (en) * 2007-01-05 2011-01-13 Jean-Louis Modave Temporary locking of an electronic circuit
US20110122694A1 (en) * 2007-01-05 2011-05-26 Proton World International N.V. Limitation of the access to a resource of an electronic circuit
US8411504B2 (en) 2007-01-05 2013-04-02 Proton World International N.V. Limitation of the access to a resource of an electronic circuit
US8566931B2 (en) * 2007-01-05 2013-10-22 Proton World International N.V. Protection of information contained in an electronic circuit
US9036414B2 (en) 2007-01-05 2015-05-19 Proton World International N.V. Temporary locking of an electronic circuit to protect data contained in the electronic circuit
US20100299511A1 (en) * 2007-11-26 2010-11-25 Herve Pelletier Method of Masking the End-of-Life Transition of an Electronic Device, and a Device Including a Corresponding Control Module
US8566572B2 (en) 2007-11-26 2013-10-22 Morpho Method, device and non-transitory computer readable storage medium for masking the end-of-life transition of an electronic device

Also Published As

Publication number Publication date
JP4790717B2 (en) 2011-10-12
WO2006021686A2 (en) 2006-03-02
EP2309409B1 (en) 2015-04-22
EP1779284A2 (en) 2007-05-02
JP2011076636A (en) 2011-04-14
EP1779284B1 (en) 2014-03-19
US9454663B2 (en) 2016-09-27
US20130219522A1 (en) 2013-08-22
FR2874440A1 (en) 2006-02-24
FR2874440B1 (en) 2008-04-25
JP5254372B2 (en) 2013-08-07
JP2008510242A (en) 2008-04-03
CA2575143A1 (en) 2006-03-02
CA2575143C (en) 2016-05-31
WO2006021686A3 (en) 2006-04-13
EP2309409A1 (en) 2011-04-13
ES2543210T3 (en) 2015-08-17
ES2473326T3 (en) 2014-07-04

Similar Documents

Publication Publication Date Title
JP4079200B2 (en) External device
JP4855679B2 (en) Encapsulation of TCPA trusted platform module functionality within a server management coprocessor subsystem
Smith et al. Building a high-performance, programmable secure coprocessor
US7996911B2 (en) Memory card
US7797549B2 (en) Secure method and system for biometric verification
US6226749B1 (en) Method and apparatus for operating resources under control of a security module or other secure processor
US6820177B2 (en) Protected configuration space in a protected environment
CN102037499B (en) NFC mobile communication device and NFC reader
EP0275510B1 (en) Smart card having external programming capability and method of making same
US5442645A (en) Method for checking the integrity of a program or data, and apparatus for implementing this method
JP4172745B2 (en) Method and monitoring device for monitoring the execution of an instruction sequence by a processor
EP0785514B1 (en) Method of implementing a secure program in a microprocessor card, and microprocessor card including a secure program
CN1229705C (en) Biometric-based device and system and associated safety system
CN100535822C (en) Method for detecting and reacting against possible attack to security enforcing operation performed by a cryptographic token or card
CA2053741C (en) Access security integrated circuit
US6952778B1 (en) Protecting access to microcontroller memory blocks
US20060101047A1 (en) Method and system for fortifying software
US7788730B2 (en) Secure bytecode instrumentation facility
DK2164031T4 (en) Method and device for protecting a microcircuit from attack
Smith Trusted computing platforms: design and applications
JP2011527777A (en) Computer system equipped with a secure boot mechanism
WO2008048800A1 (en) Identification and visualization of trusted user interface objects
KR20000022696A (en) Storing data objects in a smart card memory
US5875248A (en) Method of counterfeit detection of electronic data stored on a device
EP1573466B1 (en) Enhancing data integrity and security in a processor-based system

Legal Events

Date Code Title Description
AS Assignment

Owner name: OBERTHUR CARD SYSTEMS SA, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHAMBEROT, FRANCIS;REEL/FRAME:018949/0820

Effective date: 20070104

AS Assignment

Owner name: OBERTHUR TECHNOLOGIES, FRANCE

Free format text: CHANGE OF NAME;ASSIGNOR:OBERTHUR CARD SYSTEMS SA;REEL/FRAME:029026/0923

Effective date: 20071227

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION