WO2024056293A1 - Method and system to authenticate camera device and camera data from common attacks - Google Patents

Method and system to authenticate camera device and camera data from common attacks Download PDF

Info

Publication number
WO2024056293A1
WO2024056293A1, PCT/EP2023/072408
Authority
WO
WIPO (PCT)
Prior art keywords
camera data
mcu
technique
camera
data
Prior art date
Application number
PCT/EP2023/072408
Other languages
French (fr)
Inventor
Yi Wang
Original Assignee
Continental Automotive Technologies GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Technologies GmbH filed Critical Continental Automotive Technologies GmbH
Publication of WO2024056293A1 publication Critical patent/WO2024056293A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/44Program or device authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0869Network architectures or network communication protocols for network security for authentication of entities for achieving mutual authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/10Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0816Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
    • H04L9/0838Key agreement, i.e. key establishment technique in which a shared key is derived by parties as a function of information contributed by, or associated with, each of these
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3236Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using cryptographic hash functions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3271Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using challenge-response
    • H04L9/3273Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using challenge-response for mutual authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3297Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials involving time stamps, e.g. generation of time stamps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2103Challenge-response

Definitions

  • the present subject matter is generally related to the field of security, more particularly, but not exclusively, to a method, a Micro Control Unit (MCU), and a system for authenticating a camera device and camera data.
  • a monitoring system comprises software and/or hardware components that help to monitor or supervise infrastructure, traffic conditions, applications, the cabin of a vehicle, and the like.
  • the monitoring system also helps in alerting about disruptions (for example, disruption in traffic movements), malfunctions (for example, a lift malfunction in an infrastructure), and the like.
  • although the monitoring system provides safety and security, advancement in technology has made these monitoring systems vulnerable to attacks such as third-party counterfeits, spoofing and replay attacks, and physical attacks (also referred to as perturbation attacks).
  • the present disclosure relates to a method of authenticating a camera device and camera data.
  • the method comprises receiving an access request from an imaging and sensing unit. Thereafter, the method comprises authenticating the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. Subsequently, the method comprises receiving the camera data from the imaging and sensing unit upon authenticating the access request.
  • the method comprises authenticating the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • the method comprises identifying the presence of noise in the camera data for processing the camera data upon authenticating the camera data.
  • the present disclosure relates to a Micro Control Unit (MCU) for authenticating a camera device and camera data.
  • the MCU comprises a processor and a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which on execution, cause the processor to receive an access request from an imaging and sensing unit.
  • the processor is configured to authenticate the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique.
  • the processor is configured to receive the camera data from the imaging and sensing unit upon authenticating the access request.
  • the processor is configured to authenticate the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • the processor is configured to identify presence of noise in the camera data for processing the camera data upon authenticating the camera data.
  • the present disclosure relates to a system for authenticating a camera device and camera data.
  • the system comprises an imaging and sensing unit 111 comprising the camera device and a Micro Control Unit (MCU) communicatively coupled to the imaging and sensing unit.
  • the MCU is configured to receive an access request from an imaging and sensing unit. Thereafter, the MCU is configured to authenticate the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique.
  • the MCU is configured to receive the camera data from the imaging and sensing unit upon authenticating the access request.
  • the MCU is configured to authenticate the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • the MCU is configured to identify presence of noise in the camera data for processing the camera data upon authenticating the camera data.
  • Embodiments of the disclosure according to the above-mentioned method, the MCU, and the system bring about the following technical advantages.
  • the MCU 101 of the present disclosure prevents (1) third-party counterfeits by authenticating the access request from the imaging and sensing unit 111, (2) spoofing and replay attacks by authenticating the camera data, and (3) physical attacks by identifying the presence of noise in the camera data and processing the camera data using a denoising technique for retrieving the original camera data.
  • the method steps of the present disclosure provide a robust and sequential authentication process for the camera device and camera data, wherein if any of the method (i.e., authentication) steps fails, the MCU 101 stops processing further and prevents illegitimate access to the system connected to the camera device.
  • Figure 1 illustrates an exemplary environment for authenticating a camera device and camera data in accordance with some embodiments of the present disclosure.
  • Figure 2 shows a detailed block diagram of a Micro Control Unit (MCU) in accordance with some embodiments of the present disclosure.
  • Figure 3a illustrates a flowchart showing a method for authenticating a camera device and camera data in accordance with some embodiments of present disclosure.
  • Figure 3b illustrates a flowchart showing a method for hash value verification of partial data in accordance with some embodiments of present disclosure.
  • Figure 3c illustrates a flowchart showing a method for timestamp verification in accordance with some embodiments of present disclosure.
  • Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
  • Embodiment of the present disclosure provides a solution for authenticating a camera device and camera data.
  • the present disclosure discloses a method, a Micro Control Unit (MCU) and a system to authenticate the camera device and camera data.
  • the MCU may also be referred to as an Electric Control Unit (ECU).
  • the present disclosure involves three authentication steps: first step is authenticating an access request, received from an imaging and sensing unit, using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique.
  • This aspect of authentication prevents a third-party counterfeit attack, i.e., an attack in which a camera of the imaging and sensing unit is substituted by a fake camera by an attacker/criminal and fake video captured using the fake camera is transferred to the MCU, which then makes a wrong decision based on the fake video received as input. If the authentication fails at this step, the MCU stops further processing of the input, thereby preventing the third-party counterfeit attack.
  • The second step is authenticating the camera data, received from the imaging and sensing unit, using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • This aspect of authentication prevents spoofing and replay attacks, i.e., a spoofing attack is a situation in which a person or program successfully passes off a spoofed or non-genuine entity as legitimate, and a replay attack (also referred to as a repeat attack or a playback attack) is a form of network attack in which a valid data transmission is maliciously or fraudulently repeated or delayed.
  • the attacker uses a spoofing attack to gain access to data from the camera of the imaging and sensing unit and then replays a previously recorded video. The previously recorded video data does not reflect the current real-time situation, which leads the MCU to make wrong decisions.
  • The third step is identifying the presence of noise in the camera data and processing the camera data using a denoising technique for retrieving the original camera data.
  • This aspect of authentication (i.e., the identifying step) prevents a physical attack (also referred to as a perturbation attack), i.e., an embedded perturbation or disturbance in the source image data intended to fool machine learning models (for example, classifiers) into producing a wrong output.
  • If this step fails, the MCU stops further processing of the input, thereby preventing the physical attack.
  • Figure 1 illustrates an exemplary environment for authenticating a camera device and camera data in accordance with some embodiments of the present disclosure.
  • the environment 100 includes a Micro Control Unit (MCU) 101, a communication network 109, and an imaging and sensing unit 111.
  • the MCU 101 and the imaging and sensing unit 111 together form a system for authenticating a camera device and camera data.
  • the MCU 101 is a part of a vehicle, or part of a cloud system or a backend system for authenticating a camera device and camera data, or part of an Internet of Things (IoT) system, or a part of an Artificial Intelligence (AI) module located on a local server or a cloud server or a remote server.
  • the imaging and sensing unit 111 comprises one or more camera devices.
  • the imaging and sensing unit 111 comprises one or more sensors such as, but not limited to, a camera sensor, an image sensor, and the like in addition to the camera device.
  • the MCU 101 and the imaging and sensing unit 111 communicate using the communication network 109.
  • the communication network 109 may include, but is not limited to, an e-commerce network, a Peer to Peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (for example, using Wireless Application Protocol), Internet, Wi Fi, Bluetooth, cellular network, Aircraft Data Network (ARINC664), Transport Layer Security (TLS), and the like.
  • the MCU 101 communicates with the imaging and sensing unit 111 to authenticate the camera device of the imaging and sensing unit 111 and camera data.
  • the MCU 101 includes an I/O interface 103, a memory 105 and a processor 107.
  • the I/O interface 103 is configured to communicate with the imaging and sensing unit 111.
  • the I/O interface 103 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S Video, Video Graphics Array (VGA), IEEE® 802.11 b/g/n/x, Bluetooth, cellular e.g., Code Division Multiple Access (CDMA), High Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide interoperability for Microwave access (WiMax®), Aircraft Data Network (ARINC664), Transport Layer Security (TLS), or the like.
  • the memory 105 is communicatively coupled to the processor 107 of the MCU 101
  • the memory 105 also, stores processor instructions which cause the processor 107 to execute the instructions for authenticating the camera device and camera data.
  • the processor 107 includes at least one data processor for authenticating the camera device and camera data.
  • an access request is transmitted by the imaging and sensing unit 111.
  • the MCU 101 receives the access request from the imaging and sensing unit 111.
  • the MCU 101 authenticates the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique, and a mutual authentication technique.
  • the key exchange technique comprises one of an Elliptic-curve algorithm, and a Rivest-Shamir-Adleman (RSA) algorithm.
  • the challenge response technique comprises one of a Challenge-Handshake Authentication Protocol (CHAP), an OATH Challenge- Response Algorithm (OCRA), and a Salted Challenge Response Authentication Mechanism (SCRAM).
  • the general authentication technique comprises a password-based authentication technique, a multi-factor authentication technique, a certificate-based authentication technique, a biometric authentication technique, and a token-based authentication technique.
  • the mutual authentication technique is also referred to as two-way authentication.
  • the mutual authentication technique comprises components such as Burrows-Abadi-Needham logic, digital certificate, X.509 certificate, secure sockets layer and transport layer security.
  • the mutual authentication technique includes, but is not limited to, a Hash-based Message Authentication Code (HMAC) or Secure Hash Algorithm (SHA)-256 algorithm. If the authentication fails, the MCU 101 stops processing further. Upon (successfully) authenticating the access request, the MCU 101 receives the camera data from the imaging and sensing unit 111.
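The disclosure names HMAC and SHA-256 among the mutual authentication options but does not fix a concrete protocol, so the following is only a minimal sketch of an HMAC-SHA-256 challenge-response exchange between the MCU 101 and the imaging and sensing unit 111; the pre-shared key, the function names and the message framing are illustrative assumptions.

```python
import hashlib
import hmac
import os

# Illustrative pre-shared secret, assumed to be provisioned in both the MCU
# and the imaging and sensing unit (for example during manufacturing).
SHARED_KEY = b"provisioned-demo-key"

def mcu_issue_challenge() -> bytes:
    """MCU side: generate a fresh random nonce as the challenge."""
    return os.urandom(16)

def camera_compute_response(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Imaging and sensing unit side: tag the challenge with HMAC-SHA-256."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mcu_verify_response(challenge: bytes, response: bytes,
                        key: bytes = SHARED_KEY) -> bool:
    """MCU side: recompute the expected tag and compare in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Access request flow: camera data is accepted only after a valid response.
challenge = mcu_issue_challenge()
response = camera_compute_response(challenge)
print("access granted" if mcu_verify_response(challenge, response)
      else "access denied")
```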
  • the MCU 101 authenticates the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • the MCU 101 receives a part of a first hash value and the camera data.
  • the first hash value is generated by the imaging and sensing unit 111 using a hash generator application based on at least one of a timestamp and a part of the camera data.
  • the MCU 101 generates a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data.
  • the hash generator application is one of the SHA-256 algorithm, SHA-0 algorithm, SHA-1 algorithm, SHA-2 algorithm, and SHA-3 algorithm.
  • If the part of the first hash value and a part of the second hash value match, the MCU 101 accepts the camera data. If the part of the first hash value and the part of the second hash value do not match, the MCU 101 rejects the camera data.
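As an illustration of the hash value verification of partial data, the sketch below hashes the timestamp together with only a slice of the frame and compares only a part of the resulting digests on the MCU side; the slice length, the 8-byte hash part and the byte layout are assumptions made for the example, not values specified in the disclosure.

```python
import hashlib

def partial_hash(camera_data: bytes, timestamp: int, n_bytes: int = 1024) -> bytes:
    """Hash the timestamp together with only the first n_bytes of the frame
    (the 'partial data'), avoiding the overhead of hashing the whole frame."""
    return hashlib.sha256(timestamp.to_bytes(8, "big")
                          + camera_data[:n_bytes]).digest()

# Imaging and sensing unit: compute the first hash, transmit only a part of it.
frame = bytes(range(256)) * 64          # placeholder pixel data
ts = 1694512345
first_hash = partial_hash(frame, ts)
transmitted_part = first_hash[:8]       # only a part of the first hash is sent

# MCU: compute the second hash over the same inputs and compare the parts.
second_hash = partial_hash(frame, ts)
if transmitted_part == second_hash[:8]:
    print("camera data accepted")
else:
    print("camera data rejected")
```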
  • For the timestamp verification, the MCU 101 generates a timestamp value using a timestamp received with the camera data and a predetermined factor.
  • the predetermined factor is partial camera pixel data. Performing hashing over all camera data is a huge overhead due to the size of the camera data. Therefore, to overcome this huge data overhead, only the partial camera pixel data is considered and hashed. This approach of using partial camera pixel data also ensures real-time transactions and data integrity.
  • When the timestamp value is equal to a predetermined threshold value, the MCU 101 accepts the camera data. When the timestamp value is not equal to the predetermined threshold value, the MCU 101 rejects the camera data.
  • the predetermined threshold value is a user-defined numeric value or a real-time numeric value. For example, the real-time numeric value may be the timestamp value plus the numeric value 1 or 2. If the authentication fails, the MCU 101 stops processing further. Upon (successfully) authenticating the camera data, the MCU 101 identifies the presence of noise in the camera data for processing the camera data. If no noise is identified (or present) in the camera data, the MCU 101 performs no further processing and considers the camera data to be the original camera data.
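The disclosure leaves the exact derivation of the timestamp value open, so the sketch below reduces the timestamp verification to a freshness check against a small threshold (the "timestamp value plus 1 or 2" example above); the use of the MCU's own clock, seconds as the time unit, and the omission of the predetermined factor (the partial camera pixel data, which would be bound to the frame as in the hash sketch above) are simplifying assumptions.

```python
import time

def verify_timestamp(received_ts: int, threshold: int = 2) -> bool:
    """Derive a timestamp value from the timestamp received with the camera
    data and accept the frame only if it falls within the allowed threshold."""
    timestamp_value = int(time.time()) - received_ts   # age of the frame
    return 0 <= timestamp_value <= threshold

# A frame stamped further in the past than the threshold is treated as a
# possible replay and rejected.
fresh_ts = int(time.time()) - 1
stale_ts = int(time.time()) - 60
print(verify_timestamp(fresh_ts))   # True  -> camera data accepted
print(verify_timestamp(stale_ts))   # False -> camera data rejected
```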
  • the MCU 101 processes the camera data using a denoising technique for retrieving original camera data.
  • the denoising technique comprises one of a box averaging technique, a simple convolution technique, a Gaussian convolution technique, a Gaussian filter, and denoising by averaging noisy images.
  • the original camera data may be transmitted to the vehicle or to the AI module or to the IoT system for further processing.
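As one example of the listed denoising techniques, the sketch below applies a simple box-averaging (mean) filter to a grayscale frame using NumPy; the kernel size and the border handling are illustrative choices, and any of the other listed techniques (Gaussian convolution, averaging of noisy images) could be substituted at the same point.

```python
import numpy as np

def box_average(frame: np.ndarray, k: int = 3) -> np.ndarray:
    """Denoise a 2-D grayscale frame with a k x k box-averaging filter;
    borders are handled by replicating the edge pixels."""
    pad = k // 2
    padded = np.pad(frame.astype(np.float64), pad, mode="edge")
    h, w = frame.shape
    acc = np.zeros((h, w), dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            acc += padded[dy:dy + h, dx:dx + w]
    return (acc / (k * k)).astype(frame.dtype)

# If noise was identified in the authenticated frame, the MCU can run the
# filter to approximate the original camera data before further processing.
noisy_frame = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
denoised_frame = box_average(noisy_frame)
```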
  • Figure 2 shows a detailed block diagram of a Micro Control Unit (MCU) in accordance with some embodiments of the present disclosure.
  • the MCU 101, in addition to the I/O interface 103 and the processor 107 described above, includes data 201 and one or more modules 211, which are described herein in detail.
  • the data 201 is stored within the memory 105.
  • the data 201 includes, for example, access request data 203, camera data 205, and other data 207.
  • the access request data 203 includes the access request received from the imaging and sensing unit 111.
  • the access request is a signal requesting to access the image sensor and/or camera sensor of the imaging and sensing unit 111.
  • the camera data 205 includes the camera data received from the imaging and sensing unit 111.
  • the camera data is in pixel format.
  • the other data 207 may store data, including temporary data and temporary files, generated by one or more modules 211 for performing the various functions of the MCU 101.
  • the data 201 in the memory 105 is processed by the one or more modules 211 present within the memory 105 of the MCU 101.
  • the one or more modules 211 may be implemented as dedicated hardware units.
  • the term module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a Field Programmable Gate Arrays (FPGA), Programmable System on Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • the one or more modules 211 are communicatively coupled to the processor 107 for performing one or more functions of the MCU 101. The one or more modules 211 when configured with the functionality defined in the present disclosure will result in a novel hardware.
  • the one or more modules 211 include, but are not limited to, a receiving module 213, an authenticating module 215, an identifying module 217, and a generating module 219.
  • the one or more modules 211 also, include other modules 221 to perform various miscellaneous functionalities of the MCU 101.
  • the receiving module 213 receives the access request from the imaging and sensing unit 111 through the I/O interface 103.
  • the receiving module 213 receives a part of the first hash value and the camera data through the I/O interface 103.
  • the first hash value is generated by the imaging and sensing unit 111 using a hash generator application based on at least one of a timestamp and a part of the camera data.
  • the authenticating module 215 authenticates (or verifies) the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique.
  • the authenticating module 215 authenticates (or verifies) the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • the authenticating module 215 accepts the camera data if the part of the first hash value and a part of the second hash value match and rejects the camera data if the part of the first hash value and the part of the second hash value do not match.
  • the authenticating module 215 accepts the camera data when the timestamp value is equal to a predetermined threshold value and rejects the camera data when the timestamp value is not equal to the predetermined threshold value.
  • the identifying module 217 identifies presence of noise in the camera data for processing the camera data upon authenticating the camera data. Further, the identifying module 217 processes the camera data using a denoising technique for retrieving original camera data.
  • the generating module 219 during the hash value verification of partial data, generates a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data. The generating module 219, during the timestamp verification, generates a timestamp value using a timestamp received with the camera data and a predetermined factor.
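For a purely software realization of the modules 211, their grouping inside the MCU 101 could look roughly like the sketch below; the field names mirror the module names above, but the structure itself is an assumption made for illustration and not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Any, Tuple

@dataclass
class Modules211:
    """Illustrative software grouping of the modules 211 of the MCU 101."""
    receiving_module: Any        # 213: receives access requests and camera data
    authenticating_module: Any   # 215: verifies access requests and camera data
    identifying_module: Any      # 217: detects noise and applies denoising
    generating_module: Any       # 219: generates the second hash / timestamp value
    other_modules: Tuple[Any, ...] = ()  # 221: miscellaneous functionality
```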
  • Figure 3a illustrates a flowchart showing a method for authenticating a camera device and camera data in accordance with some embodiments of present disclosure.
  • the method 300a includes one or more blocks for authenticating a camera device and camera data.
  • the method 300a may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
  • the order in which the method 300a is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
  • the receiving module 213 of the MCU 101 receives an access request from the imaging and sensing unit 111.
  • the authenticating module 215 of the MCU 101 authenticates the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique.
  • the key exchange technique comprises one of an Elliptic-curve algorithm, and a Rivest-Shamir-Adleman (RSA) algorithm.
  • the challenge response technique comprises one of a Challenge-Handshake Authentication Protocol (CHAP), an OATH Challenge-Response Algorithm (OCRA), and a Salted Challenge Response Authentication Mechanism (SCRAM).
  • the receiving module 213 of the MCU 101 receives the camera data from the imaging and sensing unit 111 upon authenticating the access request.
  • the authenticating module 215 of the MCU 101 authenticates the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
  • the identifying module 217 of the MCU 101 identifies presence of noise in the camera data for processing the camera data upon authenticating the camera data. Thereafter, the identifying module 217 of the MCU 101 processes the camera data using a denoising technique for retrieving original camera data.
  • Figure 3b illustrates a flowchart showing a method for hash value verification of partial data in accordance with some embodiments of present disclosure.
  • the receiving module 213 of the MCU 101 receives a part of a first hash value and the camera data.
  • the first hash value is generated by the imaging and sensing unit 111 using a hash generator application based on at least one of a timestamp and a part of the camera data.
  • the hash generator application is one of a Secure Hash Algorithm (SHA)-256 algorithm, SHA-0 algorithm, SHA-1 algorithm, SHA-2 algorithm and SHA-3 algorithm.
  • the generating module 219 of the MCU 101 generates a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data.
  • the authenticating module 215 of the MCU 101 accepts the camera data if the part of the first hash value and a part of the second hash value match.
  • the authenticating module 215 of the MCU 101 rejects the camera data if the part of the first hash value and the part of the second hash value do not match.
  • Figure 3c illustrates a flowchart showing a method for timestamp verification in accordance with some embodiments of present disclosure.
  • the generating module 219 of the MCU 101 generates a timestamp value using a timestamp received with the camera data and a predetermined factor.
  • the authenticating module 215 of the MCU 101 accepts the camera data when the timestamp value is equal to a predetermined threshold value.
  • the authenticating module 215 of the MCU 101 rejects the camera data when the timestamp value is not equal to the predetermined threshold value.
  • the MCU 101 of the present disclosure prevents (1) third-party counterfeits by authenticating the access request from the imaging and sensing unit 111, (2) spoofing and replay attacks by authenticating the camera data, and (3) physical attacks by identifying the presence of noise in the camera data and processing the camera data using a denoising technique for retrieving the original camera data.
  • the method steps of the present disclosure provide a robust and sequential authentication process for the camera device and camera data, wherein if any of the method (i.e., authentication) steps fails, the MCU 101 stops processing further and prevents illegitimate access to the system connected to the camera device.
  • Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
  • the computer system 400 is used to implement the MCU 101.
  • the computer system 400 includes a Central Processing Unit (“CPU” or “processor”) 402.
  • the processor 402 includes at least one data processor for authenticating a camera device and camera data.
  • the processor 402 includes specialized processing units such as, integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, and the like.
  • the processor 402 is disposed in communication with one or more Input/Output (I/O) devices (not shown in Figure 4) via I/O interface 401.
  • the I/O interface 401 employs communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S Video, Video Graphics Array (VGA), IEEE® 802.11 b/g/n/x, Bluetooth, cellular e.g., Code Division Multiple Access (CDMA), High Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide interoperability for Microwave access (WiMax®), Aircraft Data Network (ARINC664), Transport Layer Security (TLS), or the like.
  • the computer system 400 uses the I/O interface 401 to communicate with one or more I/O devices such as input devices 412 and output devices 413.
  • the input devices 412 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, and the like.
  • the output devices 413 may be a printer, fax machine, video display (e.g., Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), Light Emitting Diode (LED), plasma, Plasma Display Panel (PDP), Organic Light Emitting Diode display (OLED) or the like), audio speaker, and the like.
  • the computer system 400 consists of the MCU 101.
  • the processor 402 is disposed in communication with the communication network 109 via a network interface 403.
  • the network interface 403 communicates with the communication network 109.
  • the network interface 403 employs connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE® 802.11 a/b/g/n/x and the like.
  • the communication network 109 includes, without limitation, a direct interconnection, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet and the like.
  • the computer system 400 uses the network interface 403 and the communication network 109 to communicate with the imaging and sensing unit 111.
  • the network interface 403 employs connection protocols that include, but are not limited to, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE® 802.11 a/b/g/n/x and the like.
  • the communication network 109 includes, but is not limited to, a direct interconnection, a Peer to Peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi Fi, Aircraft Data Network (ARINC664), Transport Layer Security (TLS), and the like.
  • the processor 402 is disposed in communication with a memory 405 (e.g., RAM, ROM, and the like not shown in Figure 4) via a storage interface 404.
  • the storage interface 404 connects to memory 405 including, without limitation, memory drives, removable disc drives and the like, employing connection protocols such as, Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE® 1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI) and the like.
  • the memory drives further include a drum, magnetic disc drive, magnetooptical drive, optical drive, Redundant Array of Independent Discs (RAID), solid state memory devices, solid state drives, and the like.
  • the memory 405 stores a collection of program or database components, including, without limitation, user interface 406, an operating system 407, and the like.
  • computer system 400 stores user/application data, such as, the data, variables, records, and the like, as described in this disclosure.
  • databases may be implemented as fault tolerant, relational, scalable, secure databases such as Oracle or Sybase.
  • the operating system 407 facilitates resource management and operation of the computer system 400.
  • operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD and the like), LINUX® distributions (e.g., RED HAT®, UBUNTU®, KUBUNTU® and the like), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®, 7, 8, 10 and the like), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like.
  • the computer system 400 implements web browser 408 stored program components.
  • The web browser 408 is a hypertext viewing application, such as MICROSOFT® INTERNET EXPLORER®, GOOGLE™ CHROME™, MOZILLA® FIREFOX®, APPLE® SAFARI® and the like. Secure web browsing is provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS) and the like. The web browser 408 utilizes facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®, Application Programming Interfaces (APIs), and the like.
  • the computer system 400 implements a mail server (not shown in Figure 4) stored program component.
  • the mail server is an Internet mail server such as Microsoft Exchange, or the like.
  • the mail server utilizes facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI SCRIPTS, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS® and the like.
  • the mail server utilizes communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like.
  • the computer system 400 implements a mail client (not shown in Figure 4) stored program component.
  • the mail client is a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD®, and the like.
  • a computer readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
  • a computer readable storage medium stores instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein.
  • Computer readable medium should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read Only Memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
  • the described operations may be implemented as a method, an individual unit, system, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof.
  • the described operations may be implemented as code maintained in a “non-transitory computer readable medium”, where a processor may read and execute the code from the computer readable medium.
  • the processor is at least one of a microprocessor and a processor capable of processing and executing the queries.
  • a non-transitory computer readable medium may include media such as magnetic storage medium (e.g., hard disk drives, floppy disks, tape and the like), optical storage (CD ROMs, DVDs, optical disks and the like), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash Memory, firmware, programmable logic and the like) and the like. Further, non-transitory computer readable media include all computer readable media except for transitory, propagating signals. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC) and the like).
  • an embodiment means “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
  • Figures 3a, 3b and 3c show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified, or removed. Moreover, steps may be added to the above-described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units. Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Storage Device Security (AREA)

Abstract

The present invention discloses a method, a Micro Control Unit (MCU) (101) and a system for authenticating a camera device and camera data. The MCU (101) receives an access request from an imaging and sensing unit (111). Thereafter, the MCU (101) authenticates the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. Upon authenticating the access request, the MCU (101) receives the camera data from the imaging and sensing unit (111) and authenticates the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique. Lastly, the MCU (101) identifies the presence of noise in the camera data for processing the camera data upon authenticating the camera data.

Description

METHOD AND SYSTEM TO AUTHENTICATE CAMERA DEVICE AND CAMERA
DATA FROM COMMON ATTACKS
TECHNICAL FIELD
The present subject matter is generally related to the field of security, more particularly, but not exclusively, to a method, a Micro Control Unit (MCU), and a system for authenticating a camera device and camera data.
BACKGROUND
A monitoring system comprises software and/or hardware components that help to monitor or supervise infrastructure, traffic conditions, applications, the cabin of a vehicle, and the like. The monitoring system also helps in alerting about disruptions (for example, disruption in traffic movements), malfunctions (for example, a lift malfunction in an infrastructure), and the like. Although the monitoring system provides safety and security, advancement in technology has made these monitoring systems vulnerable to attacks such as third-party counterfeits, spoofing and replay attacks, and physical attacks (also referred to as perturbation attacks).
The information disclosed in this background of the disclosure section is for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
In an embodiment, the present disclosure relates to a method of authenticating a camera device and camera data. The method comprises receiving an access request from an imaging and sensing unit. Thereafter, the method comprises authenticating the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. Subsequently, the method comprises receiving the camera data from the imaging and sensing unit upon authenticating the access request. The method further comprises authenticating the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique. Lastly, the method comprises identifying the presence of noise in the camera data for processing the camera data upon authenticating the camera data.
In another embodiment, the present disclosure relates to a Micro Control Unit (MCU) for authenticating a camera device and camera data. The MCU comprises a processor and a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which on execution, cause the processor to receive an access request from an imaging and sensing unit. Thereafter, the processor is configured to authenticate the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. In the subsequent step, the processor is configured to receive the camera data from the imaging and sensing unit upon authenticating the access request. The processor is configured to authenticate the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique. Lastly, the processor is configured to identify the presence of noise in the camera data for processing the camera data upon authenticating the camera data.
In yet another embodiment, the present disclosure relates to a system for authenticating a camera device and camera data. The system comprises an imaging and sensing unit 111 comprising the camera device and a Micro Control Unit (MCU) communicatively coupled to the imaging and sensing unit. The MCU is configured to receive an access request from the imaging and sensing unit. Thereafter, the MCU is configured to authenticate the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. In the subsequent step, the MCU is configured to receive the camera data from the imaging and sensing unit upon authenticating the access request. The MCU is configured to authenticate the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique. Lastly, the MCU is configured to identify the presence of noise in the camera data for processing the camera data upon authenticating the camera data.
Embodiments of the disclosure according to the above-mentioned method, the MCU, and the system bring about the following technical advantages.
The MCU 101 of the present disclosure prevents (1) third-party counterfeits by authenticating the access request from the imaging and sensing unit 111, (2) spoofing and replay attacks by authenticating the camera data, and (3) physical attacks by identifying the presence of noise in the camera data and processing the camera data using a denoising technique for retrieving the original camera data.
The method steps of the present disclosure provide a robust and sequential authentication process for the camera device and camera data, wherein if any of the method (i.e., authentication) steps fails, the MCU 101 stops processing further and prevents illegitimate access to the system connected to the camera device.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and together with the description, serve to explain the disclosed principles. In the figures, the left most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described below, by way of example only, and with reference to the accompanying figures.
Figure 1 illustrates an exemplary environment for authenticating a camera device and camera data in accordance with some embodiments of the present disclosure.
Figure 2 shows a detailed block diagram of a Micro Control Unit (MCU) in accordance with some embodiments of the present disclosure.
Figure 3a illustrates a flowchart showing a method for authenticating a camera device and camera data in accordance with some embodiments of present disclosure.
Figure 3b illustrates a flowchart showing a method for hash value verification of partial data in accordance with some embodiments of present disclosure.
Figure 3c illustrates a flowchart showing a method for timestamp verification in accordance with some embodiments of present disclosure.
Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flowcharts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. While the disclosure is susceptible to various modifications and alternative forms, specific embodiment thereof has been shown by way of example in the drawings and will be described in detail below. It should be understood, however that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus proceeded by “comprises... a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
Embodiments of the present disclosure provide a solution for authenticating a camera device and camera data. The present disclosure discloses a method, a Micro Control Unit (MCU) and a system to authenticate the camera device and camera data. The MCU may also be referred to as an Electric Control Unit (ECU). The present disclosure involves three authentication steps.
The first step is authenticating an access request, received from an imaging and sensing unit, using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. This aspect of authentication prevents a third-party counterfeit attack, i.e., an attack in which a camera of the imaging and sensing unit is substituted by a fake camera by an attacker/criminal and fake video captured using the fake camera is transferred to the MCU, which then makes a wrong decision based on the fake video received as input. If the authentication fails at this step, the MCU stops further processing of the input, thereby preventing the third-party counterfeit attack.
The second step is authenticating the camera data, received from the imaging and sensing unit, using at least one of a hash value verification of partial data, and a timestamp verification technique. This aspect of authentication prevents spoofing and replay attacks; a spoofing attack is a situation in which a person or program successfully passes off a spoofed or non-genuine entity as legitimate, and a replay attack (also referred to as a repeat attack or a playback attack) is a form of network attack in which a valid data transmission is maliciously or fraudulently repeated or delayed. These attacks are carried out either by an originator or by an adversary who intercepts the data and re-transmits it. The attacker uses a spoofing attack to gain access to data from the camera of the imaging and sensing unit and then replays a previously recorded video. The previously recorded video data does not reflect the current real-time situation, which leads the MCU to make wrong decisions. If the authentication fails at this step, the MCU stops further processing of the input, thereby preventing the spoofing and replay attacks.
The third step is identifying the presence of noise in the camera data and processing the camera data using a denoising technique for retrieving the original camera data. This aspect of authentication (i.e., the identifying step) prevents a physical attack (also referred to as a perturbation attack), i.e., an embedded perturbation or disturbance in the source image data intended to fool machine learning models (for example, classifiers) into producing a wrong output. If the identifying step fails, the MCU stops further processing of the input, thereby preventing the physical attack.
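To make the three-step sequence concrete, the sketch below shows the overall control flow only; mcu and imaging_unit and their methods are hypothetical placeholders for the components described above, not an API defined by the disclosure.

```python
def authenticate_and_process(mcu, imaging_unit):
    """Three-step authentication flow; processing stops at the first failure."""
    # Step 1: authenticate the access request (blocks third-party counterfeits).
    request = imaging_unit.send_access_request()
    if not mcu.authenticate_access_request(request):
        return None
    # Step 2: authenticate the camera data (blocks spoofing and replay attacks).
    camera_data, metadata = imaging_unit.send_camera_data()
    if not mcu.authenticate_camera_data(camera_data, metadata):
        return None
    # Step 3: check for embedded perturbations (blocks physical attacks).
    if mcu.noise_present(camera_data):
        camera_data = mcu.denoise(camera_data)
    return camera_data  # original camera data, safe for further processing
```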
Figure 1 illustrates an exemplary environment for authenticating a camera device and camera data in accordance with some embodiments of the present disclosure.
As shown in Figure 1, the environment 100 includes a Micro Control Unit (MCU) 101, a communication network 109, and an imaging and sensing unit 111. In one embodiment, the MCU 101 and the imaging and sensing unit 111 together form a system for authenticating a camera device and camera data. The MCU 101 is a part of a vehicle, or part of a cloud system or a backend system for authenticating a camera device and camera data, or part of an Internet of Things (IoT) system, or a part of an Artificial Intelligence (AI) module located on a local server, a cloud server or a remote server. The imaging and sensing unit 111 comprises one or more camera devices. In one embodiment, the imaging and sensing unit 111 comprises, in addition to the camera device, one or more sensors such as, but not limited to, a camera sensor, an image sensor, and the like. The MCU 101 and the imaging and sensing unit 111 communicate using the communication network 109.
The communication network 109 may include, but is not limited to, an e-commerce network, a Peer-to-Peer (P2P) network, a Local Area Network (LAN), a Wide Area Network (WAN), a wireless network (for example, using Wireless Application Protocol), the Internet, Wi-Fi, Bluetooth, a cellular network, an Aircraft Data Network (ARINC664), Transport Layer Security (TLS), and the like.
The MCU 101 communicates with the imaging and sensing unit 111 to authenticate the camera device of the imaging and sensing unit 111 and the camera data. The MCU 101 includes an I/O interface 103, a memory 105 and a processor 107. The I/O interface 103 is configured to communicate with the imaging and sensing unit 111. The I/O interface 103 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high-speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill-Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE® 802.11b/g/n/x, Bluetooth, cellular (e.g., Code Division Multiple Access (CDMA), High Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide Interoperability for Microwave Access (WiMAX®)), Aircraft Data Network (ARINC664), Transport Layer Security (TLS), or the like. The memory 105 is communicatively coupled to the processor 107 of the MCU 101.
The memory 105, also, stores processor instructions which cause the processor 107 to execute the instructions for authenticating the camera device and camera data.
The processor 107 includes at least one data processor for authenticating the camera device and camera data.
Hereafter, the operation of the MCU 101 for authenticating the camera device and camera data is described.
When the camera device of the imaging and sensing unit 111 wants to transmit camera data to the MCU 101, an access request is transmitted by the imaging and sensing unit 111. The MCU 101 receives the access request from the imaging and sensing unit 111. The MCU 101 authenticates the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique, and a mutual authentication technique. The key exchange technique comprises one of an Elliptic-curve algorithm and a Rivest-Shamir-Adleman (RSA) algorithm. The challenge response technique comprises one of a Challenge-Handshake Authentication Protocol (CHAP), an OATH Challenge-Response Algorithm (OCRA), and a Salted Challenge Response Authentication Mechanism (SCRAM). The general authentication technique comprises a password-based authentication technique, a multi-factor authentication technique, a certificate-based authentication technique, a biometric authentication technique, and a token-based authentication technique. The mutual authentication technique is also referred to as two-way authentication. The mutual authentication technique comprises components such as Burrows-Abadi-Needham logic, digital certificates, X.509 certificates, Secure Sockets Layer and Transport Layer Security. In one embodiment, the mutual authentication technique uses, but is not limited to, a Hash-based Message Authentication Code (HMAC) or the Secure Hash Algorithm (SHA)-256 algorithm. If the authentication fails, the MCU 101 stops further processing. Upon (successfully) authenticating the access request, the MCU 101 receives the camera data from the imaging and sensing unit 111. The MCU 101 authenticates the camera data using at least one of a hash value verification of partial data and a timestamp verification technique. For performing the hash value verification of partial data, the MCU 101 receives a part of a first hash value and the camera data. The first hash value is generated by the imaging and sensing unit 111 using a hash generator application based on at least one of a timestamp and a part of the camera data. Thereafter, the MCU 101 generates a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data. The hash generator application is one of the SHA-256 algorithm, SHA-0 algorithm, SHA-1 algorithm, SHA-2 algorithm, and SHA-3 algorithm. If the part of the first hash value and a part of the second hash value match, the MCU 101 accepts the camera data. If the part of the first hash value and the part of the second hash value do not match, the MCU 101 rejects the camera data. For the timestamp verification, the MCU 101 generates a timestamp value using a timestamp received with the camera data and a predetermined factor. In one embodiment, the predetermined factor is partial camera pixel data. Hashing all of the camera data would be a huge overhead due to the size of the camera data. Therefore, to overcome this overhead, only the partial camera pixel data is considered and hashed. This approach of using the partial camera pixel data also ensures real-time transactions and data integrity. When the timestamp value is equal to a predetermined threshold value, the MCU 101 accepts the camera data. When the timestamp value is not equal to the predetermined threshold value, the MCU 101 rejects the camera data. The predetermined threshold value is a user-defined numeric value or a real-time numeric value.
For example, the real-time numeric value may be the timestamp value plus a numeric value of 1 or 2. If the authentication fails, the MCU 101 stops further processing. Upon (successfully) authenticating the camera data, the MCU 101 identifies the presence of noise in the camera data for processing the camera data. If no noise is identified (or present) in the camera data, the MCU 101 performs no further processing and considers the camera data to be the original camera data. If noise is identified (or present) in the camera data, the MCU 101 processes the camera data using a denoising technique for retrieving the original camera data. The denoising technique comprises one of a box averaging technique, a simple convolution technique, a Gaussian convolution technique, a Gaussian filter, and denoising by averaging noisy images. The original camera data may be transmitted to the vehicle or to the AI module or to the IoT system for further processing.

Figure 2 shows a detailed block diagram of a Micro Control Unit (MCU) in accordance with some embodiments of the present disclosure.
The MCU 101, in addition to the I/O interface 103 and the processor 107 described above, includes data 201 and one or more modules 211, which are described herein in detail. In an embodiment, the data 201 is stored within the memory 105. The data 201 includes, for example, access request data 203, camera data 205, and other data 207.
The access request data 203 includes the access request received from the imaging and sensing unit 111. The access request is a signal requesting to access the image sensor and/or camera sensor of the imaging and sensing unit 111.
The camera data 205 includes the camera data received from the imaging and sensing unit 111. The camera data is in pixel format.
The other data 207 may store data, including temporary data and temporary files, generated by one or more modules 211 for performing the various functions of the MCU 101.
In an embodiment, the data 201 in the memory 105 is processed by the one or more modules 211 present within the memory 105 of the MCU 101. In an embodiment, the one or more modules 211 may be implemented as dedicated hardware units. As used herein, the term module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a Field Programmable Gate Array (FPGA), a Programmable System on Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. In some implementations, the one or more modules 211 are communicatively coupled to the processor 107 for performing one or more functions of the MCU 101. The one or more modules 211, when configured with the functionality defined in the present disclosure, will result in novel hardware. In one implementation, the one or more modules 211 include, but are not limited to, a receiving module 213, an authenticating module 215, an identifying module 217, and a generating module 219. The one or more modules 211 also include other modules 221 to perform various miscellaneous functionalities of the MCU 101.
The receiving module 213 receives the access request from the imaging and sensing unit 111 through the I/O interface 103. The receiving module 213, upon authenticating the access request, receives the camera data from the imaging and sensing unit 111. During the hash value verification of partial data, the receiving module 213 receives a part of the first hash value and the camera data through the I/O interface 103. The first hash value is generated by the imaging and sensing unit 111 using a hash generator application based on at least one of a timestamp and a part of the camera data.
The authenticating module 215 authenticates (or verifies) the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. The authenticating module 215 authenticates (or verifies) the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique. During the hash value verification of partial data, the authenticating module 215 accepts the camera data if the part of the first hash value and a part of the second hash value match and rejects the camera data if the part of the first hash value and the part of the second hash value do not match. During the timestamp verification, the authenticating module 215 accepts the camera data when the timestamp value is equal to a predetermined threshold value and rejects the camera data when the timestamp value is not equal to the predetermined threshold value.
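By way of a non-limiting illustration only, a challenge-response exchange based on HMAC with SHA-256, of the kind mentioned above for the mutual authentication technique, may be sketched in Python using only the standard library. The assumption of a pre-provisioned shared secret between the MCU 101 and the imaging and sensing unit 111 is made solely for this sketch and is not prescribed by the disclosure.

import hmac
import hashlib
import os

# Assumption: MCU 101 and imaging and sensing unit 111 share a secret key
# (e.g., provisioned at manufacturing time); used here only to make the sketch concrete.
SHARED_KEY = b"pre-provisioned-secret"

def mcu_issue_challenge() -> bytes:
    """MCU side: generate a fresh random challenge (nonce)."""
    return os.urandom(16)

def camera_compute_response(challenge: bytes) -> bytes:
    """Imaging and sensing unit side: answer the challenge with HMAC-SHA-256."""
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()

def mcu_verify_response(challenge: bytes, response: bytes) -> bool:
    """MCU side: recompute the expected response and compare in constant time."""
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Example exchange: only a camera that knows SHARED_KEY passes authentication.
challenge = mcu_issue_challenge()
response = camera_compute_response(challenge)
assert mcu_verify_response(challenge, response)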
The identifying module 217 identifies presence of noise in the camera data for processing the camera data upon authenticating the camera data. Further, the identifying module 217 processes the camera data using a denoising technique for retrieving original camera data. The generating module 219, during the hash value verification of partial data, generates a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data. The generating module 219, during the timestamp verification, generates a timestamp value using a timestamp received with the camera data and a predetermined factor.
Figure 3a illustrates a flowchart showing a method for authenticating a camera device and camera data in accordance with some embodiments of present disclosure.
As illustrated in Figure 3a, the method 300a includes one or more blocks for authenticating a camera device and camera data. The method 300a may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
The order in which the method 300a is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 301, the receiving module 213 of the MCU 101 receives an access request from the imaging and sensing unit 111.
At block 303, the authenticating module 215 of the MCU 101 authenticates the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique. The key exchange technique comprises one of an Elliptic-curve algorithm and a Rivest-Shamir-Adleman (RSA) algorithm. The challenge response technique comprises one of a Challenge-Handshake Authentication Protocol (CHAP), an OATH Challenge-Response Algorithm (OCRA), and a Salted Challenge Response Authentication Mechanism (SCRAM).
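By way of a non-limiting illustration only, an Elliptic-curve key exchange of the kind listed above may be sketched as follows, assuming a recent version of the third-party Python cryptography package (pyca/cryptography) is available. Certificate handling and key provisioning are omitted, and the variable names are illustrative only.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral key pair; in a real deployment the public keys
# would additionally be certified (e.g., via X.509) to prevent substitution of a
# counterfeit camera in the middle of the exchange.
mcu_private = ec.generate_private_key(ec.SECP256R1())
camera_private = ec.generate_private_key(ec.SECP256R1())

# Both sides derive the same shared secret from their own private key and the
# peer's public key.
mcu_shared = mcu_private.exchange(ec.ECDH(), camera_private.public_key())
camera_shared = camera_private.exchange(ec.ECDH(), mcu_private.public_key())
assert mcu_shared == camera_shared

# Derive a symmetric session key from the shared secret, e.g. for HMAC-based
# authentication of subsequent camera data.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"camera-session").derive(mcu_shared)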
At block 305, the receiving module 213 of the MCU 101 receives the camera data from the imaging and sensing unit 111 upon authenticating the access request.
At block 307, the authenticating module 215 of the MCU 101 authenticates the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique.
At block 309, the identifying module 217 of the MCU 101 identifies presence of noise in the camera data for processing the camera data upon authenticating the camera data. Thereafter, the identifying module 217 of the MCU 101 processes the camera data using a denoising technique for retrieving original camera data.
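By way of a non-limiting illustration only, the noise identification and box-averaging denoising referred to above may be sketched as follows. The noise estimate (deviation from a smoothed copy) and the threshold NOISE_THRESHOLD are assumptions made for this sketch, as the disclosure does not prescribe a specific noise-detection criterion.

import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def noise_level(frame: np.ndarray) -> float:
    """Rough noise estimate: mean absolute deviation from a smoothed copy."""
    smoothed = gaussian_filter(frame.astype(float), sigma=1.0)
    return float(np.mean(np.abs(frame.astype(float) - smoothed)))

def denoise_box_average(frame: np.ndarray, kernel_size: int = 3) -> np.ndarray:
    """Box averaging: replace each pixel by the mean of its k x k neighbourhood."""
    return uniform_filter(frame.astype(float), size=kernel_size)

# Example with a synthetic noisy frame; NOISE_THRESHOLD is an assumed tuning parameter.
NOISE_THRESHOLD = 2.0
rng = np.random.default_rng(0)
clean = np.full((64, 64), 128.0)
noisy = clean + rng.normal(scale=10.0, size=clean.shape)

if noise_level(noisy) > NOISE_THRESHOLD:
    recovered = denoise_box_average(noisy)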
Figure 3b illustrates a flowchart showing a method for hash value verification of partial data in accordance with some embodiments of present disclosure.
At block 321, the receiving module 213 of the MCU 101 receives a part of a first hash value and the camera data. The first hash value is generated by the imaging and sensing unit 111 using a hash generator application based on at least one of a timestamp and a part of the camera data. The hash generator application is one of a Secure Hash Algorithm (SHA)-256 algorithm, SHA-0 algorithm, SHA-1 algorithm, SHA-2 algorithm and SHA-3 algorithm.
At block 323, the generating module 219 of the MCU 101 generates a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data.
At block 325, the authenticating module 215 of the MCU 101 accepts the camera data if the part of the first hash value and a part of the second hash value match. At block 327, the authenticating module 215 of the MCU 101 rejects the camera data if the part of the first hash value and the part of the second hash value do not match.
Figure 3c illustrates a flowchart showing a method for timestamp verification in accordance with some embodiments of present disclosure.
At block 331, the generating module 219 of the MCU 101 generates a timestamp value using a timestamp received with the camera data and a predetermined factor.
At block 333, the authenticating module 215 of the MCU 101 accepts the camera data when the timestamp value is equal to a predetermined threshold value.
At block 335, the authenticating module 215 of the MCU 101 rejects the camera data when the timestamp value is not equal to the predetermined threshold value.
Some of the technical advantages of the present disclosure are listed below.
The MCU 101 of the present disclosure prevents (1) third-party counterfeit attacks by authenticating the access request from the imaging and sensing unit 111, (2) spoofing and replay attacks by authenticating the camera data, and (3) physical attacks by identifying the presence of noise in the camera data and processing the camera data using a denoising technique for retrieving the original camera data.
The method steps of the present disclosure provide a robust and sequential authentication process for the camera device and camera data, wherein if any of the method (i.e., authentication) steps fails, the MCU 101 stops further processing and prevents illegitimate access to the system connected to the camera device.
Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
In an embodiment, the computer system 400 is used to implement the MCU 101. The computer system 400 includes a Central Processing Unit (“CPU” or “processor”) 402. The processor 402 includes at least one data processor for authenticating a camera device and camera data. The processor 402 includes specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, and the like.
The processor 402 is disposed in communication with one or more Input/Output (I/O) devices (not shown in Figure 4) via I/O interface 401. The I/O interface 401 employs communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high-speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill-Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE® 802.11b/g/n/x, Bluetooth, cellular (e.g., Code Division Multiple Access (CDMA), High Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide Interoperability for Microwave Access (WiMAX®)), Aircraft Data Network (ARINC664), or the like.
Using the I/O interface 401 , the computer system 400 communicates with one or more I/O devices such as input devices 412 and output devices 413. For example, the input devices 412 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, and the like. The output devices 413 may be a printer, fax machine, video display (e.g., Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), Light Emitting Diode (LED), plasma, Plasma Display Panel (PDP), Organic Light Emitting Diode display (OLED) or the like), audio speaker, and the like.
In some embodiments, the computer system 400 consists of the MCU 101. The processor 402 is disposed in communication with the communication network 109 via a network interface 403. The network interface 403 communicates with the communication network 109. The network interface 403 employs connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE® 802.11a/b/g/n/x and the like. The communication network 109 includes, without limitation, a direct interconnection, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet and the like. Using the network interface 403 and the communication network 109, the computer system 400 communicates with the imaging and sensing unit 111.
The communication network 109 includes, but is not limited to, a direct interconnection, a Peer-to-Peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi-Fi, Aircraft Data Network (ARINC664), Transport Layer Security (TLS), and the like.
In some embodiments, the processor 402 is disposed in communication with a memory 405 (e.g., RAM, ROM, and the like, not shown in Figure 4) via a storage interface 404. The storage interface 404 connects to the memory 405 including, without limitation, memory drives, removable disc drives and the like, employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE® 1394, Universal Serial Bus (USB), fibre channel, Small Computer Systems Interface (SCSI) and the like. The memory drives further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid state memory devices, solid state drives, and the like.
The memory 405 stores a collection of program or database components, including, without limitation, user interface 406, an operating system 407, and the like. In some embodiments, computer system 400 stores user/application data, such as, the data, variables, records, and the like, as described in this disclosure. Such databases may be implemented as fault tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system 407 facilitates resource management and operation of the computer system 400. Examples of operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD and the like), LINUX® distributions (e.g., RED HAT®, UBUNTU®, KUBUNTU® and the like), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®/7/8, 10 and the like), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like.
In some embodiments, the computer system 400 implements web browser 408 stored program components. The web browser 408 is a hypertext viewing application, such as MICROSOFT® INTERNET EXPLORER®, GOOGLE™ CHROME™, MOZILLA® FIREFOX®, APPLE® SAFARI® and the like. Secure web browsing is provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS) and the like. The web browser 408 utilizes facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®, Application Programming Interfaces (APIs), and the like.

The computer system 400 implements a mail server (not shown in Figure 4) stored program component. The mail server is an Internet mail server such as Microsoft Exchange, or the like. The mail server utilizes facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI scripts, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS® and the like. The mail server utilizes communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. The computer system 400 implements a mail client (not shown in Figure 4) stored program component. The mail client is a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD®, and the like.

Furthermore, one or more computer readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer readable storage medium stores instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read Only Memory (ROM), volatile memory, non-volatile memory, hard drives, CD-ROMs, DVDs, flash drives, disks, and any other known physical storage media.
The described operations may be implemented as a method, an individual unit, system, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code maintained in a “non-transitory computer readable medium”, where a processor may read and execute the code from the computer readable medium. The processor is at least one of a microprocessor and a processor capable of processing and executing queries. A non-transitory computer readable medium may include media such as magnetic storage media (e.g., hard disk drives, floppy disks, tape and the like), optical storage (CD-ROMs, DVDs, optical disks and the like), and volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, flash memory, firmware, programmable logic and the like). Further, non-transitory computer readable media include all computer readable media except for transitory media. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, a Programmable Gate Array (PGA), an Application Specific Integrated Circuit (ASIC) and the like).
The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise. The terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The illustrated operations of Figures 3a, 3b and 3c show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified, or removed. Moreover, steps may be added to the above-described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units. Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims. While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the scope being indicated by the following claims.
REFERRAL NUMERALS
[Table of referral numerals provided as a figure in the original publication.]
Claims

1. A method of authenticating a camera device and camera data, the method comprising:
receiving (301) an access request from an imaging and sensing unit (111);
authenticating (303) the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique;
receiving (305) the camera data from the imaging and sensing unit (111) upon authenticating the access request;
authenticating (307) the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique; and
identifying (309) presence of noise in the camera data for processing the camera data upon authenticating the camera data.
2. The method of claim 1, wherein identifying presence of noise in the camera data further comprises: processing the camera data using a denoising technique for retrieving original camera data.
3. The method of any of the claims 1 to 2, wherein the key exchange technique comprises one of an Elliptic-curve algorithm, and a Rivest-Shamir-Adleman (RSA) algorithm.
4. The method of any of the claims 1 to 3, wherein the challenge response technique comprises one of a Challenge-Handshake Authentication Protocol (CHAP), an OATH Challenge-Response Algorithm (OCRA), and a Salted Challenge Response Authentication Mechanism (SCRAM).
5. The method of any of the claims 1 to 4, wherein the hash value verification of partial data comprises:
receiving (321) a part of a first hash value and the camera data, wherein the first hash value is generated by the imaging and sensing unit (111) using a hash generator application based on at least one of a timestamp and a part of the camera data;
generating (323) a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data;
accepting (325) the camera data if the part of the first hash value and a part of the second hash value match; and
rejecting (327) the camera data if the part of the first hash value and the part of the second hash value do not match.
6. The method of claim 5, wherein the hash generator application is one of a Secure Hash Algorithm (SHA)-256 algorithm, SHA-0 algorithm, SHA-1 algorithm, SHA-2 algorithm and SHA-3 algorithm.
7. The method of any of the claims 1 to 6, wherein the timestamp verification technique comprises:
generating (331) a timestamp value using a timestamp received with the camera data and a predetermined factor;
accepting (333) the camera data when the timestamp value is equal to a predetermined threshold value; and
rejecting (335) the camera data when the timestamp value is not equal to the predetermined threshold value.
8. A Micro Control Unit (MCU) (101) for authenticating a camera device and camera data, the MCU (101) comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which on execution, cause the processor to:
receive an access request from an imaging and sensing unit (111);
authenticate the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique;
receive the camera data from the imaging and sensing unit (111) upon authenticating the access request;
authenticate the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique; and
identify presence of noise in the camera data for processing the camera data upon authenticating the camera data.
9. The MCU (101) of claim 8, wherein the MCU (101) is configured to: process the camera data using a denoising technique for retrieving original camera data.
10. The MCU (101) of any of the claims 8 to 9, wherein the key exchange technique comprises one of an Elliptic-curve algorithm and a Rivest-Shamir-Adleman (RSA) algorithm.
11. The MCU (101) of any of the claims 8 to 10, wherein the challenge response technique comprises one of a Challenge-Handshake Authentication Protocol (CHAP), an OATH Challenge-Response Algorithm (OCRA), and a Salted Challenge Response Authentication Mechanism (SCRAM).
12. The MCU (101) of any of the claims 8 to 11, wherein the MCU (101) is configured to:
receive a part of a first hash value and the camera data, wherein the first hash value is generated by the imaging and sensing unit (111) using a hash generator application based on at least one of a timestamp and a part of the camera data;
generate a second hash value using the hash generator application based on at least one of the timestamp and the part of the camera data;
accept the camera data if the part of the first hash value and a part of the second hash value match; and
reject the camera data if the part of the first hash value and the part of the second hash value do not match.
13. The MCU (101) of claim 12, wherein the hash generator application is one of a Secure Hash Algorithm (SHA)-256 algorithm, SHA-0 algorithm, SHA-1 algorithm, SHA-2 algorithm and SHA-3 algorithm.
14. The MCU (101) of any of the claims 8 to 13, wherein the MCU (101) is configured to:
generate a timestamp value using a timestamp received with the camera data and a predetermined factor;
accept the camera data when the timestamp value is equal to a predetermined threshold value; and
reject the camera data when the timestamp value is not equal to the predetermined threshold value.
15. A system for authenticating a camera device and camera data, the system comprising:
an imaging and sensing unit (111) comprising the camera device; and
a Micro Control Unit (MCU) (101) communicatively coupled to the imaging and sensing unit (111), wherein the MCU (101) is configured to:
receive an access request from the imaging and sensing unit (111);
authenticate the access request using at least one of a key exchange technique, a challenge response technique, a general authentication technique and a mutual authentication technique;
receive the camera data from the imaging and sensing unit (111) upon authenticating the access request;
authenticate the camera data using at least one of a hash value verification of partial data, and a timestamp verification technique; and
identify presence of noise in the camera data for processing the camera data upon authenticating the camera data.
PCT/EP2023/072408 2022-09-14 2023-08-14 Method and system to authenticate camera device and camera data from common attacks WO2024056293A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2213411.8A GB2622579A (en) 2022-09-14 2022-09-14 Method and system to authenticate camera device and camera data from common attacks
GB2213411.8 2022-09-14

Publications (1)

Publication Number Publication Date
WO2024056293A1 true WO2024056293A1 (en) 2024-03-21

Family

ID=83945154

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/072408 WO2024056293A1 (en) 2022-09-14 2023-08-14 Method and system to authenticate camera device and camera data from common attacks

Country Status (2)

Country Link
GB (1) GB2622579A (en)
WO (1) WO2024056293A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030095180A1 (en) * 2001-11-21 2003-05-22 Montgomery Dennis L. Method and system for size adaptation and storage minimization source noise correction, and source watermarking of digital data frames
CN105847243B (en) * 2016-03-18 2021-02-26 北京小米移动软件有限公司 Method and device for accessing intelligent camera
US20220053123A1 (en) * 2020-08-12 2022-02-17 Comcast Cable Communications, Llc Method and apparatus for independent authentication of video

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220046000A1 (en) * 2017-10-10 2022-02-10 Truepic Inc. Methods for authenticating photographic image data
US20210344675A1 (en) * 2019-04-08 2021-11-04 Tencent Technology (Shenzhen) Company Limited Identity verification method and apparatus, storage medium, and computer device
CN111460526B (en) * 2020-04-17 2021-10-12 支付宝(杭州)信息技术有限公司 Image data recording, acquiring and verifying method and device based on block chain

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NIMMY K ET AL: "A Novel Multi-factor Authentication Protocol for Smart Home Environments", 5 December 2018, ADVANCES IN DATABASES AND INFORMATION SYSTEMS; [LECTURE NOTES IN COMPUTER SCIENCE; LECT.NOTES COMPUTER], SPRINGER INTERNATIONAL PUBLISHING, CHAM, PAGE(S) 44 - 63, ISBN: 978-3-319-10403-4, XP047497201 *

Also Published As

Publication number Publication date
GB2622579A (en) 2024-03-27
GB202213411D0 (en) 2022-10-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23755405

Country of ref document: EP

Kind code of ref document: A1