WO2015173001A1 - Verification of images captured using a timestamp decoded from illumination from a modulated light source - Google Patents

Verification of images captured using a timestamp decoded from illumination from a modulated light source

Info

Publication number
WO2015173001A1
Authority
WO
WIPO (PCT)
Prior art keywords
timestamp
images
image
light
location
Prior art date
Application number
PCT/EP2015/059116
Other languages
French (fr)
Inventor
Alexander Henricus Waltherus Van Eeuwijk
William Nathan CHELTON
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Priority to CN201580038006.0A priority Critical patent/CN106471795B/en
Priority to JP2016567213A priority patent/JP6545192B2/en
Priority to RU2016148406A priority patent/RU2016148406A/en
Priority to EP15717921.9A priority patent/EP3143759A1/en
Priority to US15/311,160 priority patent/US10074030B2/en
Publication of WO2015173001A1 publication Critical patent/WO2015173001A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114Indoor or close-range type systems
    • H04B10/116Visible light communication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32128Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32144Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32144Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
    • H04N1/32149Methods relating to embedding, encoding, decoding, detection or retrieval operations
    • H04N1/32203Spatial or amplitude domain methods
    • H04N1/32208Spatial or amplitude domain methods involving changing the magnitude of selected pixels, e.g. overlay of information or super-imposition
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32144Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
    • H04N1/32149Methods relating to embedding, encoding, decoding, detection or retrieval operations
    • H04N1/32203Spatial or amplitude domain methods
    • H04N1/32251Spatial or amplitude domain methods in multilevel data, e.g. greyscale or continuous tone data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32144Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
    • H04N1/32149Methods relating to embedding, encoding, decoding, detection or retrieval operations
    • H04N1/32309Methods relating to embedding, encoding, decoding, detection or retrieval operations in colour image data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2151Time stamp
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077Types of the still picture apparatus
    • H04N2201/0084Digital still camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3204Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
    • H04N2201/3205Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of identification information, e.g. name or ID code
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3214Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3215Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3233Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document of authentication information, e.g. digital signature, watermark
    • H04N2201/3235Checking or certification of the authentication information, e.g. by comparison with data stored independently
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3253Position information, e.g. geographical position at time of capture, GPS data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3273Display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/328Processing of the additional information
    • H04N2201/3281Encryption; Ciphering

Definitions

  • the present disclosure relates to the fields of image capture and image processing, for instance by a CCTV camera or other camera such as a handheld consumer camera, and particularly to recording a captured image along with a timestamp recording the time at which the image was captured.
  • Image capture may involve capturing a single still image or a series of images over time. Either way, when the image or images are processed, this may involve associating a respective timestamp with each of the images.
  • the timestamp is added by the camera (e.g. CCTV camera), by tagging or stamping the image or image file with the time from its own internal clock. This provides an indication of the time at which the image was captured.
  • an image capturing system comprising a light source for illuminating an environment with visible light, and image capturing apparatus for capturing and recording images of the environment.
  • the light source is configured to encode a current indication of time into its light output
  • the image capturing apparatus comprises a camera configured to capture one or more images of the environment.
  • the image capturing system further comprises an image processing module configured to process one or more of the images to decode a timestamp encoded into the light, and to output an indication of the decoded timestamp; and a verification module arranged to receive the indication of the decoded timestamp output by the image processing module, and based thereon to perform an at least partially automated verification of a time at which the one or more images were captured.
  • the light source thus effectively has a dual function; i.e. the light source has a primary illumination function and a secondary data communication function.
  • For the/each image, a processor (which may be part of, or separate from, the camera) uses coded-light image processing techniques to determine the timestamp therefrom, and based on this performs an at least partially automated verification of the time at which the image was captured.
  • the determined time can be associated with the image as an indication of when the image was captured, or it can be used to verify a pre-existing timestamp associated with the image.
  • the technique can be seen as a kind of "watermarking" which enables verification as to when an image was captured.
  • the timestamp combined with the image material itself can represent a simple "proof of being there". For example when the image depicts a person at a particular location, say near an ATM or near a landmark, the time-stamp in the image together with the image provides a proof that this person was at that location at the timestamp.
  • In embodiments, in addition to the timestamp, the data modulated into the light further comprises information indicative of a location of the light source or environment, which can be used to verify the location at which the one or more images were captured.
  • For example, this information may comprise coordinates or a name of the location explicitly encoded into the light; or may comprise an identifier of the light source that is mapped to a location in a location database, the image processing module being configured to look up the location from the location database based on this identifier.
  • the "watermarking" enables verification of when and where an image was captured.
  • the verification module is configured to perform said at-least-partially automated verification by comparing the timestamp as currently decoded from said image with a previously determined version of the timestamp, e.g. as determined at the time of capture.
  • For example, the image processing module may be configured to store the one or more images in at least one image file, wherein said outputting may comprise storing the decoded timestamp in the image file as metadata of the image file; the image processing module may be configured to decode the timestamp again later from the one or more stored images; and the verification module may be configured to perform said at-least-partially automated verification by comparing the later decoded timestamp with the stored timestamp in the metadata in order to verify the stored timestamp.
  • Alternatively or additionally, the camera may add its own timestamp to at least one of said one or more images, and the verification module may be configured to perform said at-least-partially automated verification by comparing the camera's own timestamp with the decoded timestamp in order to verify the camera's timestamp.
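
A minimal sketch of such a comparison-based check follows; the function name and the one-minute tolerance are assumptions for illustration, not part of the disclosure.

```python
from datetime import datetime, timedelta

def verify_timestamp(decoded: datetime, claimed: datetime,
                     tolerance: timedelta = timedelta(minutes=1)) -> bool:
    """Return True if the timestamp claimed for the image (the camera's own
    timestamp, or a previously stored metadata timestamp) agrees with the
    timestamp decoded from the coded light, within a configurable tolerance."""
    return abs(decoded - claimed) <= tolerance

# Example: camera clock says 10:00:05, coded light says 10:00:02 -> accepted
print(verify_timestamp(datetime(2015, 5, 14, 10, 0, 2),
                       datetime(2015, 5, 14, 10, 0, 5)))
```
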
  • a device comprising: an input for receiving images captured by a camera, of an environment illuminated by visible light from a light source; an image processing module; and a verification module.
  • the image processing module is configured to process one or more of the images to decode a timestamp encoded into the light, and to output an indication of the decoded timestamp.
  • the verification module is arranged to receive the indication of the decoded timestamp output by the image processing module, and based thereon to perform an at least partially automated verification of a time at which the one or more images were captured.
  • a computer-program product comprising code embodied on a computer-readable medium and configured so as when executed on a device to implement the operations of the image processing module.
  • A method comprising: using light emitted from a light source to illuminate an environment; encoding a timestamp indicating a current time value into the light emitted by the light source; using a camera to capture one or more images of said environment; processing at least one of the images to determine the timestamp encoded into said light; and based on the time stamp as determined by said processing, performing an at least partially automated verification of a time at which the at least one image was captured.
  • Figure 1 schematically illustrates a system for emitting light and capturing an image
  • Figure 2 illustrates another system for emitting light and capturing an image
  • Figure 3 is a schematic block diagram of a light source and camera
  • Figure 4 is a flow chart illustrating a method of emitting light
  • Figure 5 is a flow chart illustrating a method of capturing and processing an image
  • Figure 6 schematically illustrates a captured image with incorporated graphical element
  • Figure 7 schematically illustrates an image file with metadata
  • Figure 8 schematically illustrates an image capture element of a rolling-shutter camera
  • Figure 9 schematically illustrates the capture of modulated light by rolling shutter.
  • Figures 1 to 3 show an image capturing system comprising a light source 2 for illuminating an environment comprising one or more objects of interest 6, and image capturing apparatus 4 comprising a camera for capturing images of the environment
  • the camera 4 may be a security camera such as a CCTV (closed circuit television) camera.
  • the light source 2 and the camera 4 may be separate units, as shown in Figure 1, or alternatively may be integrated into the same unit as shown in Figure 2.
  • the light source 2 may be a luminaire that is installed or otherwise placed in the environment for a primary purpose of illuminating the environment generally.
  • the light source 2 may be a dedicated light source integrated with the camera for the purpose of illuminating the scene to be captured.
  • the object of interest 6 may be a person whose presence at a certain time and/or place is desired to be verified, or any other object desired to be verified as being present at the time and/or place in question.
  • the light source 2 is configured to encode a current indication of time into its light output.
  • the light source 2 comprises a lighting element 14 (e.g. comprising one or more LEDs), and a controller 10 connected via a driver 12 to control the emission of light from the lighting element 14.
  • the light source 2 also comprises or has access to a clock or other form of timer 16, from which it is arranged to derive a current time value indicating a current time (e.g. current date and/or time of day, or time elapsed since some event or defined point).
  • The light source 2 may comprise a memory 18 such as a magnetic (e.g. hard disk) or electronic (e.g. flash) memory.
  • the controller 10 is configured to read the current time value from the clock or timer 16 and encode this into the light emitted by the lighting element 14 by modulating the light using a coded light technique.
  • the controller 10 may optionally also include other information in the coded light, such as an ID of the light source 2. This other information may be retrieved from the memory 18, or received over a network.
  • Coded light refers to techniques whereby a signal is embedded in the visible light emitted by a light source.
  • the light thus comprises both a visible illumination contribution for illuminating a target environment such as room (typically the primary purpose of the light), and an embedded signal for providing information into the target environment.
  • the light is modulated at a certain modulation frequency or frequencies.
  • the amplitude of the light may be switched between different levels at high frequency to represent different symbols (amplitude keying), or the modulation frequency may be switched between two or more different frequencies to represent symbols (frequency shift keying).
  • Techniques for encoding data into light will in themselves be familiar to a person skilled in the art.
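
As a rough illustration of how a current time value might be serialized and modulated onto the light output, the sketch below uses frequency-shift keying; the carrier frequencies, symbol rate, modulation depth and bit framing are arbitrary assumptions for illustration, and the packet framing and error coding a real coded-light system would need are omitted.

```python
import math

# Hypothetical FSK parameters (assumptions for illustration only):
# bit 0 -> 2000 Hz, bit 1 -> 2400 Hz, 100 symbols/s, shallow modulation depth
F0_HZ, F1_HZ = 2000.0, 2400.0
SYMBOL_RATE = 100.0
SAMPLE_RATE = 48000.0
MOD_DEPTH = 0.1          # +/-10% around the nominal light level

def timestamp_to_bits(unix_seconds: int, width: int = 32) -> list[int]:
    """Serialize a current time value as a fixed-width big-endian bit string."""
    return [(unix_seconds >> (width - 1 - i)) & 1 for i in range(width)]

def fsk_dimming_waveform(bits: list[int]) -> list[float]:
    """Map each bit to a burst of the corresponding carrier frequency.

    The returned samples are relative drive levels around 1.0, i.e. the
    light keeps illuminating at its nominal level while carrying data."""
    samples = []
    n_per_symbol = int(SAMPLE_RATE / SYMBOL_RATE)
    for bit in bits:
        f = F1_HZ if bit else F0_HZ
        for n in range(n_per_symbol):
            t = n / SAMPLE_RATE
            samples.append(1.0 + MOD_DEPTH * math.sin(2 * math.pi * f * t))
    return samples

if __name__ == "__main__":
    drive = fsk_dimming_waveform(timestamp_to_bits(1431600000))
    print(len(drive), "drive-level samples, mean ~", sum(drive) / len(drive))
```
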
  • the controller 10 may be implemented in software stored on a memory of the light source 2 (e.g. memory 18) and arranged for execution on a processor of the light source. Alternatively it is not excluded that some or all of the functionality of the controller 10 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware such as a PGA or FPGA.
  • the clock or timer 16 may comprise a general purpose clock or timer of the processor in combination with software of the controller configured to determine the relevant representation of the current time based thereon.
  • the controller 10 and/or the clock or timer 16 may be integrated into the same unit or fixture as the light source 2, or may be external to it (e.g. providing instructions to the light source 2 over a network).
  • the image-capturing apparatus 4 comprises a camera 20 which is configured to capture an image or a series of images of the environment.
  • The image capturing apparatus 4 also comprises an image processing module 22, the input of which is connected to receive the captured images from the output of the camera 20.
  • the image capturing apparatus comprises a memory 24 where the captured images are stored.
  • the image processing module 22 may be implemented as a portion of software stored on a memory of the image capturing apparatus 4 (e.g. memory 24) and arranged for execution on a processor of the image capturing apparatus 4. Alternatively, it is not excluded that some or all of the functionality of the image processing module 22 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware such as a PGA or FPGA.
  • the image processing module 22 may be integrated into the same unit as the camera 20, or may be separate from it (e.g. running on a separate computer).
  • the image processing module 22 locates the light footprint 8 of the light source in the image, and then uses coded-light image processing techniques to determine therefrom a time at which the image was captured.
  • For instance, EP 2,503,852 describes how rolling-shutter type techniques can be used not only to detect the data signaled by the coded light component, but also the spatial footprint of an individual coded light component separated from amongst any other light components in the image.
  • It is not necessary for the light source 2 itself to appear in the captured image, as long as light from the light source 2 is captured in the image (e.g. the light may comprise reflected light, or light received through a diffuser).
  • Figure 8 represents the image capture element 36 of the camera 20.
  • the image capture element 36 comprises an array of pixels for capturing signals representative of light incident on each pixel, e.g. typically a square or rectangular array of square or rectangular pixels.
  • The pixels are arranged into a plurality of lines 38 (e.g. horizontal rows), which a rolling-shutter camera exposes in sequence rather than all at once.
  • the system as presented in WO 2012/127439 illuminates its environment and at the same time superimposes a modulation on the light output that although substantially imperceptible to the unaided human eye, is visible for the rolling shutter camera.
  • the visible light communication system of WO2012/127439 is merely exemplary of the systems that can be used in conjunction with the present invention. This is illustrated schematically in Figure 9. As each successive line 38 is exposed, it is exposed at a slightly different time and therefore (if the line rate is high enough compared to the modulation frequency) at a slightly different phase of the modulation. Thus each line 38 is exposed to a respective instantaneous level of the modulated light. This results in a pattern of stripes which undulates or cycles with the modulation over a given frame. Based on this principle, the image processing module 22 is able to detect coded light components modulated into light received by the camera 20.
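
The following simplified sketch illustrates the rolling-shutter principle described above: each exposed line of a frame is collapsed to one sample, and the modulation then shows up in the spectrum of that per-line signal. Real decoders (e.g. as described in WO 2012/127439 and EP 2,503,852) are considerably more involved; the line rate, frame size and modulation frequency used here are assumptions for illustration.

```python
import numpy as np

def line_signal(frame: np.ndarray) -> np.ndarray:
    """Collapse a rolling-shutter frame (H x W, grayscale) into one sample per
    exposed line by averaging across each row, then remove the DC (steady
    illumination) component so only the modulation remains."""
    rows = frame.astype(np.float64).mean(axis=1)
    return rows - rows.mean()

def dominant_modulation_frequency(frame: np.ndarray, line_rate_hz: float) -> float:
    """Estimate the coded-light modulation frequency from the stripe pattern.

    line_rate_hz is the rate at which successive sensor lines are exposed;
    the per-line signal is effectively sampled at that rate."""
    sig = line_signal(frame)
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / line_rate_hz)
    spectrum[0] = 0.0                     # ignore any residual DC
    return float(freqs[int(np.argmax(spectrum))])

# Synthetic example: 480 lines exposed at 20 kHz line rate, 2 kHz modulation
lines = np.arange(480) / 20000.0
fake_frame = 128 + 12 * np.sin(2 * np.pi * 2000.0 * lines)[:, None] * np.ones((1, 640))
print(round(dominant_modulation_frequency(fake_frame, 20000.0)))   # -> 2000
```
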
  • the controller 10 of the light source 2 retrieves an ID of the light source from its memory 18 (or receives it over a network).
  • the ID uniquely identifies the light source amongst a plurality of other light sources within a certain system (which in embodiments may each emit light encoded with their own respective IDs), e.g. within a room, building or complex.
  • the controller 10 also obtains an indication of the current time from the clock or timer 16 of the light source 2 (or receives this over a network).
  • the controller 10 then controls the lighting element 14 of the light source 2 to emit light modulated according to a suitable coded light technique, so as to encode the ID and time into the emitted light.
  • the light source may provide a cryptographically verifiable timestamp.
  • The timestamp may be encrypted using a symmetric or an asymmetric cipher.
  • the timestamp may be combined with a digitally signed cryptographic message digest or hash of the message. In this manner the time-stamp remains in clear form and parties in possession of the proper key can verify message authenticity, by matching the timestamp with the accompanying message digest or hash.
  • The particular solution to choose may depend on the application and requirements. For example, if the timestamp should be available in the clear, the choice may depend on whether the coded light information channel has sufficient bandwidth to accommodate the overhead of a digitally signed hash. For still images that have limited bandwidth, encryption may be most suitable because it does not add additional bits, whereas for higher-bandwidth applications such as video (which consists of multiple image frames) the inclusion of a plain-text timestamp with a digitally signed hash may be beneficial because it also provides the timestamp in the clear.
  • the camera 20 receives the light (directly and/or reflected) from the light source 2 and thereby captures an image (or images) of the environment.
  • The image is stored in a memory, e.g. memory 24.
  • the image processing module 22 locates the part of the image which comprises the coded light from the light source 2 in question based on its ID (i.e. locates the "footprint" 8 of the coded light). For example this may be performed in accordance with the techniques set out in EP 2,503,852.
  • the image processing module 22 decodes the coded light as appearing in the image, in order to read the timestamp.
  • The image processing module then outputs an indication of the timestamp in association with the image, so that at step T40 it can be used to verify the time at which the image (or images) was (or were) captured, and hence that the one or more objects of interest 6 appearing in the image were present in the environment in question at that time.
  • The indication of the timestamp may be output verbatim in the form it was received, or may be interpreted or reformatted to be output in a different form. For instance it could be interpreted from a computer-readable form to a human-readable form, and/or reduced from a higher time resolution to a lower time resolution (e.g. if the timestamp is accurate to a higher degree than is of interest, say seconds or milliseconds, the image processing module 22 may output it in terms of only, say, minutes, hours or even just the date).
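
A trivial sketch of such reformatting, assuming the decoded timestamp arrives as Unix seconds (an assumption; the actual wire format is not specified here):

```python
from datetime import datetime, timezone

def humanize(decoded_unix_seconds: int, resolution: str = "minute") -> str:
    """Reformat a decoded timestamp from a machine representation into a
    human-readable form, optionally dropping unneeded precision."""
    dt = datetime.fromtimestamp(decoded_unix_seconds, tz=timezone.utc)
    fmt = {"date": "%Y-%m-%d", "minute": "%Y-%m-%d %H:%M",
           "second": "%Y-%m-%d %H:%M:%S"}[resolution]
    return dt.strftime(fmt)

print(humanize(1431600000))   # '2015-05-14 10:40'
```
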
  • the image processing module 22 may be configured to store an indication of the timestamp in metadata 34 of the image file 30.
  • When the one or more images are captured, the image processing module 22 records them by storing them in at least one image file 30 (e.g. multiple images per file in the case of a video, or a different file per image in the case of still images).
  • the image file 30 comprises a portion of image data 32 which represents the actual graphical part of the image that will be viewed on screen, i.e. the actual pixels.
  • the image file 30 also comprises a metadata portion 34 (e.g. a header or side information) comprising data which is not part of the image per se, but which is associated with it.
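
The sketch below stands in for the metadata portion 34 using a JSON sidecar file written next to the image; a real implementation would more likely write into the image format's own metadata container (e.g. an EXIF field or file header), which is only assumed here by analogy.

```python
import json
from pathlib import Path
from typing import Optional

def store_with_timestamp(image_path: str, decoded_timestamp: str,
                         light_source_id: Optional[str] = None) -> Path:
    """Record the decoded coded-light timestamp (and optionally the decoded
    light-source ID) as metadata associated with a stored image.  A JSON
    sidecar file is used here as a simple stand-in for the metadata portion
    34 of the image file itself."""
    meta = {"coded_light_timestamp": decoded_timestamp}
    if light_source_id is not None:
        meta["light_source_id"] = light_source_id
    sidecar = Path(image_path + ".meta.json")
    sidecar.write_text(json.dumps(meta, indent=2))
    return sidecar

# Example usage (hypothetical file name and values):
# store_with_timestamp("frame_0001.jpg", "2015-05-14T10:00:02Z", "luminaire-17")
```
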
  • the timestamp is not (or at least is not only) output to the user for the user to verify the time, but is output to a verification module 23 to perform an at least partially automated verification procedure (refer again to Figure 3).
  • the verification module 23 may be implemented as another portion of software stored on a memory of the image capturing apparatus 4 (e.g. memory 24) and arranged for execution on a processor of the image capturing apparatus 4. Alternatively, again it is not excluded that some or all of the functionality of the verification module 23 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware such as a PGA or FPGA.
  • the verification module 23 may be integrated into the same unit as the camera 20 and/or image processing module 22, or may be separate from one or both of them.
  • the timestamp obtained from the coded light can be used to verify a pre-existing timestamp associated with the image.
  • the image processing module 22 is configured to decode the timestamp again later from the one or more stored images
  • the verification module 23 is configured to compare this with the stored timestamp in the metadata 34 in order to verify the stored timestamp. For instance it may be desired to check that the image has not been tampered with. In that case, the timestamp as previously decoded from the image (e.g. at the time of capture) and stored in association with it (e.g. in metadata 34 in the image file 30) should match the timestamp as decoded from that (allegedly) same image on a subsequent occasion.
  • the verification module 23 is arranged to check this by comparing a stored version of the timestamp with a version of the (supposedly) same timestamp now extracted from the stored image.
  • Alternatively or additionally, the camera 20 is one which adds its own timestamp to at least one of the one or more images, and the verification module 23 is configured to compare the camera's own timestamp with the decoded timestamp in order to verify the camera's timestamp. For instance it may not be certain whether the camera's time-stamping is reliable or trustworthy, or it may not be known whether the timestamp added by the camera has been tampered with.
  • the verification module 23 may be configured to verify this by checking the camera's (apparent) timestamp against the coded light timestamp extracted from the image in question.
  • the timestamp is encrypted using a symmetric or asymmetric cipher, such that the timestamp is provided in encrypted format and a corresponding decryption key is required to access the timestamp. In this manner it will be difficult for a third party to provide a "fake" timestamp.
  • the timestamp as modulated into the light is signed using a private key of a trusted party, e.g. a private key of an operator or supplier of the light source 2, or an associate thereof.
  • the verification module 23 is configured to verify the decoded timestamp using a corresponding public key.
  • The message comprising the timestamp also comprises a cryptographic hash signed with a private key of a private-public key pair (and optionally a digital certificate to aid verification).
  • the verification module 23 can then verify authenticity of the data using the corresponding public key, i.e. the device can verify that the timestamp did indeed originate from a light source 2 of the trusted operator, supplier or associate.
  • proof of authenticity is provided using a cryptographically signed hash of the timestamp in conjunction with the time stamp.
  • Timestamp₁ = Timestamp, E_PK(Private key, f_hash(Timestamp))
  • where f_hash represents a cryptographic hash function and E_PK(.) represents an asymmetric public key cipher.
  • Timestamp₁ above allows verification of the Timestamp by decrypting the hash using the corresponding public key of the cryptographic key-pair, calculating the cryptographic hash of the timestamp, and comparing the decrypted and calculated hash values.
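
A minimal sketch of this sign-and-verify flow, assuming an RSA key pair with PKCS#1 v1.5 padding and SHA-256 as the hash; these algorithm choices, the key size and the timestamp format are assumptions for illustration, not mandated by the disclosure.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# --- at the (trusted) light source side: sign the timestamp ---
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

timestamp = b"2015-05-14T10:00:02Z"
signature = private_key.sign(timestamp, padding.PKCS1v15(), hashes.SHA256())
# Coded-light payload, roughly Timestamp_1 = (timestamp, signature)

# --- at the verification module: check the signed hash ---
def verify(ts: bytes, sig: bytes) -> bool:
    """Verify that the signed hash matches the (clear-text) timestamp."""
    try:
        public_key.verify(sig, ts, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

print(verify(timestamp, signature))                 # True
print(verify(b"2015-05-14T23:59:59Z", signature))   # False: tampered timestamp
```
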
  • the image processing module 22 may be configured to remove the modulation from said one or more images once the timestamp has been decoded, stored or verified.
  • the image processing module 22 may be configured to detect when the timestamp has been used (e.g. by detecting it has now been displayed to the user, incorporated as a graphical element 28 in the image, stored as metadata 34 and/or verified by the verification module 23), and then subsequently cancel the coded light component from the image data 32 as stored in the stored image file 30 and/or as displayed on screen to the user.
  • When the coded light signal is modulated in a manner analogous to WO 2012/127439, the coded light signal typically leaves a visible trace of the signal around the lights or on the area illuminated by the lights. This manifestation of the data from the coded light signal within the resulting image may be considered undesirable by the viewer.
  • post-processing may be used to remove part of the resulting patterns.
  • Part of the coded light detection is to determine the footprint of the coded light; i.e. where in the image the coded light signal is available. After decoding the very same footprint may be used to filter out the coded light signal from the image, e.g. by means of a filter that primarily filters in the direction orthogonal to the lines of the rolling shutter, thereby reducing the visibility of the coded light signal within the footprint.
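
A simplified sketch of such footprint filtering follows, assuming a grayscale frame and a boolean footprint mask; the moving-average filter and its length are arbitrary choices standing in for whatever filter a real implementation would use.

```python
import numpy as np

def suppress_coded_light(frame: np.ndarray, footprint: np.ndarray,
                         kernel_len: int = 15) -> np.ndarray:
    """Reduce the visibility of the coded-light stripes inside the detected
    footprint by low-pass filtering in the direction orthogonal to the
    rolling-shutter lines (i.e. down the columns)."""
    kernel = np.ones(kernel_len) / kernel_len
    smoothed = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), axis=0,
        arr=frame.astype(np.float64))
    out = frame.astype(np.float64)
    out[footprint] = smoothed[footprint]           # only touch the footprint
    return np.clip(out, 0, 255).astype(frame.dtype)

# Example: filter a synthetic striped frame inside a central footprint
frame = (128 + 12 * np.sin(np.arange(480) * 0.5))[:, None] * np.ones((1, 640))
mask = np.zeros_like(frame, dtype=bool)
mask[100:380, 200:440] = True
print(suppress_coded_light(frame.astype(np.uint8), mask).shape)
```
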
  • the information (timestamp, optional location data and/or identifier, or similar data) stored in the metadata is stored in a manner that uses cryptographic techniques to prevent or at least complicate tampering.
  • The camera may use cryptographic techniques to secure the timestamp, or the entire metadata, thereby rendering it more difficult to tamper with the timestamp data.
  • the timestamp could be encrypted using a symmetric or asymmetric cipher, such that only an application with the corresponding keys can access the timestamp.
  • the camera may provide the timestamp combined with a cryptographically signed hash covering at least the timestamp (and optionally a digital certificate to aid verification). By using these cryptographic techniques the authenticity of the timestamp can be verified even after any visible artefacts in the still or video images have been removed through post-processing.
  • proof of authenticity is provided using a cryptographically signed hash of the timestamp (or the entire metadata) in conjunction with the time stamp.
  • Metadata₁ = Timestamp, E_PK(Private key, f_hash(Timestamp))
  • where f_hash represents a cryptographic hash function and E_PK(.) represents an asymmetric public key cipher.
  • Metadata₁ above allows verification of the Timestamp by decrypting the hash using the corresponding public key of the cryptographic key-pair, calculating the cryptographic hash of the timestamp, and comparing the decrypted and calculated hash values.
  • Such cryptographic protection can be applied in two places: the first is in the coded light signal (which is as secure as the coded light system is secure), and the second is in the proof as generated in the camera (which is as secure as the camera is secure). There are some differences between the first and second cases, the main difference being the available bandwidth.
  • A cryptographically signed hash is typically fairly short - it is a kind of checksum that is difficult to fake. Verifying the checksum, however, requires the public key and certificate.
  • The certificate is the public key needed to decrypt the hash, which in turn is signed by a trusted third party. So the certificate allows for checking that the public key (needed to check the hash) is approved of by the trusted third party.
  • This certificate may take up quite a number of bits, which can readily be appended to a file like a JPG, but in some systems may be more than can be readily transmitted over coded light (e.g. it would require a long detection time). Hence in some embodiments with limited coded light bandwidth, the inclusion of the certificate with the timestamp may be more applicable to the second case, where the hash is applied by the receiving camera, rather than the first case, where it is applied in the coded light signal.
  • the public key and/or certificate could be distributed by a separate means (e.g. via a network or memory stick).
  • the timestamp may optionally also be output to a user, allowing the user him- or herself to verify the time at which the image was captured.
  • what is being verified is the user's own perception, impression, memory or belief of when the image was captured or purported or alleged to have been captured.
  • the user may recognize the event in the image but wish to have confirmed his or her memory of when the event took place.
  • the user may be shown the image by another who reports the event to have taken place at a certain time, but the user wishes to check that other person's assertion.
  • the image processing module 22 may be configured to modify the actual image 26 - i.e. the actual pixels of the image - to incorporate an indication of the timestamp as a graphical element 28 in the image, and to store the image in this modified form.
  • the image processing module 22 may also write-protect or encrypt the modified image 26 to prevent tampering.
  • the image processing module 22 may be configured to store an indication of the timestamp in the metadata 34 of the image file 30.
  • In the embodiment of Figure 6, it is the image data 32 which is modified to indicate the timestamp rather than the timestamp being included as metadata, but in the embodiment of Figure 7 the timestamp is indicated in the metadata 34.
  • the user him- or herself can also view the timestamp from the metadata 34, e.g. by viewing file details of the file 30 in a file system, operating system or browser on a computer.
  • the controller 10 of the light source 2 may also be configured to encode an indication of the light source's location into the light it emits.
  • the image processing module 22 at the receive side is further able to obtain, from the processing of the one or more images, information indicative of the location at which the one or more images were captured.
  • the location information may comprise geographic coordinates or a name of the location embedded directly into the light, e.g. as the GPS coordinates of the light source 2 (the light source 2 may even be equipped with a GPS receiver for this purpose) or name of the location that is meaningful to a human (e.g. name of a room, building, street, etc.).
  • Alternatively or additionally, the location information embedded in the light may comprise an identifier of the light source 2 that is mapped to a location in a location database (not shown). In this case the image processing module is configured to look up the location from the location database based on said identifier.
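
A minimal sketch of that database lookup, with a hypothetical in-memory mapping standing in for the location database (the ID, location name and coordinates are invented for illustration):

```python
from typing import Optional

# Hypothetical location database mapping light-source IDs to locations;
# in practice this would be a networked service or a real database.
LOCATION_DB = {
    "luminaire-17": {"name": "ATM lobby, Main Street branch",
                     "lat": 52.3702, "lon": 4.8952},
}

def lookup_location(light_source_id: str) -> Optional[dict]:
    """Resolve a light-source identifier decoded from the coded light to a
    location record, as the image processing module would do against the
    location database."""
    return LOCATION_DB.get(light_source_id)

print(lookup_location("luminaire-17"))
```
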
  • the disclosed system can provide a kind of "watermarking" which creates a proof of when and optionally where an image was captured, e.g. allowing a user to demonstrate "I was there".
  • any of the verification techniques discussed above in relation to the time stamp may be applied to this "location stamp".
  • The captured image stored in the image file 30 may be modified to incorporate the location as a graphical element 28 analogous to Figure 6, and/or the location may be included in the metadata 34 of the stored image file 30 as in Figure 7.
  • The information indicative of the location may be encrypted or cryptographically signed in the coded light signal, and/or encrypted or cryptographically signed at the camera, in a similar manner to any of the options discussed above in relation to the timestamp.
  • Such a device would be similar to the claimed device in that it would be a device comprising an input for receiving images captured by a camera, of an environment illuminated by light from a visible light source; and an image processing module configured to process one or more of the images to decode data representing a location modulated into said light, and to output an indication of the decoded location for verification of a location where the one or more images were captured.
  • any of the above techniques may be combined with a facial recognition algorithm or other image recognition algorithm in order to additionally identify a person (or even multiple people) in any of the one or more images from which the timestamp (and in embodiments location stamp) has been decoded - i.e. to detect the presence of a person (human) and to identify who that person is. For instance, if a face or part of a face can be discerned in one or more of the images, this may allow the facial recognition algorithm to identify the person to whom that face belongs.
  • the person may be identified in terms of any suitable identifier, such as a personal name ("Dave Smith" or the like), or a nickname, or an email address, or a username of a certain communication system such as a username of a social networking system or VoIP system.
  • The facial and/or other image recognition algorithm may be combined into an app or other application along with the timestamp detection and location stamp detection, to log the fact that the identified person or persons were at a certain place at a certain time (and therefore enable a user to prove that fact).
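
As an illustration of such logging, the sketch below assembles a "proof of being there" record from the outputs of the (not shown) facial recognition, timestamp decoding and location lookup steps; the record fields, identifiers and log format are assumptions for illustration.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PresenceRecord:
    """One logged 'proof of being there': who, where and when, as verified
    from a captured image plus the coded-light timestamp and location."""
    person_id: str
    location: str
    decoded_timestamp: str
    logged_at: float

def log_presence(person_id: str, location: str, decoded_timestamp: str,
                 logfile: str = "presence_log.jsonl") -> PresenceRecord:
    """Append one verified presence record to a local line-delimited log."""
    record = PresenceRecord(person_id, location, decoded_timestamp, time.time())
    with open(logfile, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")
    return record

# Example: the facial-recognition step returned "dave.smith", the coded light
# yielded the timestamp and the location database supplied the place.
# log_presence("dave.smith", "ATM lobby, Main Street branch", "2015-05-14T10:00:02Z")
```
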
  • This could be used as part of one or more of: a location-based game (where a result in the game depends on the person or persons being at one or more specified locations at a specified time or times), a retail rewards scheme (where a reward to the user depends on the user having been at one or more specified locations at a specified time or times), a social networking system (where the user may wish to prove that his or her activity is real), and/or a celebrity spotting application (an example of which is discussed further below).
  • The application may log the fact of the identified person or persons being at the verified location at the verified time in a number of ways. For example this may be logged locally on a user device of the user, e.g. on a portable user device such as a smart phone, tablet or laptop comprising the camera 20, image processing module 22 and verification module 23 incorporated into that user device. This way, the user can show the logged result to one or more others.
  • the fact of the identified person or persons being at the verified location at the verified time could be logged on a server or other computer via a network.
  • the user may submit a captured image from the camera 20 over a wide area network (WAN) such as the internet or a mobile cellular network (e.g. 3GPP network) to a server, at which the verification module 23 is implemented.
  • this could be a server of a social networking system, or a server hosting a location-based game. This way the verification can be performed centrally and made available to one or more other users to see.
  • the need for verification may exist in a game. Take for example a game in which players have to travel according to a predetermined itinerary (predetermined schedule for a predetermined journey, route or set of locations). Evidence that the players were indeed present at locations which are part of the itinerary at certain time slots can be verified by means of captured images.
  • the images may be created by means of fixed cameras at the locations in question, or alternatively the images may be captured by means of mobile devices of the players. More and more mobiles with cameras are designed to take so-called "selfies" which contain the face of the person who took the image.
  • the image which is captured by the player of the game at a certain location at the certain moment can then be verified automatically by a device or application according to the invention.
  • This device or application could be implemented in the mobile user terminal which was used to capture the image (e.g. as part of a "selfie contest").
  • Alternatively, the device or application may be (at least partly) implemented remotely, e.g. on a server or other computer accessed via a network.
  • the device or application may then, e.g. shortly after the capturing of the image, verify whether the image was indeed made at the relevant location within a certain time slot based on knowledge of the codes in the emitted light of the light sources at that location.
  • the device or application can compare the code which is embedded in the emitted light with codes which have been provided as part of the game.
  • the device or application is also capable of facial recognition. By combining code verification and facial recognition the device can verify that the user who is recognized in the image was present at the location at the specific time.
  • The code in the emitted light comprises or is based on a timestamp, and preferably is also related to the game in question.
  • By "based on" it is meant that the code which is transmitted does not have to be a direct indication of the actual time. It also means that the signal in the emitted light is not constant over time, but rather fluctuates as a function of time.
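
One hypothetical way to realize a code that is "based on" a timestamp without being a direct indication of the time is an HMAC over the current time slot, in the spirit of TOTP; the shared key, slot length and truncation below are assumptions for illustration, not part of the disclosure.

```python
import hashlib
import hmac
import time
from typing import Optional

SECRET_KEY = b"shared-game-secret"   # assumption: provisioned to light sources and verifier
SLOT_SECONDS = 60                    # assumption: the code changes once per minute

def time_based_code(unix_seconds: Optional[float] = None) -> str:
    """Derive a short code from the current time slot.  The code fluctuates
    as a function of time but is not itself a readable timestamp; only
    parties holding the key can reproduce (and hence verify) it for a
    given time slot."""
    if unix_seconds is None:
        unix_seconds = time.time()
    slot = int(unix_seconds // SLOT_SECONDS)
    digest = hmac.new(SECRET_KEY, str(slot).encode(), hashlib.sha256).hexdigest()
    return digest[:8]                # truncate to fit a low-bandwidth channel

print(time_based_code(1431600000.0))
```
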
  • The game is based on collecting points by being present at a certain location at a certain moment. For instance in bars or pubs, there are spots which are typically occupied by the most senior guys or the alpha types. Showing by means of an image that the user (or another person) was at such a spot at a certain time can be important in the social game.
  • Another use of creating evidence that the user was at a certain location at a certain time may be found in retail. For instance, customers may receive discounts, or faster service, by providing evidence that they were in the shop or at another location. Again the image must be captured in the neighborhood of light sources emitting specific (time based) codes or with the light sources even directly visible in the images.
  • Yet another possible application is to use facial recognition for identifying or authenticating an alleged identity of a further person (say a celebrity) in the same image or sequence of images, and automatically verifying that the image supports an assertion that the particular/multiple persons in question was/were present (together) at the same location at the same time.
  • the carrier of a computer program may be any entity or device capable of carrying the program.
  • the carrier may include a storage medium, such as a ROM, for example, a CD ROM or a semiconductor ROM, or a magnetic recording medium, for example, a hard disk.
  • the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted to perform, or used in the performance of, the relevant method.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Studio Devices (AREA)
  • Optical Communication System (AREA)
  • Editing Of Facsimile Originals (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A device comprising: an input for receiving images captured by a camera, of an environment illuminated by light from a light source; an image processing module; and a verification module. The image processing module is configured to process one or more of the images to decode a timestamp modulated into the light, and to output an indication of the decoded timestamp. The verification module is arranged to receive the indication of the decoded timestamp output by the image processing module, and based thereon to perform an at least partially automated verification of a time at which the one or more images were captured.

Description

VERIFICATION OF IMAGES CAPTURED USING A TIMESTAMP DECODED FROM
ILLUMINATION FROM A MODULATED LIGHT SOURCE
TECHNICAL FIELD
The present disclosure relates to the fields of image capture and image processing, for instance by a CCTV camera or other camera such as a handheld consumer camera, and particularly to recording a captured image along with a timestamp recording the time at which the image was captured.
BACKGROUND
Image capture may involve capturing a single still image or a series of images over time. Either way, when the image or images are processed, this may involve associating a respective timestamp with each of the images. The timestamp is added by the camera (e.g. CCTV camera), by tagging or stamping the image or image file with the time from its own internal clock. This provides an indication of the time at which the image was captured.
However, in the case where a time stamp has been associated with an image, there may be a concern that the timestamp has been tampered with. In the case where no time stamp has been associated with an image, this may cause problems in that the time at which the image was captured is not known.
SUMMARY
According to one aspect disclosed herein, there is provided an image capturing system comprising a light source for illuminating an environment with visible light, and image capturing apparatus for capturing and recording images of the environment. The light source is configured to encode a current indication of time into its light output, and the image capturing apparatus comprises a camera configured to capture one or more images of the environment. The image capturing system further comprises an image processing module configured to process one or more of the images to decode a timestamp encoded into the light, and to output an indication of the decoded timestamp; and a verification module arranged to receive the indication of the decoded timestamp output by the image processing module, and based thereon to perform an at least partially automated verification of a time at which the one or more images were captured. The light source thus effectively has a dual function; i.e. the light source has a primary illumination function and a secondary data communication function. Further, for the/each image, a processor (which may be part of, or separate from, the camera) uses coded-light image processing techniques to determine the timestamp therefrom, and based on this performs an at least partially automated verification of the time at which the image was captured.
The determined time can be associated with the image as an indication of when the image was captured, or it can be used to verify a pre-existing timestamp associated with the image. Thus, the technique can be seen as a kind of "watermarking" which enables verification as to when an image was captured. The timestamp combined with the image material itself can represent a simple "proof of being there". For example when the image depicts a person at a particular location, say near an ATM or near a landmark, the time-stamp in the image together with the image provides a proof that this person was at that location at the timestamp.
In embodiments, in addition to the timestamp, the data modulated into the light further comprises information indicative of a location of the light source or environment, which can be used to verify the location at which the one or more images were captured. For example, this information may comprise coordinates or a name of the location explicitly encoded into the light; or may comprise an identifier of the light source that is mapped to a location in a location database, the image processing module being configured to look up the location from the location database based on this identifier.
In this case the "watermarking" enables verification of when and where an image was captured.
In further embodiments, the verification module is configured to perform said at-least-partially automated verification by comparing the timestamp as currently decoded from said image with a previously determined version of the timestamp, e.g. as determined at the time of capture. For example, the image processing module may be configured to store the one or more images in at least one image file, wherein said outputting may comprise storing the decoded timestamp in the image file as metadata of the image file; the image processing module may be configured to decode the timestamp again later from the one or more stored images; and the verification module may be configured to perform said at-least-partially automated verification by comparing the later decoded timestamp with the stored timestamp in the metadata in order to verify the stored timestamp. Alternatively or additionally, as another example the camera may add its own timestamp to at least one of said one or more images, and the verification module may be configured to perform said at-least-partially automated verification by comparing the camera's own timestamp with the decoded timestamp in order to verify the camera's timestamp.
According to another aspect disclosed herein, there is provided a device comprising: an input for receiving images captured by a camera, of an environment illuminated by visible light from a light source; an image processing module; and a verification module. The image processing module is configured to process one or more of the images to decode a timestamp encoded into the light, and to output an indication of the decoded timestamp. The verification module is arranged to receive the indication of the decoded timestamp output by the image processing module, and based thereon to perform an at least partially automated verification of a time at which the one or more images were captured.
According to another aspect, there is provided a computer-program product comprising code embodied on a computer-readable medium and configured so as when executed on a device to implement the operations of the image processing module.
According to yet another aspect disclosed herein, there is provided a method comprising: using light emitted from a light source to illuminate an environment; encoding a timestamp indicating a current time value into the light emitted by the light source; using a camera to capture one or more images of said environment; processing at least one of the images to determine the timestamp encoded into said light; and based on the timestamp as determined by said processing, performing an at least partially automated verification of a time at which the at least one image was captured.
BRIEF DESCRIPTION OF THE DRAWINGS
To assist understanding of the present disclosure and to show how embodiments may be put into effect, reference is made by way of example to the accompanying drawings in which:
Figure 1 schematically illustrates a system for emitting light and capturing an image,
Figure 2 illustrates another system for emitting light and capturing an image,
Figure 3 is a schematic block diagram of a light source and camera,
Figure 4 is a flow chart illustrating a method of emitting light,
Figure 5 is a flow chart illustrating a method of capturing and processing an image,
Figure 6 schematically illustrates a captured image with incorporated graphical element,
Figure 7 schematically illustrates an image file with metadata,
Figure 8 schematically illustrates an image capture element of a rolling-shutter camera, and
Figure 9 schematically illustrates the capture of modulated light by rolling shutter.
DETAILED DESCRIPTION OF EMBODIMENTS
Figures 1 to 3 show an image capturing system comprising a light source 2 for illuminating an environment comprising one or more objects of interest 6, and image capturing apparatus 4 comprising a camera for capturing images of the environment (including the object or objects of interest). For example the camera 4 may be a security camera such as a CCTV (closed circuit television) camera. The light source 2 and the camera 4 may be separate units, as shown in Figure 1, or alternatively may be integrated into the same unit as shown in Figure 2. For instance in the case of Figure 1, the light source 2 may be a luminaire that is installed or otherwise placed in the environment for a primary purpose of illuminating the environment generally. In the case of Figure 2, the light source 2 may be a dedicated light source integrated with the camera for the purpose of illuminating the scene to be captured. The object of interest 6 may be a person whose presence at a certain time and/or place is desired to be verified, or any other object desired to be verified as being present at the time and/or place in question.
The light source 2 is configured to encode a current indication of time into its light output. Referring to Figure 3, the light source 2 comprises a lighting element 14 (e.g. comprising one or more LEDs), and a controller 10 connected via a driver 12 to control the emission of light from the lighting element 14. The light source 2 also comprises or has access to a clock or other form of timer 16, from which it is arranged to derive a current time value indicating a current time (e.g. current date and/or time of day, or time elapsed since some event or defined point). Furthermore the light source 2 may comprise a memory 18 such as a magnetic (e.g. hard disk) or electronic (e.g. flash) memory. The controller 10 is configured to read the current time value from the clock or timer 16 and encode this into the light emitted by the lighting element 14 by modulating the light using a coded light technique. In embodiments, the controller 10 may optionally also include other information in the coded light, such as an ID of the light source 2. This other information may be retrieved from the memory 18, or received over a network.
Coded light refers to techniques whereby a signal is embedded in the visible light emitted by a light source. The light thus comprises both a visible illumination contribution for illuminating a target environment such as a room (typically the primary purpose of the light), and an embedded signal for providing information into the environment. To do this, the light is modulated at a certain modulation frequency or frequencies. For instance, the amplitude of the light may be switched between different levels at high frequency to represent different symbols (amplitude keying), or the modulation frequency may be switched between two or more different frequencies to represent symbols (frequency shift keying). Techniques for encoding data into light will in themselves be familiar to a person skilled in the art.
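By way of a non-limiting illustration, the following Python sketch shows how a timestamp string might be turned into a sequence of drive levels using simple amplitude keying with Manchester coding. The symbol rate, the two dimming levels and the Manchester framing are assumptions made purely for this example and are not prescribed by the present disclosure; Manchester coding is used here only because it keeps the average light level constant, which helps keep the modulation imperceptible.

```python
import time

# Illustrative coded-light modulator: amplitude keying with Manchester coding.
# The symbol rate and the two dimming levels are assumptions for this example.
SYMBOL_RATE_HZ = 2000  # assumed, well above the frequency the eye can perceive

def manchester_halfbits(payload: bytes):
    """Expand payload bytes into Manchester-coded half-bits (MSB first)."""
    for byte in payload:
        for i in range(7, -1, -1):
            bit = (byte >> i) & 1
            yield (1, 0) if bit else (0, 1)   # 1 -> high/low, 0 -> low/high

def drive_levels(payload: bytes, high: float = 1.0, low: float = 0.7):
    """Map half-bits onto two dimming levels around the nominal output."""
    for first, second in manchester_halfbits(payload):
        yield high if first else low
        yield high if second else low

# Encode the current time as an ASCII ISO-8601 timestamp.
timestamp = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()).encode("ascii")
levels = list(drive_levels(timestamp))
# A real luminaire would write these levels to the LED driver at
# 2 * SYMBOL_RATE_HZ updates per second; here we only inspect the waveform.
print(len(levels), "driver updates for", timestamp.decode())
```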
In embodiments the controller 10 may be implemented in software stored on a memory of the light source 2 (e.g. memory 18) and arranged for execution on a processor of the light source. Alternatively it is not excluded that some or all of the functionality of the controller 10 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware such as a PGA or FPGA. In the case of a partially or wholly software-based implementation, the clock or timer 16 may comprise a general purpose clock or timer of the processor in combination with software of the controller configured to determine the relevant representation of the current time based thereon.
The controller 10 and/or the clock or timer 16 may be integrated into the same unit or fixture as the light source 2, or may be external to it (e.g. providing instructions to the light source 2 over a network).
At the receiver side, the image-capturing apparatus 4 comprises a camera 20 which is configured to capture an image or a series of images of the environment. The image capturing apparatus 4 also comprises an image processing module 22, connected to receive the captured images from the output of the camera 20. Furthermore, the image capturing apparatus comprises a memory 24 where the captured images are stored.
The image processing module 22 may be implemented as a portion of software stored on a memory of the image capturing apparatus 4 (e.g. memory 24) and arranged for execution on a processor of the image capturing apparatus 4. Alternatively, it is not excluded that some or all of the functionality of the image processing module 22 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware such as a PGA or FPGA. The image processing module 22 may be integrated into the same unit as the camera 20, or may be separate from it (e.g. running on a separate computer).
For each captured image, the image processing module 22 locates the light footprint 8 of the light source in the image, and then uses coded-light image processing techniques to determine therefrom a time at which the image was captured. For instance EP 2,503,852 describes how rolling-shutter type techniques can be used not only to detect the data signaled by the coded light component, but also the spatial footprint of an individual coded light component separated from amongst any other light components in the environment.
Note that it is not necessary for the light source 2 itself to appear in the captured image, as long as light from the light source 2 is captured in the image (e.g. the light may comprise reflected light, or light received through a diffuser).
Figure 8 represents the image capture element 36 of the camera 20. The image capture element 36 comprises an array of pixels for capturing signals representative of light incident on each pixel, e.g. typically a square or rectangular array of square or rectangular pixels. In a rolling-shutter camera, the pixels are arranged into a plurality of lines, e.g. horizontal rows 38, which are exposed in sequence line-by-line. That is, to capture a given frame, first one line 38 is exposed to the light in the target environment, then the next line in the sequence is exposed starting at a slightly later time, and so forth. Typically the sequence "rolls" in order across the frame, e.g. in rows top to bottom, hence the name "rolling shutter". When used to capture coded light, this means different lines within a frame capture the light at different times and therefore, if the line rate is high enough relative to the modulation frequency, at different phases of the modulation waveform. Thus the modulation in the light can be detected.
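To make the rolling-shutter principle concrete, the following sketch (illustrative numbers only) simulates how rows exposed at slightly different times sample a sinusoidally modulated light level, producing the stripe pattern described above, and how the modulation frequency can be recovered from the per-row intensities.

```python
import numpy as np

# Illustrative numbers only: 1 kHz sinusoidal modulation, rows read out
# 10 microseconds apart, a 480 x 640 frame.
MOD_FREQ_HZ = 1000.0
LINE_TIME_S = 10e-6
ROWS, COLS = 480, 640

row_times = np.arange(ROWS) * LINE_TIME_S              # exposure time of each row
row_level = 0.5 + 0.5 * np.sin(2 * np.pi * MOD_FREQ_HZ * row_times)

# Each row sees the instantaneous light level at its own exposure time,
# which is what produces the horizontal stripe pattern described above.
frame = np.tile(row_level[:, None], (1, COLS))

# The per-row mean is effectively a time series sampled at the line rate,
# so a simple FFT peak recovers the modulation frequency from the stripes.
row_means = frame.mean(axis=1)
spectrum = np.abs(np.fft.rfft(row_means - row_means.mean()))
freqs = np.fft.rfftfreq(ROWS, d=LINE_TIME_S)
print("dominant stripe frequency ~", round(freqs[spectrum.argmax()]), "Hz")
```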
In WO 2012/127439 for example, it has been disclosed how coded light can be detected using a "rolling shutter" camera of this type. The signal detection exploits the rolling shutter image capture, which causes temporal light modulations to translate to spatial intensity variations over successive image rows of pixels.
The system as presented in WO 2012/127439 illuminates its environment and at the same time superimposes a modulation on the light output that, although substantially imperceptible to the unaided human eye, is detectable by the rolling-shutter camera. The visible light communication system of WO 2012/127439 is merely exemplary of the systems that can be used in conjunction with the present invention. The principle is illustrated schematically in Figure 9. As each successive line 38 is exposed, it is exposed at a slightly different time and therefore (if the line rate is high enough compared to the modulation frequency) at a slightly different phase of the modulation. Thus each line 38 is exposed to a respective instantaneous level of the modulated light. This results in a pattern of stripes which undulates or cycles with the modulation over a given frame. Based on this principle, the image processing module 22 is able to detect coded light components modulated into light received by the camera 20.
In Figures 1 and 2, the encoding of the light is depicted as causing "stripes" in the light footprint 8 of the light source 2. It will be appreciated that the encoding will be imperceptible to the human eye, but will be apparent as stripes in an image captured by the camera 20 due to its use of a rolling-shutter image sensor 36.
In alternative embodiments, it is also possible to capture coded light using a global shutter camera if the frame rate is high enough relative to the modulation frequency of the data in the coded light signal.
The steps performed by the light source 2 are summarized in Figure 4. At step S10 the controller 10 of the light source 2 retrieves an ID of the light source from its memory 18 (or receives it over a network). The ID uniquely identifies the light source amongst a plurality of other light sources within a certain system (which in embodiments may each emit light encoded with their own respective IDs), e.g. within a room, building or complex. At step S20 the controller 10 also obtains an indication of the current time from the clock or timer 16 of the light source 2 (or receives this over a network). At step S30, the controller 10 then controls the lighting element 14 of the light source 2 to emit light modulated according to a suitable coded light technique, so as to encode the ID and time into the emitted light.
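A minimal sketch of the transmit-side flow of steps S10 to S30 follows; the frame layout (a 16-bit ID field plus a 32-bit POSIX time field) and the modulate() stand-in for the LED driver are illustrative assumptions rather than part of this disclosure.

```python
import struct
import time

# Sketch of the controller flow of Figure 4. The frame layout (16-bit light
# source ID followed by a 32-bit POSIX time) and modulate() are assumptions.
def build_payload(light_source_id: int, now: float) -> bytes:
    # S10: the light source ID; S20: the current time; packed into one frame.
    return struct.pack(">HI", light_source_id, int(now))

def modulate(payload: bytes) -> None:
    # S30: hand the frame to the coded-light modulator / LED driver.
    # Stand-in for hardware access; here the frame is simply printed.
    print("emitting", payload.hex())

modulate(build_payload(light_source_id=0x2A17, now=time.time()))
```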
Notably, instead of (or in addition to) transmitting a unique identifier and a current indication of time, the light source may provide a cryptographically verifiable timestamp. For example the timestamp may be encrypted using a symmetric or an asymmetric cipher, thereby requiring receiving parties to decrypt the timestamp. Alternatively the timestamp may be combined with a digitally signed cryptographic message digest or hash of the message. In this manner the timestamp remains in clear form and parties in possession of the proper key can verify message authenticity, by matching the timestamp with the accompanying message digest or hash.
The particular solution to choose may depend on the application and requirements. For example, should the timestamp be available in the clear, the choice may depend on whether the coded light information channel has sufficient bandwidth to accommodate the overhead of a digitally signed hash. For still images, where the available coded-light bandwidth is limited, encryption may be most suitable because it does not add additional bits, whereas for higher-bandwidth applications such as video (which consists of multiple image frames) the inclusion of a plain-text timestamp with a digitally signed hash may be beneficial because it also provides the timestamp in the clear.
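Purely as a back-of-envelope illustration of this trade-off, the following sketch compares the payload sizes of the two options; every number in it (timestamp size, cipher block size, signature size, coded-light bit rate) is an assumption chosen only for the example.

```python
# Back-of-envelope comparison of the two options; every number is an
# assumption chosen only to illustrate the bandwidth trade-off.
TIMESTAMP_BYTES = 4        # e.g. a 32-bit POSIX time
CIPHER_BLOCK_BYTES = 16    # encrypted timestamp padded to one AES block
SIGNATURE_BYTES = 256      # digitally signed hash (e.g. RSA-2048)
BIT_RATE_BPS = 1000        # assumed coded-light data rate

options = {
    "encrypted timestamp only": CIPHER_BLOCK_BYTES,
    "plain-text timestamp + signed hash": TIMESTAMP_BYTES + SIGNATURE_BYTES,
}
for name, size in options.items():
    seconds = 8 * size / BIT_RATE_BPS
    print(f"{name}: {size} bytes, ~{seconds:.2f} s of coded-light air time")
```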
The steps performed by the receive side apparatus 4 are summarized in Figure 5. At step T10 the camera 20 receives the light (directly and/or reflected) from the light source 2 and thereby captures an image (or images) of the environment. The image is stored in a memory, e.g. memory 24. At step T20, the image processing module 22 locates the part of the image which comprises the coded light from the light source 2 in question based on its ID (i.e. locates the "footprint" 8 of the coded light). For example this may be performed in accordance with the techniques set out in EP 2,503,852. At step T30, the image processing module 22 decodes the coded light as appearing in the image, in order to read the timestamp.
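A structural sketch of steps T10 to T30 is given below. The locate_footprint() and demodulate() functions are placeholders standing in for the coded-light detection referenced above; here they simply return canned data so the example runs end to end, and the frame layout matches the transmit-side sketch.

```python
import struct

import numpy as np

# Structural sketch of the receive-side flow (T10-T30). locate_footprint() and
# demodulate() are placeholders for the coded-light detection referenced above;
# they return canned data here so the example runs end to end.
def locate_footprint(frame: np.ndarray, wanted_id: int) -> np.ndarray:
    # T20: in a real system, return a mask of pixels lit by the wanted source.
    return np.ones(frame.shape, dtype=bool)

def demodulate(frame: np.ndarray, footprint: np.ndarray) -> bytes:
    # T30: in a real system, turn the stripe pattern inside the footprint back
    # into the data frame; here a canned frame in the transmit-side layout.
    return struct.pack(">HI", 0x2A17, 1431388800)

def decode_timestamp(frame: np.ndarray, wanted_id: int) -> int:
    footprint = locate_footprint(frame, wanted_id)
    light_id, posix_time = struct.unpack(">HI", demodulate(frame, footprint))
    assert light_id == wanted_id, "footprint belongs to a different light source"
    return posix_time

print(decode_timestamp(np.zeros((480, 640)), wanted_id=0x2A17))
```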
The image processing module then outputs an indication of the timestamp in association with the image, so that at step T40 it can be used to verify the time at which the image (or images) was (or were) captured, and hence that the one or more objects of interest 6 appearing in the image were present in the environment in question at that time.
The indication of the timestamp may be output verbatim in the form it was received, or may be interpreted or reformatted to be output in a different form. For instance it could be interpreted from a computer-readable form to a human-readable form, and/or reduced from a higher time resolution to a lower time resolution (e.g. if the timestamp is accurate to a higher degree than is of interest, e.g. to seconds or milliseconds, the image processing module 22 may output it in terms of only, say, minutes, hours or even just the date).
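As a small illustration of such reformatting, assuming the decoded value is a POSIX time in seconds:

```python
from datetime import datetime, timezone

# The decoded value is assumed here to be POSIX seconds; reducing the output
# to minute or date resolution is an illustrative choice.
def format_timestamp(posix_time: int, resolution: str = "minutes") -> str:
    dt = datetime.fromtimestamp(posix_time, tz=timezone.utc)
    if resolution == "date":
        return dt.strftime("%Y-%m-%d")
    if resolution == "minutes":
        return dt.strftime("%Y-%m-%d %H:%M")
    return dt.isoformat()          # full machine-readable resolution

print(format_timestamp(1431388800))            # -> "2015-05-12 00:00"
print(format_timestamp(1431388800, "date"))    # -> "2015-05-12"
```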
There are a number of options for how the outputted timestamp is used.
For example as illustrated in Figure 7, the image processing module 22 may be configured to store an indication of the timestamp in metadata 34 of the image file 30. When the one or more images are captured, the image processing module 22 records them by storing them in at least one image file 30 (e.g. multiple images per file in the case of a video, or a different file per image in the case of still images). The image file 30 comprises a portion of image data 32 which represents the actual graphical part of the image that will be viewed on screen, i.e. the actual pixels. But the image file 30 also comprises a metadata portion 34 (e.g. a header or side information) comprising data which is not part of the image per se, but which is associated with it. In embodiments, the timestamp is not (or at least is not only) output to the user for the user to verify the time, but is output to a verification module 23 to perform an at least partially automated verification procedure (refer again to Figure 3). The verification module 23 may be implemented as another portion of software stored on a memory of the image capturing apparatus 4 (e.g. memory 24) and arranged for execution on a processor of the image capturing apparatus 4. Alternatively, again it is not excluded that some or all of the functionality of the verification module 23 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware such as a PGA or FPGA. The verification module 23 may be integrated into the same unit as the camera 20 and/or image processing module 22, or may be separate from one or both of them.
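A minimal sketch of the image-file arrangement just described follows, using a plain Python structure as a stand-in for the image file 30 with its image data portion 32 and metadata portion 34; a real implementation would more likely use an existing container format such as EXIF.

```python
from dataclasses import dataclass, field

# Stand-in for the image file 30 of Figure 7: pixel data (portion 32) plus a
# metadata portion (34). A real implementation would likely use EXIF or a
# similar container, but the structure is the same.
@dataclass
class ImageFile:
    image_data: bytes                              # the actual pixels (32)
    metadata: dict = field(default_factory=dict)   # side information (34)

def store_with_timestamp(pixels: bytes, decoded_posix_time: int) -> ImageFile:
    image_file = ImageFile(image_data=pixels)
    image_file.metadata["coded_light_timestamp"] = decoded_posix_time
    return image_file

stored = store_with_timestamp(b"\x00" * (640 * 480), 1431388800)
print(stored.metadata)
```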
There are a number of possibilities for the automated verification by the verification module 23.
In embodiments the timestamp obtained from the coded light can be used to verify a pre-existing timestamp associated with the image. In one such embodiment the image processing module 22 is configured to decode the timestamp again later from the one or more stored images, and the verification module 23 is configured to compare this with the stored timestamp in the metadata 34 in order to verify the stored timestamp. For instance it may be desired to check that the image has not been tampered with. In that case, the timestamp as previously decoded from the image (e.g. at the time of capture) and stored in association with it (e.g. in metadata 34 in the image file 30) should match the timestamp as decoded from that (allegedly) same image on a subsequent occasion. The verification module 23 is arranged to check this by comparing a stored version of the timestamp with a version of the (supposedly) same timestamp now extracted from the stored image.
As another example, the camera 20 is one which adds its own timestamp to at least one of the one or more images, and the verification module 23 is configured to compare the camera's own timestamp with the decoded timestamp in order to verify the camera's timestamp. For instance it may not be certain whether the camera's time-stamping is reliable or trustworthy, or it may not be known whether the timestamp added by the camera has been tampered with. The verification module 23 may be configured to verify this by checking the camera's (apparent) timestamp against the coded light timestamp extracted from the image in question.
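The two automated checks just described might look as follows in sketch form, reusing the ImageFile structure from the earlier sketch; the 120-second tolerance on the camera clock is an arbitrary assumption.

```python
# Sketch of the two automated checks, reusing the ImageFile structure above.
# redecode_timestamp is whatever function re-runs the coded-light decoding on
# the stored pixels; the 120 s clock tolerance is an arbitrary assumption.
MAX_CLOCK_SKEW_S = 120

def verify_stored_timestamp(image_file, redecode_timestamp) -> bool:
    """Re-decode from the stored pixels and compare with the metadata copy."""
    redecoded = redecode_timestamp(image_file.image_data)
    return redecoded == image_file.metadata["coded_light_timestamp"]

def verify_camera_timestamp(camera_posix_time: int, decoded_posix_time: int) -> bool:
    """Check the camera's own timestamp against the coded-light timestamp."""
    return abs(camera_posix_time - decoded_posix_time) <= MAX_CLOCK_SKEW_S

print(verify_camera_timestamp(1431388860, 1431388800))   # True: 60 s apart
```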
In one example the timestamp is encrypted using a symmetric or asymmetric cipher, such that the timestamp is provided in encrypted format and a corresponding decryption key is required to access the timestamp. In this manner it will be difficult for a third party to provide a "fake" timestamp.
In yet another example, the timestamp as modulated into the light is signed using a private key of a trusted party, e.g. a private key of an operator or supplier of the light source 2, or an associate thereof. In this example, the verification module 23 is configured to verify the decoded timestamp using a corresponding public key.
In one advantageous implementation the message comprising the timestamp also comprises a cryptographic hash signed with a private key of a private-public key pair (and optionally a digital certificate to aid verification). The verification module 23 can then verify authenticity of the data using the corresponding public key, i.e. the device can verify that the timestamp did indeed originate from a light source 2 of the trusted operator, supplier or associate.
Preferably proof of authenticity is provided using a cryptographically signed hash of the timestamp in conjunction with the timestamp itself:
Timestamp_1 = Timestamp | E_PK(Private key, f_hash(Timestamp))
wherein
the "|" operator signifies concatenation of data,
f_hash represents a cryptographic hash function, and
E_PK(.) represents an asymmetric public-key cipher.
Timestamp_1 above allows the verification of the Timestamp by decrypting the hash using the corresponding public key of the cryptographic key-pair, calculating the cryptographic hash of the timestamp, and comparing the decrypted and calculated hash values.
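The construction above can be sketched with the third-party Python "cryptography" package as follows; RSA with PKCS#1 v1.5 padding and SHA-256 is just one possible choice of asymmetric cipher and hash, and the library's sign/verify calls internally perform the hash-and-compare step described above.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Illustrative signed-timestamp construction using the "cryptography" package.
# RSA/PKCS#1 v1.5 with SHA-256 is one possible choice of cipher and hash.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

timestamp = b"2015-04-28T12:00:00Z"
signature = private_key.sign(timestamp, padding.PKCS1v15(), hashes.SHA256())
message = timestamp + b"|" + signature          # Timestamp | signed hash

# Verification side: split at the first "|" and check the signature.
received_timestamp, _, received_signature = message.partition(b"|")
try:
    public_key.verify(received_signature, received_timestamp,
                      padding.PKCS1v15(), hashes.SHA256())
    print("timestamp authentic:", received_timestamp.decode())
except InvalidSignature:
    print("timestamp rejected")
```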
In further embodiments, optionally the image processing module 22 may be configured to remove the modulation from said one or more images once the timestamp has been decoded, stored or verified. Although the modulation in the coded light is generally too high frequency to be visible, it is possible in some situations that it could leave artefacts in the captured image. To avoid this, the image processing module 22 may be configured to detect when the timestamp has been used (e.g. by detecting it has now been displayed to the user, incorporated as a graphical element 28 in the image, stored as metadata 34 and/or verified by the verification module 23), and then subsequently cancel the coded light component from the image data 32 as stored in the stored image file 30 and/or as displayed on screen to the user.
For example when the coded light signal is modulated in a manner analogous to WO 2012/127439, the coded light signal typically leaves a residual pattern around the lights or on the area illuminated by the lights. This manifestation of the data from the coded light signal within the resulting image may be considered undesirable by the viewer. To this end post-processing may be used to remove part of the resulting patterns. Part of the coded light detection is to determine the footprint of the coded light; i.e. where in the image the coded light signal is available. After decoding, the very same footprint may be used to filter out the coded light signal from the image, e.g. by means of a filter that primarily filters in the direction orthogonal to the lines of the rolling shutter, thereby reducing the visibility of the coded light signal within the footprint.
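A minimal NumPy sketch of this post-processing idea is given below: inside the detected footprint, each column is low-pass filtered along the rolling-shutter direction so the stripes are suppressed while pixels outside the footprint are left untouched. The 15-row kernel and the rectangular footprint mask are arbitrary choices for the example.

```python
import numpy as np

# Suppress the stripe pattern inside the detected footprint by smoothing along
# the rolling-shutter direction (down each column). The 15-row kernel and the
# rectangular footprint are arbitrary choices for this example.
def remove_stripes(frame: np.ndarray, footprint: np.ndarray, kernel_rows: int = 15) -> np.ndarray:
    kernel = np.ones(kernel_rows) / kernel_rows
    smoothed = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, frame.astype(float))
    cleaned = frame.astype(float).copy()
    cleaned[footprint] = smoothed[footprint]    # only touch pixels in the footprint
    return cleaned

# Usage with a synthetic striped frame and a rectangular footprint mask.
frame = 0.5 + 0.2 * np.sin(np.arange(480))[:, None] * np.ones((480, 640))
footprint = np.zeros((480, 640), dtype=bool)
footprint[100:300, 200:400] = True
print(remove_stripes(frame, footprint).shape)
```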
In particular in the latter case, preferably care should be taken that the information (timestamp, optional location data and/or identifier, or similar data) stored in the metadata is stored in a manner that uses cryptographic techniques to prevent or at least complicate tampering.
One possible solution to this effect would be to have the camera secure the timestamp, or the entire metadata, using cryptographic techniques, thereby rendering it more difficult to tamper with the timestamp data. The timestamp could be encrypted using a symmetric or asymmetric cipher, such that only an application with the corresponding keys can access the timestamp. Alternatively the camera may provide the timestamp combined with a cryptographically signed hash covering at least the timestamp (and optionally a digital certificate to aid verification). By using these cryptographic techniques the authenticity of the timestamp can be verified even after any visible artefacts in the still or video images have been removed through post-processing.
Preferably proof of authenticity is provided using a cryptographically signed hash of the timestamp (or the entire metadata) in conjunction with the timestamp itself:
Metadata_1 = Timestamp | E_PK(Private key, f_hash(Timestamp))
wherein
the "|" operator signifies concatenation of data,
f_hash represents a cryptographic hash function, and
E_PK(.) represents an asymmetric public-key cipher.
Metadata_1 above allows the verification of the Timestamp by decrypting the hash using the corresponding public key of the cryptographic key-pair, calculating the cryptographic hash of the timestamp, and comparing the decrypted and calculated hash values.
Note therefore, there are two possibilities given in the above description for the use of a cryptographically signed hash. The first is in the coded light signal (which is as secure as the coded light system is secure), and the second is in the proof as generated in the camera (which is as secure as the camera is secure). There are some differences between the first and second cases, the main difference being the available bandwidth. The cryptographically signed hash is typically fairly short - it is a kind of checksum that is difficult to fake. But verifying the checksum requires the public key and certificate. The certificate comprises the public key needed to decrypt the hash, and is in turn signed by a trusted third party. So the certificate allows for checking that the public key (needed to check the hash) is approved of by the trusted third party. This certificate may take up quite a number of bits, which can readily be appended to a file such as a JPG, but in some systems may be more than can be readily transmitted over coded light (e.g. it would require a long detection time). Hence in some embodiments with limited coded light bandwidth, the inclusion of the certificate with the timestamp may be more applicable to the second case where the hash is applied by the receiving camera rather than the first case where it is applied in the coded light signal.
Alternatively, in either case, the public key and/or certificate could be distributed by a separate means (e.g. via a network or memory stick).
In further embodiments, in addition to one or more of the above forms of automated verification by the verification module 23, the timestamp may optionally also be output to a user, allowing the user him- or herself to verify the time at which the image was captured. In this case, what is being verified is the user's own perception, impression, memory or belief of when the image was captured or purported or alleged to have been captured. For instance the user may recognize the event in the image but wish to have confirmed his or her memory of when the event took place. Or as another example, the user may be shown the image by another who reports the event to have taken place at a certain time, but the user wishes to check that other person's assertion.
For example as illustrated in Figure 6, the image processing module 22 may be configured to modify the actual image 26 - i.e. the actual pixels of the image - to incorporate an indication of the timestamp as a graphical element 28 in the image, and to store the image in this modified form. When the user retrieves this modified, stored image to view on screen, he or she will see the timestamp 28 integrated graphically into the pixels of the image (N.B. "graphical element" here does not necessarily exclude text included as image data as part of an image). The image processing module 22 may also write-protect or encrypt the modified image 26 to prevent tampering.
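A sketch of this Figure 6 variant, assuming the Pillow imaging library, is shown below; the position, color and text format of the graphical element 28 are arbitrary choices.

```python
from PIL import Image, ImageDraw

# Burn the decoded timestamp into the pixel data as a graphical element 28.
# Assumes the Pillow library; position, color and text are arbitrary choices.
def stamp_image(image: Image.Image, timestamp_text: str) -> Image.Image:
    stamped = image.copy()
    draw = ImageDraw.Draw(stamped)
    draw.text((10, 10), timestamp_text, fill=(255, 255, 0))   # default bitmap font
    return stamped

img = Image.new("RGB", (640, 480), color=(30, 30, 30))
stamp_image(img, "2015-04-28 12:00 UTC").save("stamped.png")
```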
Another example is based on the scenario shown in Figure 7. As described above, the image processing module 22 may be configured to store an indication of the timestamp in the metadata 34 of the image file 30. In the embodiment of Figure 6 it is the actual image data 32 which is modified to indicate the timestamp rather than the timestamp being included as metadata, but in the embodiment of Figure 7 the timestamp is indicated in the metadata 34. As well as being usable by the verification module 23, the timestamp in the metadata 34 can also be viewed by the user him- or herself, e.g. by viewing file details of the file 30 in a file system, operating system or browser on a computer.
As an addition to any of the above techniques, optionally the controller 10 of the light source 2 may also be configured to encode an indication of the light source's location into the light it emits. Thus the image processing module 22 at the receive side is further able to obtain, from the processing of the one or more images, information indicative of the location at which the one or more images were captured.
This information may take a number of possible forms. In embodiments, the location information may comprise geographic coordinates or a name of the location embedded directly into the light, e.g. the GPS coordinates of the light source 2 (the light source 2 may even be equipped with a GPS receiver for this purpose) or a name of the location that is meaningful to a human (e.g. name of a room, building, street, etc.). Alternatively, the location information embedded in the light may comprise an identifier of the light source 2 that is mapped to a location in a location database (not shown). In this case the image processing module is configured to look up the location from the location database based on said identifier.
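The two options can be sketched as follows; the database contents, the coordinates and the light-source ID are purely illustrative.

```python
# Sketch of the two ways of conveying location. The database contents, the
# coordinates and the light-source ID are purely illustrative.
LOCATION_DATABASE = {
    0x2A17: {"name": "Main entrance, Building A", "lat": 51.4416, "lon": 5.4697},
}

def resolve_location(decoded: dict) -> dict:
    # Option 1: coordinates or a location name encoded directly into the light.
    if "lat" in decoded or "name" in decoded:
        return decoded
    # Option 2: only a light-source identifier is encoded; look it up.
    return LOCATION_DATABASE[decoded["light_id"]]

print(resolve_location({"light_id": 0x2A17}))
```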
Thus, in accordance with everything discussed above, the disclosed system can provide a kind of "watermarking" which creates a proof of when and optionally where an image was captured, e.g. allowing a user to demonstrate "I was there".
In embodiments, any of the verification techniques discussed above in relation to the timestamp may be applied to this "location stamp". For example the captured image stored in the image file 30 may be modified to incorporate the location as a graphical element 28 analogous to Figure 6, and/or the location may be included in the metadata 34 of the stored image file 30 as in Figure 7. Alternatively or additionally, the information indicative of the location may be encrypted or cryptographically signed in the coded light signal, and/or encrypted or cryptographically signed at the camera, in similar manner to any of the options discussed above in relation to the timestamp.
Although the invention has primarily been presented with the verification of timestamps in mind, possibly in combination with location information, it may equally be advantageous to create a system wherein the coded light only comprises location information rather than both timestamp and location information. Such a system would provide a further ability to verify where the image/footage was recorded, instead of when it was recorded. Such a device would be similar to the claimed device in that it would be a device comprising an input for receiving images captured by a camera, of an environment illuminated by light from a visible light source; and an image processing module configured to process one or more of the images to decode data representing a location modulated into said light, and to output an indication of the decoded location for verification of a location where the one or more images were captured.
In yet further embodiments, any of the above techniques may be combined with a facial recognition algorithm or other image recognition algorithm in order to additionally identify a person (or even multiple people) in any of the one or more images from which the timestamp (and in embodiments location stamp) has been decoded - i.e. to detect the presence of a person (human) and to identify who that person is. For instance, if a face or part of a face can be discerned in one or more of the images, this may allow the facial recognition algorithm to identify the person to whom that face belongs. Or if the person is wearing a distinctive item such as a special badge, or distinctive clothing, optionally combined with other factors such as height, hair color, build, etc., it may be possible for an image recognition algorithm other than a facial recognition algorithm to identify the person (or both types of algorithm could be combined for a more robust identification). Whatever type of recognition algorithm is used, the person may be identified in terms of any suitable identifier, such as a personal name ("Dave Smith" or the like), or a nickname, or an email address, or a username of a certain communication system such as a username of a social networking system or VoIP system.
In embodiments, the facial and/or other image recognition algorithm may be combined into an app or other application along with the timestamp detection and location stamp detection, to log the fact that the identified person or persons were at a certain place at a certain time (and therefore enable a user to prove that fact). For example this could be used as part of one or more of: a location-based game (where a result in the game depends on the person or persons being at one or more specified locations at a specified time or times), a retail rewards scheme (where a reward to the user depends on the user having been at one or more specified locations at a specified time or times), a social networking system (where the user may wish to prove that his or her activity is real), and/or a celebrity spotting application (e.g. to automatically detect celebrities in images, or prove that the user was present in an image with the celebrity at a certain place and time). The application may log the fact of the identified person or persons being at the verified location at the verified time in a number of ways. For example this may be logged locally on a user device of the user, e.g. on a portable user device such as a smart phone, tablet or laptop comprising the camera 20, image processing module 22 and verification module 23 incorporated into that user device. This way, the user can show one or more others (e.g. friends, retail personnel, security personnel, etc.) that the identified person or persons were found at the time and place in question by showing them the logged result on a local user interface (e.g. screen) of the portable user device. As another example, the fact of the identified person or persons being at the verified location at the verified time could be logged on a server or other computer via a network. E.g. the user may submit a captured image from the camera 20 over a wide area network (WAN) such as the internet or a mobile cellular network (e.g. 3GPP network) to a server, at which the verification module 23 is implemented. E.g. this could be a server of a social networking system, or a server hosting a location-based game. This way the verification can be performed centrally and made available to one or more other users to see.
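Such a log entry might, for example, be structured as in the following sketch; the field names and the JSON serialization are illustrative assumptions, and the record could equally be stored locally or submitted to a server.

```python
import json
from dataclasses import dataclass, asdict

# Sketch of a "proof of being there" log record such an application might keep,
# locally or on a server. All field names are illustrative assumptions.
@dataclass
class PresenceRecord:
    person: str            # identity from the face / image recognition step
    location: str          # from the decoded location information
    posix_time: int        # from the decoded coded-light timestamp
    verified: bool         # outcome of the automated verification

record = PresenceRecord("Dave Smith", "Main entrance, Building A", 1431388800, True)
print(json.dumps(asdict(record)))   # could be stored locally or posted to a server
```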
There are various possible applications of the above techniques. For instance, the need for verification may exist in a game. Take for example a game in which players have to travel according to a predetermined itinerary (predetermined schedule for a predetermined journey, route or set of locations). Evidence that the players were indeed present at locations which are part of the itinerary at certain time slots can be verified by means of captured images. The images may be created by means of fixed cameras at the locations in question, or alternatively the images may be captured by means of mobile devices of the players. More and more mobiles with cameras are designed to take so-called "selfies" which contain the face of the person who took the image. The image which is captured by the player of the game at a certain location at a certain moment can then be verified automatically by a device or application according to the invention. This device or application could be implemented in the mobile user terminal which was used to capture the image (e.g. as part of a "selfie contest"). Alternatively the device or application may be (at least partly) implemented remotely. The device or application may then, e.g. shortly after the capturing of the image, verify whether the image was indeed made at the relevant location within a certain time slot based on knowledge of the codes in the emitted light of the light sources at that location. The device or application can compare the code which is embedded in the emitted light with codes which have been provided as part of the game. Preferably the device or application is also capable of facial recognition. By combining code verification and facial recognition the device can verify that the user who is recognized in the image was present at the location at the specific time.
As another example application, many people want to show to others (friends, family, etc.) that they have been at a certain location - not as part of a specific game, but just as part of the social interaction with the others. The above-disclosed techniques can be used to support this type of social interaction.
In another example, the code in the emitted light comprises or is based on a timestamp and is preferably also related to a kind of game. By "based on" here, it is meant that the code which is transmitted does not have to be a direct indication of the actual time. It also means that the signal in the emitted light is not constant over time, but rather fluctuates as a function of time. The game is based on collecting points by being present at a certain location at a certain moment. For instance in bars or pubs, there are spots which are typically occupied by the most senior guys or the alpha types. Showing by means of an image that the user (or another person) was at such a spot at a certain time can be important in the social game.
Another use of creating evidence that the user was at a certain location at a certain time may be found in retail. For instance, customers may receive discounts, or faster service, by providing evidence that they were in the shop or at another location. Again the image must be captured in the neighborhood of light sources emitting specific (time-based) codes or with the light sources even directly visible in the images.
Yet another possible application is to use facial recognition for identifying or authenticating an alleged identity of a further person (say a celebrity) in the same image or sequence of images, and automatically verifying that the image supports an assertion that the particular/multiple persons in question was/were present (together) at the same location at the same time.
It will be appreciated that the above description for clarity has described embodiments of the invention with reference to different functional units and/or processors. However, it will be apparent that any suitable distribution of functionality between different functional units or processors may be used without deviating from the invention. For example, functionality illustrated to be performed by separate units, processors or controllers may be performed by the same processor or controllers. Hence, references to specific functional units are only to be seen as references to suitable means for providing the described functionality rather than indicative of a strict logical or physical structure or organization. The invention can be implemented in any suitable form including hardware, software, firmware or any combination of these. It will be appreciated that the invention also applies to computer programs, particularly computer programs on or in a carrier, adapted to put the invention into practice. The program may be in the form of a source code, an object code, a code intermediate source and an object code such as in a partially compiled form, or in any other form suitable for use in the implementation of the method according to the invention.
Another embodiment relating to a computer program product comprises computer-executable instructions corresponding to each means of at least one of the systems and/or products set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically. When provided on a carrier, the carrier of a computer program may be any entity or device capable of carrying the program. For example, the carrier may include a storage medium, such as a ROM, for example, a CD ROM or a semiconductor ROM, or a magnetic recording medium, for example, a hard disk. Alternatively, the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted to perform, or used in the performance of, the relevant method.
It will be appreciated that the above embodiments have been described by way of example. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. A device comprising:
an input for receiving images captured by a camera (20), of an environment illuminated by light from a visible light source (2);
an image processing module (22) configured to process one or more of the images to decode a timestamp modulated into said light, and to output an indication of the decoded timestamp; and
a verification module (23) arranged to receive the indication of the decoded timestamp output by the image processing module (22), and based thereon to perform an at least partially automated verification of a time at which the one or more images were captured.
2. The device of claim 1, wherein the image processing module (22) is further configured to decode, from the processing of the one or more images, information modulated into the light indicative of a location of the light source or environment, and to output an indication of the location for verification of a location at which the one or more images were captured.
3. The device of claim 2, wherein said information comprises:
coordinates or a name of the location explicitly encoded into the light; or an identifier of the light source (2) that is mapped to a location in a location database, the image processing module being configured to look up the location from the location database based on said identifier.
4. The device of any preceding claim, wherein the image processing module (22) is further configured to apply a facial recognition and/or other image recognition algorithm to at least one of the captured images, and to thereby identify a person or persons appearing in the at least one image.
5. The device of any preceding claim, wherein the timestamp is cryptographically secured, and the image processing module (22) is configured to use a cryptographic key to access the timestamp.
6. The device of any preceding claim, wherein:
the image processing module (22) is configured to store the one or more images in at least one image file (30), wherein said outputting comprises storing the decoded timestamp in the image file as metadata (34) of the image file;
the image processing module (22) is configured to decode the timestamp again later from the one or more stored images; and
the verification module (23) is configured to perform said at-least-partially automated verification by comparing the later decoded timestamp with the stored timestamp in the metadata in order to verify the stored timestamp.
7. The device of claim 6 wherein the timestamp in the metadata or the metadata as a whole is cryptographically secured, and the verification module (23) is configured to use a cryptographic key to access the timestamp or metadata.
8. The device of any preceding claim, wherein the camera (20) adds its own timestamp to at least one of said one or more images, and the verification module (23) is configured to perform said at-least-partially automated verification by comparing the camera's own timestamp with the decoded timestamp in order to verify the camera's timestamp.
9. The device of any preceding claim, wherein the timestamp as modulated into said light is signed using a private key, and the verification module (23) is configured to perform said at-least-partially automated verification by verifying the decoded timestamp using a corresponding public key.
10. The device of any preceding claim, wherein the image processing module (22) is configured to remove the modulation from said one or more images once the timestamp has been decoded, stored or verified.
11. An image capturing system comprising:
the device of any preceding claim;
the light source (2), being arranged to emit light to illuminate said environment, and configured to modulate the light emitted by the light source with said timestamp indicating a current time value; and
the camera (20), being arranged to capture said one or more images of said environment.
12. A computer program product for processing images, the computer program product comprising code embodied on a computer-readable storage medium and configured so as when executed to perform operations of:
receiving images captured by a camera (20), of an environment illuminated by light from a light source (2); and
processing one or more of the images to decode a timestamp modulated into said light indicating a time at which the one or more images were captured; and
based on the timestamp as determined by said processing, performing an at least partially automated verification of the time at which the one or more images were captured.
13. The computer program product of claim 12, wherein the computer program product is further configured to:
decode, from the processing of the one or more images, information modulated into the light indicative of a location of the light source or environment, and to output an indication of the location for verification of a location at which the one or more images were captured;
apply a facial recognition and/or other image recognition algorithm to at least one of the captured images, and to thereby identify a person or persons appearing in the at least one image; and
log the fact that the identified person or the identified persons are verified to have been at said location at said time.
14. The computer program product of claim 13, wherein said logging comprises logging said fact as part of one or more of:
a location-based game,
a retail rewards scheme,
a social networking system, and/or
a celebrity spotting application.
15. A method comprising:
using light emitted from a light source (2) to illuminate an environment;
encoding a timestamp indicating a current time value into the light emitted by the light source;
using a camera (20) to capture one or more images of said environment;
processing at least one of the images to determine the timestamp encoded into said light; and
based on the timestamp as determined by said processing, performing an at least partially automated verification of a time at which the at least one image was captured.
PCT/EP2015/059116 2014-05-12 2015-04-28 Verification of images captured using a timestamp decoded from illumination from a modulated light source WO2015173001A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201580038006.0A CN106471795B (en) 2014-05-12 2015-04-28 Use the verifying from the illumination timestamp captured image decoded from modulated light source
JP2016567213A JP6545192B2 (en) 2014-05-12 2015-04-28 Verification of captured images using timestamps decoded from illumination from modulated light sources
RU2016148406A RU2016148406A (en) 2014-05-12 2015-04-28 CHECKING IMAGES CAPTURED USING A TEMPORARY LABEL DECODED FROM LIGHTING FROM A MODULATED LIGHT SOURCE
EP15717921.9A EP3143759A1 (en) 2014-05-12 2015-04-28 Verification of images captured using a timestamp decoded from illumination from a modulated light source
US15/311,160 US10074030B2 (en) 2014-05-12 2015-04-28 Verification of images captured using a timestamp decoded from illumination from a modulated light source

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP14167836.7 2014-05-12
EP14167836 2014-05-12

Publications (1)

Publication Number Publication Date
WO2015173001A1 true WO2015173001A1 (en) 2015-11-19

Family

ID=50687317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/059116 WO2015173001A1 (en) 2014-05-12 2015-04-28 Verification of images captured using a timestamp decoded from illumination from a modulated light source

Country Status (6)

Country Link
US (1) US10074030B2 (en)
EP (1) EP3143759A1 (en)
JP (1) JP6545192B2 (en)
CN (1) CN106471795B (en)
RU (1) RU2016148406A (en)
WO (1) WO2015173001A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3346442A1 (en) * 2017-01-10 2018-07-11 QuantifiCare S.A. Method and device for timestamping digital images
CN108388649A (en) * 2018-02-28 2018-08-10 深圳市科迈爱康科技有限公司 Handle method, system, equipment and the storage medium of audio and video
WO2018106619A3 (en) * 2016-12-06 2019-01-24 G Tort Selfee social media nan0-influencer tracking and reward system and method
CN109792472A (en) * 2016-10-12 2019-05-21 富士通株式会社 Signal adjustment programme, sig-nal-conditioning unit and signal adjusting method
US10541752B2 (en) 2016-09-22 2020-01-21 Signify Holding B.V. Associating a mobile device with a group

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10009100B2 (en) * 2014-06-18 2018-06-26 Qualcomm Incorporated Transmission of identifiers using visible light communication
JP7170278B2 (en) * 2017-05-17 2022-11-14 パナソニックIpマネジメント株式会社 Video recording device, video recording verification system, video recording method, and video verification method
WO2019168547A1 (en) * 2018-03-02 2019-09-06 Visa International Service Association Dynamic lighting for image-based verification processing
CN108694765A (en) * 2018-05-11 2018-10-23 京东方科技集团股份有限公司 A kind of visitor's recognition methods and device, access control system
CN108768525A (en) * 2018-07-16 2018-11-06 北京邮电大学 The visible light application device of data is obtained by camera scanning light source
CN112889350A (en) * 2018-09-18 2021-06-01 Ai智者有限公司 System and process for identification and illumination of anatomical parts of a person and of items at such parts
KR102565422B1 (en) * 2018-10-19 2023-08-09 라인 가부시키가이샤 Method, computer apparatus, and computer program for providing authentication data
FR3097708B1 (en) * 2019-06-19 2024-04-12 Oledcomm Face detection and wireless optical communication module
CN114003874A (en) * 2020-02-26 2022-02-01 支付宝(杭州)信息技术有限公司 Authenticity verification method, device and equipment for image data
CN111835419B (en) * 2020-07-14 2021-08-27 长安大学 Data secret transmission method for visible light communication of CMOS camera
CN111935102B (en) * 2020-07-16 2022-08-05 重庆京像微电子有限公司 Data processing method and device
CN115708116A (en) * 2021-08-10 2023-02-21 瑞庭网络技术(上海)有限公司 Video verification method, device and system, electronic equipment and storage medium
US20230328312A1 (en) * 2022-04-11 2023-10-12 Volvo Truck Corporation Compressed timestamp in a videostream

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003009217A1 (en) * 2001-07-16 2003-01-30 Trustcopy Pte Ltd. Electronic signing of documents
US20030204738A1 (en) * 2002-04-30 2003-10-30 Morgan Stephen Paul System and method for secure distribution of digital content via a network
US20040125208A1 (en) * 2002-09-30 2004-07-01 Malone Michael F. Forensic communication apparatus and method
EP1460841A1 (en) * 2003-03-19 2004-09-22 Koninklijke Philips Electronics N.V. Method of preventing making of unauthorized recordings
US20080129758A1 (en) * 2002-10-02 2008-06-05 Harry Fox Method and system for utilizing a JPEG compatible image and icon
US20080233984A1 (en) * 2007-03-23 2008-09-25 Jeffrey Michael Franklin Cross-carrier content upload, social network and promotional platform
US20080310850A1 (en) * 2000-11-15 2008-12-18 Federal Law Enforcement Development Services, Inc. Led light communication system
US20090067667A1 (en) * 2007-09-12 2009-03-12 General Electric Company Method and system for image integrity determination
US20090184982A1 (en) * 2008-01-17 2009-07-23 Sony Corporation Program, image data processing method, and image data processing apparatus
US20090280838A1 (en) * 2008-05-09 2009-11-12 Ajay Malik Device and Method for Adding Location Data to Images
US20090284366A1 (en) * 2008-05-14 2009-11-19 Sony Ericsson Mobile Communications Ab System and method for determining positioning information via modulated light
GB2462163A (en) * 2008-06-12 2010-02-03 Redzebra Mobile Ltd Digital Camera with automatic image and metadata encryption.
US20100241723A1 (en) * 2009-03-17 2010-09-23 Hugh Dornbush Computer-Implemented Delivery of Real-Time Participatory Experience of Localized Events
US20110288917A1 (en) * 2010-05-21 2011-11-24 James Wanek Systems and methods for providing mobile targeted advertisements
EP2503852A1 (en) * 2011-03-22 2012-09-26 Koninklijke Philips Electronics N.V. Light detection system and method
US20120249586A1 (en) * 2011-03-31 2012-10-04 Nokia Corporation Method and apparatus for providing collaboration between remote and on-site users of indirect augmented reality
US20120310720A1 (en) * 2011-03-31 2012-12-06 Nokia Corporation Method and apparatus for processing coupons/purchases based on radio frequency memory tag detection
WO2013108166A1 (en) * 2012-01-17 2013-07-25 Koninklijke Philips N.V. Visible light communications using a remote control
US20140006162A1 (en) * 2012-06-30 2014-01-02 Empire Technology Development, Llc Profile construction based on asserted interest and actual participation in associated activities

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60144222D1 (en) 2000-01-13 2011-04-28 Digimarc Corp AUTHENTICATIVE METADATA AND EMBEDDING OF METADATA IN WATERMARK OF MEDIA SIGNALS
JP3575679B2 (en) * 2000-03-31 2004-10-13 日本電気株式会社 Face matching method, recording medium storing the matching method, and face matching device
DE10143518A1 (en) 2001-09-05 2003-03-27 Siemens Ag Device for correcting varying light intensity caused by network voltage with artificial sources of light while taking pictures with a camera has a light-sensitive sensor to convert light intensity into an electric signal.
CN100555914C (en) * 2004-05-31 2009-10-28 卡西欧计算机株式会社 Information receiver, the information transmission system and message receiving method
JP4797415B2 (en) * 2005-03-25 2011-10-19 株式会社ニコン Illumination device, photographing device, and photographing system
US9100124B2 (en) * 2007-05-24 2015-08-04 Federal Law Enforcement Development Services, Inc. LED Light Fixture
US9258864B2 (en) * 2007-05-24 2016-02-09 Federal Law Enforcement Development Services, Inc. LED light control and management system
US7866557B2 (en) * 2007-09-27 2011-01-11 Symbol Technologies, Inc. Imaging-based bar code reader utilizing modified rolling shutter operation
EP2643659B1 (en) 2010-11-19 2019-12-25 Apple Inc. Depth mapping using time-coded illumination
US8538596B2 (en) * 2010-12-20 2013-09-17 Redwood Systems, Inc. Light timeout optimization
JP2012195762A (en) * 2011-03-16 2012-10-11 Seiwa Electric Mfg Co Ltd Movement line information acquisition system
US9287976B2 (en) * 2011-07-26 2016-03-15 Abl Ip Holding Llc Independent beacon based light position system
US10070100B2 (en) 2011-09-22 2018-09-04 Philips Lighting Holding B.V. Imaging service using outdoor lighting networks
WO2013072826A1 (en) 2011-11-15 2013-05-23 Koninklijke Philips Electronics N.V. Coded light transmission and reception for light scene creation
US9264627B2 (en) * 2012-01-09 2016-02-16 Lifetouch Inc. Video photography system
US9768958B2 (en) * 2012-05-07 2017-09-19 Kuang-Chi Innovative Technology Ltd. Visible-light communication-based encryption, decryption and encryption/decryption method and system
US9453904B2 (en) * 2013-07-18 2016-09-27 Golba Llc Hybrid multi-camera based positioning
US9646384B2 (en) * 2013-09-11 2017-05-09 Google Technology Holdings LLC 3D feature descriptors with camera pose information
US9496955B2 (en) * 2013-09-19 2016-11-15 eocys, LLC Devices and methods to produce and receive an encoded light signature
US9554744B2 (en) * 2013-12-19 2017-01-31 International Business Machines Corporation Mining social media for ultraviolet light exposure analysis
US9240839B2 (en) * 2014-01-07 2016-01-19 Nokia Corporation Transmitting data to a rolling shutter sensor array via a light emitter array
US9542390B2 (en) * 2014-04-30 2017-01-10 Adobe Systems Incorporated Method and apparatus for mitigating face aging errors when performing facial recognition

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080310850A1 (en) * 2000-11-15 2008-12-18 Federal Law Enforcement Development Services, Inc. Led light communication system
WO2003009217A1 (en) * 2001-07-16 2003-01-30 Trustcopy Pte Ltd. Electronic signing of documents
US20030204738A1 (en) * 2002-04-30 2003-10-30 Morgan Stephen Paul System and method for secure distribution of digital content via a network
US20040125208A1 (en) * 2002-09-30 2004-07-01 Malone Michael F. Forensic communication apparatus and method
US20080129758A1 (en) * 2002-10-02 2008-06-05 Harry Fox Method and system for utilizing a JPEG compatible image and icon
EP1460841A1 (en) * 2003-03-19 2004-09-22 Koninklijke Philips Electronics N.V. Method of preventing making of unauthorized recordings
US20080233984A1 (en) * 2007-03-23 2008-09-25 Jeffrey Michael Franklin Cross-carrier content upload, social network and promotional platform
US20090067667A1 (en) * 2007-09-12 2009-03-12 General Electric Company Method and system for image integrity determination
US20090184982A1 (en) * 2008-01-17 2009-07-23 Sony Corporation Program, image data processing method, and image data processing apparatus
US20090280838A1 (en) * 2008-05-09 2009-11-12 Ajay Malik Device and Method for Adding Location Data to Images
US20090284366A1 (en) * 2008-05-14 2009-11-19 Sony Ericsson Mobile Communications Ab System and method for determining positioning information via modulated light
GB2462163A (en) * 2008-06-12 2010-02-03 Redzebra Mobile Ltd Digital camera with automatic image and metadata encryption
US20100241723A1 (en) * 2009-03-17 2010-09-23 Hugh Dornbush Computer-Implemented Delivery of Real-Time Participatory Experience of Localized Events
US20110288917A1 (en) * 2010-05-21 2011-11-24 James Wanek Systems and methods for providing mobile targeted advertisements
EP2503852A1 (en) * 2011-03-22 2012-09-26 Koninklijke Philips Electronics N.V. Light detection system and method
US20120249586A1 (en) * 2011-03-31 2012-10-04 Nokia Corporation Method and apparatus for providing collaboration between remote and on-site users of indirect augmented reality
US20120310720A1 (en) * 2011-03-31 2012-12-06 Nokia Corporation Method and apparatus for processing coupons/purchases based on radio frequency memory tag detection
WO2013108166A1 (en) * 2012-01-17 2013-07-25 Koninklijke Philips N.V. Visible light communications using a remote control
US20140006162A1 (en) * 2012-06-30 2014-01-02 Empire Technology Development, Llc Profile construction based on asserted interest and actual participation in associated activities

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3143759A1 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10541752B2 (en) 2016-09-22 2020-01-21 Signify Holding B.V. Associating a mobile device with a group
CN109792472A (en) * 2016-10-12 2019-05-21 富士通株式会社 Signal adjustment program, signal adjustment device, and signal adjustment method
CN109792472B (en) * 2016-10-12 2020-11-03 富士通株式会社 Signal adjustment program, signal adjustment device, and signal adjustment method
WO2018106619A3 (en) * 2016-12-06 2019-01-24 G Tort Selfee social media nano-influencer tracking and reward system and method
EP3346442A1 (en) * 2017-01-10 2018-07-11 QuantifiCare S.A. Method and device for timestamping digital images
FR3061792A1 (en) * 2017-01-10 2018-07-13 Quantificare METHOD AND DEVICE FOR TIMESTAMPING DIGITAL IMAGES
CN108388649A (en) * 2018-02-28 2018-08-10 深圳市科迈爱康科技有限公司 Method, system, device and storage medium for processing audio and video
CN108388649B (en) * 2018-02-28 2021-06-22 深圳市科迈爱康科技有限公司 Method, system, device and storage medium for processing audio and video

Also Published As

Publication number Publication date
EP3143759A1 (en) 2017-03-22
JP6545192B2 (en) 2019-07-17
CN106471795A (en) 2017-03-01
CN106471795B (en) 2019-03-12
JP2017526195A (en) 2017-09-07
US20170124421A1 (en) 2017-05-04
RU2016148406A (en) 2018-06-15
US10074030B2 (en) 2018-09-11

Similar Documents

Publication Publication Date Title
US10074030B2 (en) Verification of images captured using a timestamp decoded from illumination from a modulated light source
US10361780B2 (en) Information processing program, reception program, and information processing apparatus
US10142020B2 (en) Reproduction method for reproducing contents
US9871587B2 (en) Information processing method for generating encoded signal for visible light communication
Li et al. HiLight: Hiding bits in pixel translucency changes
CN106792267B (en) System and method for marking and identifying the authenticity of picture and video information
JP2017526195A5 (en)
WO2013147908A1 (en) Methods and systems for cryptographic access control of video
JP7170278B2 (en) Video recording device, video recording verification system, video recording method, and video verification method
WO2023004159A1 (en) Systems and methods employing scene embedded markers for verifying media
KR20130114037A (en) Masking and recovering method of privacy region
CN107241585B (en) Video monitoring method and system
CN109644187B (en) Optical label anti-counterfeiting method and system based on active response
WO2018041131A1 (en) Anti-counterfeiting method and system for optical communication device
US12010320B2 (en) Encoding of modified video
CN110335433A (en) Management method, device, medium and equipment applied to a data center
EP3217644B1 (en) Information processing device
WO2018019553A1 (en) Monitoring an area using illumination
Abe et al. Screen-camera communication via matrix barcode utilizing imperceptible color vibration
WO2019120050A1 (en) Optical communication apparatus and corresponding anti-counterfeiting method and system
WO2019120049A1 (en) Optical communication apparatus and corresponding anti-counterfeiting method and system
JP2015192264A (en) Device, method and system for information notification, and computer program
SG177037A1 (en) System for real-time information transfer from movie to remote device
Bo Privacy preservation in mobile computing and networking: Accessing, sharing and broadcasting
WO2016135445A1 (en) Method and apparatus for data verification in a mixed reality system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15717921

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016567213

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2015717921

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015717921

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 15311160

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2016148406

Country of ref document: RU

Kind code of ref document: A