WO2018063532A1 - Data processing and authentication of light communication sources - Google Patents

Data processing and authentication of light communication sources

Info

Publication number
WO2018063532A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
modulated light
source
image
image data
Application number
PCT/US2017/046611
Other languages
French (fr)
Inventor
Richard D. Roberts
Parmoon SEDDIGHRAD
Javier Perez-Ramirez
Roni Abiri
Original Assignee
Intel Corporation
Application filed by Intel Corporation
Priority to EP17856991.9A (EP3520251B1)
Priority to KR1020197005012A (KR102488828B1)
Priority to CN201780053468.9A (CN109644052B)
Priority to JP2019507929A (JP7172004B2)
Publication of WO2018063532A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B10/00 Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/80 Optical aspects relating to the use of optical transmission for specific applications, not provided for in groups H04B10/03 - H04B10/70, e.g. optical power feeding or optical transmission through water
    • H04B10/85 Protection from unauthorised access, e.g. eavesdrop protection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B10/00 Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11 Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114 Indoor or close-range type systems
    • H04B10/116 Visible light communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06 Authentication
    • H04W12/069 Authentication using certificates or pre-shared keys
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing

Definitions

  • Embodiments described herein generally relate to processing techniques of data from light communication sources, and in particular, to the use of authentication and data interpretation techniques for data obtained from visible light via optical camera communication sources.
  • Visible light communications are embodied in a variety of emerging wireless communication techniques, such as in communications techniques that utilize light sources such as light-emitting diode (LED) signage and LED lamps to broadcast messages.
  • a variety of applications have been proposed in the area of visible light communication, including for specialized deployments of wireless data networks that serve as a high-speed link for a last mile transmission of a network connection.
  • the brightness of the light source is modulated faster than the human eye may observe, allowing a light source to transmit messages without a perceivable flicker.
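As a hedged illustration of how a light source can signal data without visible flicker, the following Python sketch toggles an LED using simple on-off keying at a symbol rate well above the flicker-fusion threshold of the human eye. The symbol rate, the set_led() driver stub, and the example bit string are assumptions for illustration only; they are not taken from the patent or from any particular signage product.

```python
import time

# Hypothetical sketch: on-off keying (OOK) of an LED at a rate well above the
# human flicker-fusion threshold (~100 Hz), so the data is invisible to the eye
# but recoverable by a camera sampling the scene. set_led() stands in for
# whatever LED driver the signage actually uses.

SYMBOL_RATE_HZ = 1000  # assumed symbol rate; far faster than the eye can follow


def set_led(on: bool) -> None:
    """Stub for an LED driver; a real transmitter would toggle the LED hardware."""
    pass


def transmit_bits(bits: str) -> None:
    """Send a bit string as OOK symbols: '1' turns the LED on, '0' turns it off."""
    symbol_period = 1.0 / SYMBOL_RATE_HZ
    for bit in bits:
        set_led(bit == "1")
        time.sleep(symbol_period)
    set_led(True)  # return to steady illumination between messages


if __name__ == "__main__":
    transmit_bits("101011100010")  # an arbitrary example payload
```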
  • optical camera communications also known as “CamCom”
  • CamCom uses an image sensor within a camera for receiving and processing visible (human- or camera-visible) light data.
  • One proposal for the standardization of optical camera communications is currently being developed by the Short-Range Optical Wireless Communications Task Group for a revision of the IEEE 802.15.7-2011 specification.
  • this task group is developing enhanced standards for the use of optical camera communications to enable scalable data rate, positioning/localization, and message broadcasting, using optical devices such as a flash, display, and image sensor as a transmitting or receiving device.
  • FIG. 1 illustrates an operational environment for processing and authenticating light communication sources with components of a motor vehicle, according to an example
  • FIG. 2A illustrates a stylized representation of a camera-captured scene observed from a motor vehicle, indicating multiple light communication sources, according to an example
  • FIG. 2B illustrates a stylized representation of a camera-captured scene observed from a motor vehicle, indicating an authentication of a particular light communication source from among multiple light communication sources, according to an example
  • FIG. 3 illustrates a stylized representation of a camera-captured scene from a motor vehicle, indicating an authentication of multiple light communication sources in a restricted field of view, according to an example
  • FIG. 4 illustrates a sequence diagram of operations for selecting and interpreting optically communicated data among components of an optical camera communications system, according to an example
  • FIG. 5 is a flowchart illustrating a method of obtaining and processing modulated light data in an optical camera communications system using a user authentication technique, according to an example
  • FIG. 6 is a flowchart illustrating a method of obtaining and processing modulated light data in an optical camera communications system using an automatic authentication technique, according to an example;
  • FIG. 7 illustrates a block diagram of components in an example system for processing and authenticating modulated light data using optical camera communications, according to an example;
  • FIG. 8 illustrates a block diagram for an example electronic processing system architecture upon which any one or more of the techniques (e.g., operations, processes, methods, and methodologies) discussed herein may be performed, according to an example.
  • these techniques relate to optical camera communications, which utilize light emitting objects such as LED signage and LED lights to output (transmit) data to be captured (received) via an image sensor in a camera.
  • techniques for processing image data that includes modulated light data, and for authenticating the source of the modulated light data from the image data, are disclosed herein.
  • Authentication refers to providing or determining a proof of identity before a data source associates with (e.g., provides data to) a data sink.
  • authentication frame exchanges are used to ensure that a station has the correct authentication information (e.g., a pre-shared WEP/WPA encryption key) before being able to establish a connection with the wireless network.
  • the assumption is that if the encryption key is known, then the station is authorized to associate with the network.
  • authentication is performed at a lower layer of processing, by visually identifying a data source in image data to confirm that the data sink desires to receive data from the visually observed data source.
  • the identification of a desired data source may be used to locate, select, access, and process modulated light data from a desired light emitting object, while disregarding modulated light data detected from other light emitting objects. Thus, light sources that are not authenticated may be ignored and disregarded, preventing the use of unknown, unwanted, unverified, unauthorized, or rogue data.
  • optical camera communication authentication techniques may include the identification and selection of a modulated light data source, performed using either human input or automated object recognition upon image data of the light emitting object.
  • the use of image data for authentication enables proper verification of modulated light data from the desired source, because the image data obtained by a camera sensor captures light to visually recognize the object as it also captures the light used to transmit the modulated data. Accordingly, the optical camera communication authentication techniques described herein tie the received data to a visually verified source object. A simple sketch of this source-filtering idea follows.
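The sketch below is illustrative only, not the patent's implementation: the DetectedSource record, the ASCII payload format, and the decoding helper are all assumptions. It simply shows the policy of decoding only authenticated sources and disregarding everything else.

```python
from dataclasses import dataclass, field

# Illustrative sketch: keep a registry of detected modulated-light sources and
# decode payloads only from those marked as authenticated, discarding data
# from unknown, unwanted, or rogue emitters.


@dataclass
class DetectedSource:
    source_id: str
    bounding_box: tuple  # (x, y, width, height) in image coordinates
    raw_symbols: list = field(default_factory=list)
    authenticated: bool = False


def decode_symbols(symbols):
    """Placeholder demodulation: treat symbols as bits of ASCII characters."""
    bits = "".join(str(int(s)) for s in symbols)
    chars = [chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits) - 7, 8)]
    return "".join(chars)


def process_sources(sources):
    """Decode only authenticated sources; ignore everything else."""
    decoded = {}
    for src in sources:
        if not src.authenticated:
            continue  # unauthenticated emitters are simply disregarded
        decoded[src.source_id] = decode_symbols(src.raw_symbols)
    return decoded


if __name__ == "__main__":
    signs = [
        DetectedSource("ice_cream_sign", (40, 60, 50, 30), [0, 1, 0, 0, 1, 0, 0, 0], True),
        DetectedSource("unknown_emitter", (200, 10, 20, 20), [1] * 8, False),
    ]
    print(process_sources(signs))  # only the authenticated sign is decoded
```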
  • FIG. 1 illustrates an example operational environment for processing and authenticating light communication sources with components of a motor vehicle.
  • the following examples of FIGS. 1 to 3 specifically describe use cases involving the capture of image data and modulated light data from a camera positioned at the perspective of a motor vehicle occupant, such as may occur when the occupant operates the motor vehicle on a roadway.
  • the integration of the following example features may be provided in a motor vehicle with a factory-integrated telematics and infotainment system, or with an add-on telematics and infotainment device.
  • optical camera communication authentication features may also be applicable to other forms of mobile computing devices that operate independently from a motor vehicle, such as with image and data processing capability provided in smartphones, wearable devices, tablets, portable personal computers, and like user-interactive/client devices embedded in other operational systems.
  • a motor vehicle 110 includes a camera device 112, which is positioned outward facing with respect to the motor vehicle 110 and the surrounding environment to detect and capture a scene in a field of view.
  • the camera device 112 is shown as obtaining an optical image of the field of view from the forward direction of the motor vehicle 110, which includes visible light communication 120 being transmitted to the motor vehicle 110 from a light emitting object (such as LED signage).
  • the lights in the light emitting object are modulated rapidly to indicate data in a fashion that the human eye typically cannot see or observe (e.g., with rapidly blinking lights that are not perceivable to a human).
  • the camera device 112 includes at least one sensor to capture image data of the scene, and the camera device 112 may include or be operably coupled to processing circuitry to detect that at least one light of the light emitting object is modulated with data (e.g., is emitting the visible light communication 120).
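One plausible (and purely illustrative) way such a detection step could work is to compare per-pixel brightness variation across a short burst of frames: a steadily lit object stays nearly constant, while a data-modulated LED varies from frame to frame. The thresholds and the synthetic test data below are assumptions, not values from the patent.

```python
import numpy as np

# Rough sketch of a "is this light modulated?" check: bright pixels whose
# intensity varies strongly over a short burst of frames are flagged as
# candidate modulated-light regions.


def find_modulated_regions(frames, variation_threshold=20.0, brightness_threshold=180):
    """frames: array of shape (num_frames, height, width) of grayscale values."""
    stack = np.asarray(frames, dtype=np.float32)
    mean_brightness = stack.mean(axis=0)
    temporal_std = stack.std(axis=0)
    # A candidate modulated-light pixel is both bright and strongly varying.
    return np.logical_and(mean_brightness > brightness_threshold,
                          temporal_std > variation_threshold)


if __name__ == "__main__":
    # Synthetic example: a 20-frame burst with one blinking 10x10 patch.
    frames = np.full((20, 100, 100), 120, dtype=np.float32)
    frames[::2, 40:50, 40:50] = 255   # "on" frames
    frames[1::2, 40:50, 40:50] = 200  # "off" frames (still bright, but dimmer)
    mask = find_modulated_regions(frames)
    print("modulated pixels detected:", int(mask.sum()))
```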
  • the motor vehicle 110 includes a number of processing components 130 to obtain, process, and evaluate a scene in the field of view observed in front of the motor vehicle.
  • processing capabilities operate to capture image data for real-world objects (such as still RGB images of the LED signage) and the modulated light data provided in the visible light communication 120 (such as the modulated light data provided from operation of the LED signage).
  • the processing components 130 may include: a camera sensor 132 (e.g., CMOS/CCD sensor) to capture image data of a scene; camera data processing components 134 (e.g., implemented with programmed circuitry) to process, store, and extract data from the captured image data; and visible light communication processing components 136 (e.g., implemented with programmed circuitry) to detect and process the modulated light data of the visible light communication 120.
  • the processing components 130 may also include: authentication data processing components 138 (e.g., implemented with programmed circuitry) to implement user-interactive or automated authentication of light modulation data from a light emitting source (an object); user interface display processing components 140 (e.g., implemented with programmed circuitry) to receive user-interactive controls, including the generation of an augmented display of the image data; and an interactive display unit 142 (e.g., a touchscreen display hardware) to output a display of the image data and receive user input and commands for the display of the image data.
  • the processing components 130 or another component integrated with the motor vehicle 110 may also be used to access an external network source 150 (e.g., via the Internet), to obtain supplemental data 160 for use in the authentication or processing of data with the visible light communication 120.
  • the external network source 150 may provide a network-connected data processing server 152 (e.g., a web server) and data-hosting system 154 (e.g., a database) to serve the supplemental data in response to a request or a query from the processing components 130.
  • the visible light communication 120 may include data indicating a uniform resource locator (URL) of the external network source 150, with the data processing server 152 and data-hosting system 154 adapted to serve the supplemental data 160 in response to the request or query.
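The supplemental-data step can be sketched as follows, assuming the demodulated payload carries a plain URL. The example address is a placeholder, and a real system would validate the URL and the transport before fetching anything.

```python
import urllib.request

# Hedged sketch of the supplemental-data step: once the demodulated payload is
# found to contain a URL, the processing components can query that external
# network source over an ordinary data connection.


def fetch_supplemental_data(payload: str, timeout_s: float = 5.0) -> bytes:
    """If the decoded payload looks like a URL, request the supplemental data."""
    if not payload.startswith(("http://", "https://")):
        raise ValueError("payload does not carry a supplemental-data URL")
    with urllib.request.urlopen(payload, timeout=timeout_s) as response:
        return response.read()


# Example usage (placeholder address, not from the patent):
# menu_bytes = fetch_supplemental_data("https://example.com/ice-cream-menu.json")
```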
  • FIG. 2A illustrates a stylized representation 200A of an example camera-captured scene observed from a motor vehicle, indicating multiple light communication sources.
  • the stylized representation 200A illustrates an output of image data including an image of three illuminated signs in a real-world environment: an ice cream shop sign 202, a coffee shop sign 204, and a traffic sign 206.
  • Each illuminated sign includes LEDs that modulate light data in a specific pattern, to send respective sets of visible light communication data to be received and demodulated via a camera.
  • each of the three illuminated signs 202, 204, 206 provides light output that is modulated in a pattern to signal data.
  • a signal processor associated with the camera determines (e.g., locates, observes) which objects in the captured scene are transmitting optical camera communication data. This is important because in some examples only a few of the available LED light emitters in the scene actually transmit useable data.
  • the identification of a light emitting source is performed using a specialized camera communications waveform, such as with use of a start frame delimiter.
  • the lights may use a specialized signaling output to indicate that they are a modulated light data source.
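As a rough illustration of the start frame delimiter idea, the sketch below scans a demodulated bit stream for an assumed delimiter pattern and extracts fixed-length payloads that follow it. Both the delimiter value and the payload length are invented for this example; the actual waveform is defined by the relevant optical camera communication specification, not by this snippet.

```python
# Sketch of locating a start frame delimiter (SFD) in a demodulated bit stream.

SFD = "10101011"   # assumed delimiter pattern
PAYLOAD_BITS = 32  # assumed fixed payload length following the SFD


def extract_frames(bitstream: str):
    """Yield payload bit strings that follow each detected SFD."""
    index = bitstream.find(SFD)
    while index != -1:
        start = index + len(SFD)
        payload = bitstream[start:start + PAYLOAD_BITS]
        if len(payload) == PAYLOAD_BITS:
            yield payload
        index = bitstream.find(SFD, start + PAYLOAD_BITS)


if __name__ == "__main__":
    noise = "0110010"
    frame = SFD + "01000001" * 4  # payload spelling 'AAAA' in ASCII bits
    print(list(extract_frames(noise + frame + noise)))
```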
  • a signal processor associated with the camera identifies available light sources that are transmitting (e.g., broadcasting) data to an available observer.
  • a manual or automated authentication process may be performed to select data from an available (identified) light source.
  • the image processor may generate a solid box (e.g., a colored box) around each light source (e.g., signs 202, 204, 206) that is transmitting modulated data.
  • the image processor provides this indication as an overlay on the image data to highlight or emphasize real-world locations of an identified modulated light data source.
  • the identification operates to highlight or mark an available data source to a human user or to an automated mechanism (with such automated mechanisms including an image recognition technique or image processing algorithm). Other methods and mechanisms for marking, listing, or identifying light emitting sources may also be utilized.
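A minimal sketch of the overlay step, assuming OpenCV is available for drawing, is shown below. The bounding boxes, labels, and colors are placeholders; the patent does not prescribe a particular drawing library.

```python
import numpy as np
import cv2  # OpenCV is one convenient choice for drawing overlays; an assumption here

# Illustrative overlay step: draw a colored box (and a small label) around each
# location identified as a modulated-light data source, so a human user or a
# downstream recognizer can see which objects are transmitting.


def draw_source_overlays(image_bgr, sources):
    """sources: iterable of (label, (x, y, width, height)) in pixel coordinates."""
    annotated = image_bgr.copy()
    for label, (x, y, w, h) in sources:
        cv2.rectangle(annotated, (x, y), (x + w, y + h), color=(0, 255, 0), thickness=2)
        cv2.putText(annotated, label, (x, max(y - 5, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    return annotated


if __name__ == "__main__":
    frame = np.zeros((240, 320, 3), dtype=np.uint8)
    boxes = [("sign 202", (30, 60, 50, 40)), ("sign 204", (120, 70, 60, 40))]
    overlay = draw_source_overlays(frame, boxes)
    print(overlay.shape)
```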
  • the information being sent by the modulated light data may include encoded information in the form of graphical, textual, or other software-interpretable content.
  • the information being sent by the modulated light data also may include a URL address that will be used by a processing system to access supplemental data (e.g., via a radio access network such as Wi-Fi or a 3G/4G data connection).
  • the stylized representation 200A may be updated to display the graphical, textual, or software-interpreted content.
  • FIG. 2B illustrates a stylized representation 200B of an example camera-captured scene observed from a motor vehicle, indicating an authentication of a particular light communication source from among multiple light communication sources.
  • the contextual menu 212 is provided as a message overlaid on the display output, in the form of an augmented reality output, next to the image display of the ice cream shop sign 202.
  • FIG. 2B thus illustrates an output on a graphical display, in the form of an overlay of content, which is output in response to authentication of the particular light communication source (the ice cream shop sign 202) and the processing of the information from this particular light communication source.
  • authentication of the light communication source may occur using a manual, user-initiated process; in another example, authentication of the light communication source may occur using an automated process. After authentication is conducted, the image processing algorithms are then authorized to ingest data from the selected light source.
  • a human user may provide an indication, such as through an input into a graphical user interface, to indicate which data source the user wishes to authenticate with and download data from.
  • the user may provide touch input 220 at a representation of the light emitting source (the display of the ice cream shop sign 202) to trigger a user interface command for authentication, as shown in the stylized representation 200B.
  • the modulated light data from the ice cream shop sign 202 may be parsed and interpreted, to obtain content.
  • a set of content to populate an available contextual menu (a food menu) of the ice cream shop establishment is received from optical camera communications, and is overlaid on the image data (as a contextual message 212) next to the representation of the object that transmitted the data.
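The manual authentication step can be sketched as a simple hit test: the touch coordinates reported by the display are compared against the bounding boxes of the identified sources, and the match (if any) becomes the authenticated source whose data is then decoded. The data layout mirrors the earlier sketches and is an assumption, not the patent's structure.

```python
# Sketch of mapping a touch location to an identified modulated-light source.


def source_at_touch(touch_xy, sources):
    """Return the first source whose bounding box contains the touch point."""
    tx, ty = touch_xy
    for label, (x, y, w, h) in sources:
        if x <= tx <= x + w and y <= ty <= y + h:
            return label
    return None


if __name__ == "__main__":
    boxes = [("ice cream sign", (30, 60, 50, 40)), ("coffee sign", (120, 70, 60, 40))]
    selected = source_at_touch((45, 80), boxes)
    print("authenticated source:", selected)  # prints: ice cream sign
```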
  • the content obtained from a light emitting source may be displayed and overlaid to a user in the form of augmented reality in the stylized representation 200B; it will be understood that the content obtained from the light emitting source may be output with other types of devices and output formats in response to authentication.
  • the authentication may be automatically conducted to access and parse data from a particular data source.
  • Such automatic authentication may occur through an image recognition algorithm that selects the data source for the user, on the basis of the shape, classification, characteristics, or identification of an object or type of object (such as a particular sign, type of business associated with the sign, etc.)
  • image recognition algorithms may be used to only allow data to be downloaded and processed from objects that are previously known, such as a pedestrian control light or a traffic signal.
  • an automatic mode to authenticate with and process data from all identified sources (referred to as a “promiscuous mode”) may be used to obtain a larger set of data from available sources.
  • the selection of data from all available sources may be further limited based on the location of the objects in the field of view (such as is further described below with reference to FIG. 3).
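A hedged sketch of such an automatic-authentication policy is shown below: data is accepted only from object classes that are previously known, unless a promiscuous mode is enabled, in which case every identified source is accepted. The class names and the classify_object() stub are assumptions standing in for a real image-recognition step.

```python
# Sketch of an automatic-authentication policy based on recognized object type.

KNOWN_CLASSES = {"traffic_signal", "pedestrian_control_light"}


def classify_object(image_patch):
    """Stub for an image-recognition step; a real system would run a classifier."""
    return "unknown"


def auto_authenticate(sources, promiscuous=False):
    """sources: iterable of (source_id, image_patch). Returns authenticated ids."""
    if promiscuous:
        return [source_id for source_id, _ in sources]
    return [source_id for source_id, patch in sources
            if classify_object(patch) in KNOWN_CLASSES]


if __name__ == "__main__":
    print(auto_authenticate([("sign_202", None)], promiscuous=True))
```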
  • the type, format, or characteristics of the content that is overlaid in a graphical display may be adapted based on the perspective of the field of view captured by an image. This change to the graphical display may occur when the size and observable characteristics of respective light sources vary, especially when the image of the scene is captured from various distances.
  • the generation of the overlaid content for graphical display may be adapted to handle scenarios where a light emitting object such as signage is in the field of view but is mixed with other light sources (e.g., when observed at a long distance); when a light emitting object such as signage is visible and separated from other objects in the field of view (e.g., as depicted in FIGS. 2A and 2B); or when a light emitting object such as signage is only partially visible in the captured field of view (e.g., when observed at a close distance).
  • an image of a scene may depict multiple light sources to be overlapping and concentrated in an area of the image.
  • the modulated light data may be detected and processed from these different sources, however, if the respective lights are distinguishable and separated from one another in the field of view.
  • the graphical display may provide alternative graphics, a listing of detected light sources, contextual menus, and other forms of augmented views to allow obscured light sources and objects to be identified and distinguished.
  • FIG. 3 illustrates a stylized representation 300 of a camera-captured scene from a motor vehicle, indicating an example of authentication of multiple light communication sources in a restricted field of view.
  • FIG. 3 specifically illustrates the results of an approach in which only light sources in roughly the same plane as the camera are automatically authenticated, and in which other lights are ignored for authentication.
  • the stylized representation 300 depicts the selection of desired sources based upon the elevation angle of a camera field of view, as shown in respective areas of view 310, 320, 330.
  • a first area of view 310 is adapted to identify an elevation that is too high
  • a second area of view 330 is adapted to identify an elevation that is too low
  • a third area of view 320 is adapted to identify an elevation of objects most likely to provide modulated light data.
  • the third area of view 320 may be the area that is most likely to provide modulated light data that the vehicle is interested in (such as brake system data or other vehicle-to-vehicle communication).
  • other elevations or areas of view may also provide modulated light data.
  • lights from other motor vehicles in the field of view in front of the camera convey modulated light data using the respective vehicles' rear-facing lights (tail lights), with the modulated light data indicating data such as motor vehicle speeds, system events, roadway conditions, and the like.
  • authentication of respective light communication sources is based upon angle of arrival.
  • the camera may automatically authenticate with lights that are within ±5 degrees of elevation, relative to the camera position.
  • this narrowed area eliminates many overhead street lights and reflections from the field of view.
  • the overhead lights 312A, 312B, 312C, 312D, 312E are disregarded; likewise, in the area of view 330, the light reflections 332A, 332B, 332C, 332D, 332E are disregarded.
  • the field of view, the observed elevation angle, and the area used for automatic authentication may be modified based on the distance, clarity, and observation characteristics of respective light sources. For example, if a light source is obscured or not fully visible because the observer is too far away, too close, or past an observation angle for light emitting objects, the field of view may be modified to include or exclude additional areas of observation.
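The elevation-based restriction can be sketched with simple pinhole-camera geometry: the elevation angle of a pixel row is estimated from the camera's vertical field of view, and rows outside roughly ±5 degrees of the camera's horizon are excluded from automatic authentication. The field-of-view and resolution values below are assumed parameters, not values given in the patent.

```python
import math

# Geometry sketch for restricting automatic authentication to a narrow
# elevation band around the camera's horizon.

VERTICAL_FOV_DEG = 40.0      # assumed vertical field of view of the camera
IMAGE_HEIGHT_PX = 1080       # assumed image height
ELEVATION_LIMIT_DEG = 5.0    # band of +/- 5 degrees, as in the example above


def pixel_row_to_elevation_deg(row):
    """Rows above image center map to positive elevation, below to negative."""
    focal_px = (IMAGE_HEIGHT_PX / 2) / math.tan(math.radians(VERTICAL_FOV_DEG / 2))
    offset_px = (IMAGE_HEIGHT_PX / 2) - row  # up in the image is positive
    return math.degrees(math.atan2(offset_px, focal_px))


def within_authentication_band(row):
    return abs(pixel_row_to_elevation_deg(row)) <= ELEVATION_LIMIT_DEG


if __name__ == "__main__":
    for row in (100, 540, 700, 1000):
        angle = round(pixel_row_to_elevation_deg(row), 1)
        print(row, angle, within_authentication_band(row))
```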
  • although the examples of FIGS. 1 to 3 were provided with reference to an infotainment or telematics system display in a motor vehicle, it will be understood that the techniques may be used for other variations of electronic image capture by personal electronic devices including mobile communication devices, wearables, and the like.
  • head-worn glasses that include a camera and projected display may operate to provide an augmented reality display using the techniques discussed above.
  • a smartphone including a camera and touchscreen display may provide an augmented reality or simulated reality display for browsing nearby information sources that are proximate to the user.
  • modulated light sources may be used to communicate information for games, entertainment, and public safety, among many other use cases.
  • FIG. 4 illustrates a sequence diagram of example operations for selecting and interpreting optically communicated data among components of an optical camera communications system.
  • the optical camera communications system includes a light display 402 (e.g., a LED light emitting device); a camera 404; a processing system 406 (e.g., an electronic processing system); a user interface device 408 (e.g., a display output with an in-car infotainment system or mobile computing device); and a third party data source 410 (e.g., a remote web service).
  • the sequence diagram includes the transmission of a data message in modulated light (operation 411), from the light display 402 to the camera 404.
  • the camera 404 operates to receive, detect, and store the modulated light data (operation 412), such as through the buffering of image data.
  • the camera 404 further operates to provide the image data of the captured scene (operation 413) to the processing system 406, and to provide an indication of the modulated light (operation 414) to the processing system 406.
  • the processing system 406 operates to generate an output of the image data to include an indication of the light display 402 as an overlay of the image data (e.g., an augmented reality display) (operation 415). From this overlaid image data, a user interface of the image data is generated for output with the user interface device 408 (operation 416).
  • This user interface includes an indication that identifies the location of respective data sources of modulated light to a human user, such as may be highlighted or outlined directly on the user interface screen.
  • the user interface device 408 then receives a user input selection in the user interface to authenticate a light display located at the user input location (operation 417), which causes the processing system 406 to process data corresponding to the user input location (operation 418) (e.g., the modulated light obtained from the light display 402).
  • the data indicated from the user input location (e.g., the modulated light obtained from the light display 402) includes an indication of supplemental data at another source, such as the third party data source 410.
  • the processing system 406 may transmit a request to obtain supplemental data from the third party data source 410 (operation 419), and receive the supplemental data from the third party data source 410 in response to this request (operation 420).
  • Based on the processed modulated light data obtained from the light display 402, and any supplemental data obtained from the third party data source 410, the processing system operates to generate an updated user interface of the image data for output on the user interface device 408 (operation 421). As discussed above, this may include an augmented reality display of the processed content as an overlay over image data; other types of data outputs, including simulated content, graphical content, and multimedia and interactive content, may also be output via the user interface device 408.
  • FIG. 5 is a flowchart 500 illustrating an example method of obtaining and processing modulated light data in an optical camera communications system using a user authentication technique.
  • the following operations of the flowchart 500 may be conducted by an electronic processing system (including a specialized computing system) adapted to process optical camera communications.
  • the operations of the flowchart 500 include an optional operation to activate the image sensor or other operational components of a camera; in some examples, the image sensor is already activated or is activated by another system component.
  • the camera system is operated to capture image data of a scene with the camera (operation 520), with this image data including the capture of modulated light data.
  • Modulated light data is detected from the image data (operation 530), and locations (e.g., sources) of the modulated light data are identified in the image data (operation 540).
  • Respective indications of the locations of the modulated light data are generated (operation 550), and a display of the image data and the indication of the locations of the modulated light data is output (operation 560).
  • the user authentication may be received in the user interface, through a user selection of the location of the modulated light data (operation 570).
  • the modulated light data that is communicated from the selected location may be processed (operation 580) (e.g., parsed and interpreted), such as through re-processing of the image data, or re-capturing modulated light data from the selected location.
  • the processing of the modulated light data may result in the obtaining of additional content, information, or other data provided from the modulated light data at the selected location, and the display of the image data and the indication of the locations of the modulated light data may be updated to reflect this additional content, information, or data (operation 590).
  • FIG. 6 is a flowchart 600 illustrating an example method of obtaining and processing modulated light data in an optical camera communications system using an automatic authentication technique. Similar to FIG. 5, the operations of the flowchart 600 may be conducted by an electronic processing system (including a specialized computing system) adapted to process optical camera communications. Although the flowchart 600 depicts automated operations, it will be understood that the operations of the flowchart 600 may be modified based on additional user authentication and interaction operations discussed herein.
  • the operations of the flowchart 600 include the use of a camera system to capture image data of a scene with the camera (operation 610), with this image data including the capture of modulated light data.
  • a narrowed area of evaluation is determined, based on the elevation angle of the imaged area (operation 620). This narrowed area of elevation may be used, for example, to disregard areas in the image data that are unlikely to include (or cannot include) relevant light emitting sources.
  • modulated light data is detected in the image data (operation 630), and locations of the modulated light data in the image data are detected (operation 640).
  • the processing system then operates to perform an automatic authentication of one or more locations of modulated light data (operation 650), such as may be based on an image recognition of a particular object, type of object, or the detection of a data signal (e.g., signature, command) communicated from a particular object.
  • the modulated light data from the one or more authenticated locations is then processed (operation 660), and information obtained from the modulated light data of the one or more authenticated locations is communicated to another control subsystem (operation 670). This may include the communication of relevant data to a vehicle control subsystem, or the generation of information for output on a display system. A toy end-to-end sketch of this automatic flow follows.
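The sketch below strings the automatic flow of FIG. 6 together as plain Python. Every helper is a stub that does no real image work; the only point is the ordering of the operations and the hand-off to another subsystem via a publish callback, all of which are assumptions for illustration.

```python
# Toy end-to-end sketch of the automatic flow: capture, narrow by elevation,
# detect, automatically authenticate, decode, then hand results to another
# control subsystem such as a vehicle controller or a display system.


def capture_image_burst():                        # operation 610 (stubbed)
    return ["frame-1", "frame-2", "frame-3"]


def restrict_by_elevation(frames):                # operation 620 (stubbed)
    return frames  # a real system would crop to the usable elevation band


def detect_sources(frames):                       # operations 630/640 (stubbed)
    return {"tail_light_A": "0100100001001001"}   # source id -> demodulated bits


def auto_authenticate(sources):                   # operation 650 (stubbed)
    return [sid for sid in sources if sid.startswith("tail_light")]


def decode_bits(bits):                            # operation 660
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))


def run_automatic_pipeline(publish):              # operation 670 via `publish`
    frames = restrict_by_elevation(capture_image_burst())
    sources = detect_sources(frames)
    for source_id in auto_authenticate(sources):
        publish(source_id, decode_bits(sources[source_id]))


if __name__ == "__main__":
    run_automatic_pipeline(lambda sid, msg: print(sid, "->", msg))
```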
  • FIG. 7 illustrates a block diagram of components in an example system for processing and authenticating modulated light data using optical camera communications.
  • the block diagram depicts an electronic processing system 710 (e.g. , a computing system), an external data system 750, and a light source system 740.
  • the electronic processing system 710 includes circuitry (described below) operably coupled to an optical image capture system 720 and an authentication data processing component 730.
  • the electronic processing system 710 is depicted as including:
  • circuitry to implement a user interface 712 (e.g., to output a display with a user interface hardware device); a communication bus 713 to communicate data among the optical image capture system 720 and other components of the electronic processing system 710; data storage 714 to store image data, authentication data, and control instructions for operation of the electronic processing system; a wireless transceiver 715 to wirelessly communicate with an external network or devices; and processing circuitry 716 (e.g., a CPU) and a memory 717 (e.g., volatile or non-volatile memory) used to host and process the image data, authentication data, and control instructions for operation of the electronic processing system.
  • the authentication data processing component 730 may be provided from specialized hardware operating independently from the processing circuitry 716 and the memory 717; in other examples, the authentication data processing component 730 may be software-configured hardware that is implemented with use of the processing circuitry 716 and the memory 717 (e.g., by instructions executed by the processing circuitry 716 and the memory 717).
  • the user interface 712 may be used to output a command and control interface for selection and receipt of user input for authentication, such as to authenticate a particular data source.
  • the input of user authentication from the user interface 712 may be used to control operations and initiate actions with the authentication data processing component 730.
  • the authentication data processing component 730 is depicted as including image data processing 732 to perform detection and analysis of image data; automated authentication processing 734 to perform an automatic recognition of modulated light data sources and content operations; user authentication processing 736 to generate the user-controlled interfaces and inputs to perform a manual authentication of image sources identified in images; and image recognition processing 738 to perform automatic recognition of objects depicted in the image data.
  • the authentication data processing component 730 and the electronic processing system may also include other components, not depicted, for implementation of other forms of authentication and user interaction operations, such as input control components (e.g., buttons, touchscreen input, external peripheral devices), and output components (e.g., a touchscreen display screen, video or audio output, etc.).
  • the optical image capture system 720 is depicted as including: an image sensor 722 to capture image data of a scene (including modulated light data emitted in respective objects in a scene); storage memory 724 to buffer and store the image data of the scene; processing circuitry 726 to perform image processing of image data for a scene and identify modulated light data in the scene; and communication circuitry 728 to communicate the image data to another location.
  • the optical image capture system 720 is adapted to capture human-visible light; in some examples, the optical image capture system 720 is additionally adapted to capture aspects of infrared and near- infrared light.
  • the light source system 740 is depicted as including: a data storage 742 to store commands and content for communication via modulated light output; processing circuitry 744 to control the modulated light output; and a light emitter 746 (e.g., a LED or LED array) to generate the modulated light output.
  • the external data system 750 is depicted as including: data storage 752 to host supplemental content for access by the electronic processing system 710; a processor 754 and memory 756 to execute software instructions to host and serve the supplemental content in response to a request from the electronic processing system 710; and communication circuitry 758 to transmit the supplemental data in response to the request from the electronic processing system 710.
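Purely as a reading aid, the FIG. 7 arrangement can be summarized as simple Python types whose field names follow the reference numerals in the text. The callable fields are placeholders for the behavior each block would implement; none of this is code from the patent.

```python
from dataclasses import dataclass
from typing import Callable

# Structural sketch of the FIG. 7 arrangement as plain data types.


@dataclass
class OpticalImageCaptureSystem:            # 720
    image_sensor: Callable[[], bytes]       # 722: capture image data of a scene
    storage_memory: bytearray               # 724: buffer and store the image data
    process_image: Callable[[bytes], dict]  # 726: identify modulated light data


@dataclass
class AuthenticationDataProcessing:         # 730
    image_data_processing: Callable         # 732
    automated_authentication: Callable      # 734
    user_authentication: Callable           # 736
    image_recognition: Callable             # 738


@dataclass
class ElectronicProcessingSystem:           # 710
    capture: OpticalImageCaptureSystem
    authentication: AuthenticationDataProcessing
    display_user_interface: Callable        # 712
```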
  • FIG. 8 is a block diagram illustrating a machine in the example form of an electronic processing system 800, within which a set or sequence of instructions may be executed to cause the machine to perform any one of the methodologies discussed herein, according to an example embodiment.
  • the machine may be a vehicle information or entertainment system, a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile telephone or smartphone, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • the term “processor-based system” shall be taken to include any set of one or more machines that are controlled by or operated by a processor (e.g., a computer) to individually or jointly execute instructions to perform any one or more of the methodologies discussed herein.
  • Example electronic processing system 800 includes at least one processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both, processor cores, compute nodes, etc.), a main memory 804 and a static memory 806, which communicate with each other via an interconnect 808 (e.g., a link, a bus, etc.).
  • the electronic processing system 800 may further include a video display unit 810, an input device 812 (e.g., an alphanumeric keyboard), and a user interface (UI) control device 814 (e.g., a mouse, button controls, etc.).
  • the video display unit 810, input device 812 and UI navigation device 814 are incorporated into a touch screen display.
  • the electronic processing system 800 may additionally include a storage device 816 (e.g., a drive unit), a signal generation device 818 (e.g., a speaker), an output controller 832 (e.g., for control of actuators, motors, and the like), a network interface device 820 (which may include or operably communicate with one or more antennas 830, transceivers, or other wireless communications hardware), and one or more sensors 826 (e.g., cameras), such as a global positioning system (GPS) sensor, compass, accelerometer, location sensor, or other sensor.
  • the storage device 816 includes a machine-readable medium 822 on which is stored one or more sets of data structures and instructions 824 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein.
  • the instructions 824 may also reside, completely or at least partially, within the main memory 804, static memory 806, and/or within the processor 802 during execution thereof by the electronic processing system 800, with the main memory 804, static memory 806, and the processor 802 also constituting machine-readable media.
  • machine-readable medium 822 is illustrated in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 824.
  • the term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 824 may further be transmitted or received over a communications network 828 using a transmission medium via the network interface device 820 utilizing any one of a number of transfer protocols (e.g., HTTP).
  • Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone (POTS) networks, and wireless data networks (e.g., Wi-Fi, 2G/3G, and 4G LTE/LTE-A or WiMAX networks).
  • the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • Embodiments used to facilitate and perform the techniques described herein may be implemented in one or a combination of hardware, firmware, and software. Embodiments may also be implemented as instructions stored on a machine-readable storage device, which may be read and executed by at least one processor to perform the operations described herein.
  • a machine-readable storage device may include any non-transitory mechanism for storing information in a form readable by a machine (e.g., a computer).
  • a machine-readable storage device may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media.
  • a component or module may be implemented as a hardware circuit comprising custom very-large-scale integration (VLSI) circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components.
  • a component or module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like.
  • Components or modules may also be implemented in software for execution by various types of processors.
  • An identified component or module of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified component or module need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the component or module and achieve the stated purpose for the component or module.
  • a component or module of executable code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices or processing systems.
  • some aspects of the described process (such as code rewriting and code analysis) may take place on a different processing system (e.g., in a computer in a data center), than that in which the code is deployed (e.g., in a computer embedded in a sensor or robot).
  • operational data may be identified and illustrated herein within components or modules, and may be embodied in any suitable form and organized within any suitable type of data structure.
  • the operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network.
  • the components or modules may be passive or active, including agents operable to perform desired functions.
  • Example 1 is a device for performing authentication of optical camera communications from a light emitting object, the device comprising: processing circuitry to: detect, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor; identify, from the image data, the light emitting object as a source of the modulated light data; receive an indication to select the light emitting object as an authenticated source of the modulated light data; and perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
  • Example 2 the subject matter of Example 1 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein operations to identify the source of the modulated light data are performed with operations to detect the authenticated source as a first source of a first set of available modulated light data and detect the another source as a second source of a second set of available modulated light data.
  • Example 3 the subject matter of Example 2 optionally includes wherein operations that perform the command to process the modulated light data, include operations to decode the first set of available modulated light data, and to not decode the second set of available modulated light data.
  • Example 4 the subject matter of Example 3 optionally includes the processing circuitry further to enable user authentication of the authenticated source of the modulated light data, with operations to: generate a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and receive the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; wherein the operations to identify the light emitting object include a generation of the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display .
  • Example 5 the subject matter of Example 4 optionally includes the processing circuitry further to output data selected with the user authentication of the authenticated source of the modulated light data, with operations to: decode and interpret content from the modulated light data obtained from the authenticated source; and update the graphical user interface display to output the decoded and interpreted content from the modulated light data.
  • Example 6 the subject matter of any one or more of Examples 3-5 optionally include the processing circuitry further to enable automatic authentication of the authenticated source of the modulated light data, with operations to: perform image recognition of the image data; wherein the operations to identify the light emitting object include image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
  • Example 7 the subject matter of any one or more of Examples 1-6 optionally include the processing circuitry further to obtain supplemental data indicated in the modulated light data, with operations to: decode and parse information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtain the supplemental data from the another data source, using the identifier of the supplemental data.
  • Example 8 the subject matter of Example 7 optionally includes wherein the identifier is a uniform resource locator (URL), and wherein operations to obtain the supplemental data from the another data source includes access of the URL using a wireless communication network.
  • Example 9 the subject matter of any one or more of Examples 1-8 optionally include wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an augmented reality display of information obtained from the modulated light data that overlays the image of the scene.
  • Example 10 the subject matter of Example 9 optionally includes the processing circuitry further to identify a limited area of evaluation from the image data for automatically authenticating the authenticated source, with operations to: identify the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein operations to detect the modulated light data are performed on the limited area of evaluation, and wherein operations to identify the modulated light data are performed on the limited area of evaluation.
  • Example 11 is at least one machine readable storage medium, comprising a plurality of instructions adapted for performing authentication of optical camera communications from a light emitting object, wherein the instructions, responsive to being executed with processor circuitry of a machine, cause the machine to perform operations that: detect, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor; identify, from the image data, the light emitting object as a source of the modulated light data; receive an indication to select the light emitting object as an authenticated source of the modulated light data; and perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
  • Example 12 the subject matter of Example 11 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein operations to identify the source of the modulated light data are performed with operations to detect the authenticated source as a first source of a first set of available modulated light data and detect the another source as a second source of a second set of available modulated light data.
  • Example 13 the subject matter of Example 12 optionally includes wherein operations that perform the command to process the modulated light data, include operations to decode the first set of available modulated light data, and to not decode the second set of available modulated light data.
  • Example 14 the subject matter of Example 13 optionally includes wherein the instructions further cause the machine to enable user authentication of the authenticated source of the modulated light data, with operations that: generate a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and receive the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; wherein the operations to identify the light emitting object include a generation of the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
  • Example 15 the subject matter of Example 14 optionally includes wherein the instructions further cause the machine to output data selected with the user authentication of the authenticated source of the modulated light data, with operations that: decode and interpret content from the modulated light data obtained from the authenticated source; and update the graphical user interface display to output the decoded and interpreted content from the modulated light data.
  • Example 16 the subject matter of any one or more of Examples 13-15 optionally include wherein the instructions further cause the machine to enable automatic authentication of the authenticated source of the modulated light data, with operations that: perform image recognition of the image data; wherein the operations to identify the light emitting object include image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
  • Example 17 the subject matter of any one or more of Examples 11-16 optionally include wherein the instructions further cause the machine to obtain supplemental data indicated in the modulated light data, with operations that: decode and parse information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtain the supplemental data from the another data source, using the identifier of the supplemental data.
  • Example 18 the subject matter of Example 17 optionally includes wherein the identifier is a uniform resource locator (URL), and wherein operations to obtain the supplemental data from the another data source includes access of the URL using a wireless communication network.
  • Example 19 the subject matter of any one or more of Examples 11-18 optionally include wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an augmented reality display of information obtained from the modulated light data that overlays the image of the scene.
  • Example 20 the subject matter of Example 19 optionally includes wherein the instructions further cause the machine to identify a limited area of evaluation from the image data for automatically authenticating the authenticated source, with operations that: identify the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein operations to detect the modulated light data are performed on the limited area of evaluation, and wherein operations to identify the modulated light data are performed on the limited area of evaluation.
  • Example 21 is a method of performing authentication of optical camera communications from a light emitting object, the method comprising electronic operations including: detecting, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor; identifying, from the image data, the light emitting object as a source of the modulated light data; receiving an indication to select the light emitting object as an authenticated source of the modulated light data; and performing a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
  • Example 22 the subject matter of Example 21 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein identifying the source of the modulated light data is performed by detecting the authenticated source as a first source of a first set of available modulated light data and detecting the another source as a second source of a second set of available modulated light data.
  • Example 23 the subject matter of Example 22 optionally includes wherein performing the command to process the modulated light data, includes decoding the first set of available modulated light data, and not decoding the second set of available modulated light data.
  • Example 24 the subject matter of Example 23 optionally includes the electronic operations further including enabling user authentication of the authenticated source of the modulated light data, by: generating a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and receiving the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; wherein identifying the light emitting object includes generating the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
  • Example 25 the subject matter of Example 24 optionally includes the electronic operations further including outputting data selected with the user authentication of the authenticated source of the modulated light data, by:
  • decoding and interpreting content from the modulated light data obtained from the authenticated source and updating the graphical user interface display to output the decoded and interpreted content from the modulated light data.
  • Example 26 the subject matter of any one or more of Examples 23-25 optionally include the electronic operations further including enabling automatic authentication of the authenticated source of the modulated light data, by: performing image recognition of the image data; wherein identifying the light emitting object includes image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
  • Example 27 the subject matter of any one or more of Examples 21-26 optionally include the electronic operations further including obtaining supplemental data indicated in the modulated light data, by: decoding and parsing information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtaining the supplemental data from the another data source, using the identifier of the supplemental data.
  • Example 28 the subject matter of Example 27 optionally includes wherein the identifier is a uniform resource locator (URL), and wherein obtaining the supplemental data from the another data source includes access of the URL using a wireless communication network.
  • Example 29 the subject matter of any one or more of Examples 21-28 optionally include wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an automated reality display of information obtained from the modulated light data that overlays the image of the scene.
  • Example 30 the subject matter of Example 29 optionally includes the electronic operations further including identifying a limited area of evaluation from the image data for automatically authenticating the authenticated source, by: identifying the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein detecting the modulated light data is performed on the limited area of evaluation, and wherein identifying the modulated light data is performed on the limited area of evaluation.
  • Example 31 is an apparatus comprising means for performing any of the methods of Examples 21-30.
  • Example 32 is at least one machine readable medium including instructions, which when executed by a computing system, cause the computing system to perform any of the methods of Examples 21-30.
  • Example 33 is a system for processing and authenticating modulated light data using optical camera communications, comprising: an optical image capture system; a processing system, comprising: processing circuitry; image data processing circuitry to evaluate image data, the image data including an indication of modulated light data from a light source, wherein the image data is captured with an image sensor; authentication data processing circuitry to: detect, from image data, modulated light data emitted from the light source; identify, from the image data, the light source as a source of the modulated light data; receive an indication to select the light source as an authenticated source of the modulated light data; and perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light source as the authenticated source of the modulated light data.
  • Example 34 the subject matter of Example 33 optionally includes a light source system, comprising: data storage to store data to be transmitted with a modulated light output; a light emitter to output the data with the modulated light output; and processing circuitry coupled to the data storage and the light emitter, the processing circuitry to control emission of the data with the modulated light output via the light emitter.
  • Example 35 the subject matter of any one or more of Examples 33-34 optionally include an external data system, accessible via a network connection, the external data system comprising: data storage to store data; communication circuitry to receive a request for supplemental data; and a processor and memory to process the request to serve the supplemental data and transmit the supplemental data in response to the request; wherein the request for supplemental data is provided from the processing system, in response to reading the modulated light data from the light source, wherein the modulated light data indicates details of the request for supplemental data.
  • Example 36 is an apparatus, comprising: means for capturing image data; means for detecting, from the image data, modulated light data emitted from a light emitting object; means for identifying, from the image data, the light emitting object as a source of the modulated light data; means for receiving an indication to select the light emitting object as an authenticated source of the modulated light data; and means for performing a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
  • Example 37 the subject matter of Example 36 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, the apparatus further comprising: means for detecting the authenticated source as a first source of a first set of available modulated light data and detecting the another source as a second source of a second set of available modulated light data.
  • Example 38 the subject matter of Example 37 optionally includes means for performing the command to process the modulated light data by decoding the first set of available modulated light data, and not decoding the second set of available modulated light data.
  • Example 39 the subject matter of Example 38 optionally includes means for enabling user authentication of the authenticated source of the modulated light data, including: means for generating a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and means for receiving the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; means for identifying the light emitting object by generating a graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
  • Example 40 the subject matter of Example 39 optionally includes means for outputting data selected with the user authentication of the authenticated source of the modulated light data, including: means for decoding and interpreting content from the modulated light data obtained from the authenticated source; and means for updating the graphical user interface display to output the decoded and interpreted content from the modulated light data.
  • Example 41 the subject matter of any one or more of Examples 38-40 optionally include means for enabling automatic authentication of the authenticated source of the modulated light data, including: means for performing image recognition of the image data; means for identifying the light emitting object by image recognition of the image data to indicate the authenticated source and the another source; and means for obtaining the indication to select the light emitting object as the authenticated source from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
  • Example 42 the subject matter of any one or more of Examples 36-41 optionally include means for obtaining supplemental data indicated in the modulated light data, including: means for decoding and parsing information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and means for obtaining the supplemental data from the another data source, using the identifier of the supplemental data.
  • Example 43 the subject matter of Example 42 optionally includes means for obtaining the supplemental data from the another data source by access of a uniform resource locator (URL) using a wireless communication network, wherein the identifier indicates the URL.
  • Example 44 the subject matter of any one or more of Examples 36-43 optionally include means for obtaining the image data to capture an image of a scene in a direction away from the apparatus; and means for generating an automated reality display of information obtained from the modulated light data that overlays the image of the scene, using the modulated light data.
  • Example 45 the subject matter of Example 44 optionally includes means for identifying a limited area of evaluation from the image data for automatically authenticating the authenticated source, including: means for identifying the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the apparatus, as captured from a position of the apparatus; means for detecting the modulated light data on the limited area of evaluation, and wherein identifying the modulated light data is performed on the limited area of evaluation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Computer Security & Cryptography (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Communication System (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Various systems and methods for identifying and performing authentication of visible light communications using optical camera communication techniques are described. In an example, an electronic processing system to authenticate a particular light emitting source includes electronic operations for: detecting, from image data, modulated light data emitted from a light emitting object, where the image data depicts the light emitting object, and where the image data is captured with an image sensor (e.g., of a camera); identifying, from the image data, the light emitting object as a source of the modulated light data; receiving an indication to select the light emitting object as an authenticated source of the modulated light data; and performing a command to process the modulated light data from the authenticated source, with the command performed in response to the indication to select the light emitting object as the authenticated source of the modulated light data.

Description

PRIORITY APPLICATION
[0001] This application claims the benefit of priority to U.S. Application
Serial No. 15/282,328, filed 30 September 2016, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
[0002] Embodiments described herein generally relate to processing techniques of data from light communication sources, and in particular, to the use of authentication and data interpretation techniques for data obtained from visible light via optical camera communication sources.
BACKGROUND
[0003] Visible light communications are embodied in a variety of emerging wireless communication techniques, such as in communications techniques that utilize light sources such as light-emitting diode (LED) signage and LED lamps to broadcast messages. A variety of applications have been proposed in the area of visible light communication, including for specialized deployments of wireless data networks that serve as a high-speed link for a last mile transmission of a network connection. In many uses of visible light communications, the brightness of the light source is modulated faster than the human eye may observe, allowing a light source to transmit messages without a perceivable flicker.
[0004] One implementation of visible light communications, optical camera communications, also known as "CamCom", uses an image sensor within a camera for receiving and processing visible (human- or camera-visible) light data. One proposal for the standardization of optical camera communications is currently being developed by the Short-Range Optical Wireless Communications Task Group for a revision of the IEEE 802.15.7-2011 specification. For example, this task group is developing enhanced standards for the use of optical camera communications to enable scalable data rate, positioning/localization, and message broadcasting, using optical devices such as a flash, display, and image sensor as a transmitting or receiving device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. Some embodiments are illustrated by way of example, and not limitation, in the figures of the accompanying drawings in which:
[0006] FIG. 1 illustrates an operational environment for processing and authenticating light communication sources with components of a motor vehicle, according to an example;
[0007] FIG. 2A illustrates a stylized representation of a camera-captured scene observed from a motor vehicle, indicating multiple light communication sources, according to an example;
[0008] FIG. 2B illustrates a stylized representation of a camera-captured scene observed from a motor vehicle, indicating an authentication of a particular light communication source from among multiple light communication sources, according to an example;
[0009] FIG. 3 illustrates a stylized representation of a camera-captured scene from a motor vehicle, indicating an authentication of multiple light communication sources in a restricted field of view, according to an example;
[0010] FIG. 4 illustrates a sequence diagram of operations for selecting and interpreting optically communicated data among components of an optical camera communications system, according to an example;
[0011] FIG. 5 is a flowchart illustrating a method of obtaining and processing modulated light data in an optical camera communications system using a user authentication technique, according to an example;
[0012] FIG. 6 is a flowchart illustrating a method of obtaining and processing modulated light data in an optical camera communications system using an automatic authentication technique, according to an example;
[0013] FIG. 7 illustrates a block diagram of components in an example system for processing and authenticating modulated light data using optical camera communications, according to an example; and
[0014] FIG. 8 illustrates a block diagram for an example electronic processing system architecture upon which any one or more of the techniques (e.g., operations, processes, methods, and methodologies) discussed herein may be performed, according to an example.
DETAILED DESCRIPTION
[0015] In the following description, methods, configurations, and related apparatuses are disclosed for the processing and authentication of image data detected from camera image object sources, for image data that indicates modulated light communicated using visible light communications. In
particular, the techniques discussed herein are relevant to the application of visible light communication commonly referred to as optical camera communications, which utilizes light emitting objects such as LED signage and LED lights to output (transmit) data to be captured (received) via an image sensor in a camera. Various device-based and system-based techniques for analyzing such image data that includes modulated light data and authenticating the source of modulated light data from the image data are disclosed herein.
[0016] Authentication, as used in the contexts discussed herein, refers to providing or determining a proof of identity before a data source associates with (e.g., provides data to) a data sink. As a similar example of authentication, in IEEE 802.11 (Wi-Fi) wireless communication networks, authentication frame exchanges are used to ensure that a station has the correct authentication information (e.g., a pre-shared WEP/WPA encryption key) before being able to establish a connection with the wireless network. In this setting, the assumption is that if the encryption key is known, then the station is authorized to associate with the network. In the field of optical camera communications, there is a similar technical challenge to ensure that a received data stream is provided from an authenticated source before allowing that data to initiate further actions on a receiving device. Because many types of visible light communications are openly broadcasted to any listener in observable range of the light, the ability to obtain data only from desired or trusted locations becomes a complex yet important issue.
[0017] In the examples of optical camera communications discussed herein, authentication is performed at a lower layer of processing, by visually
identifying a data source in image data to confirm that the data sink desires to receive data from the visually observed data source. The identification of a desired data source may be used to locate, select, access, and process modulated light data from a desired light emitting object, while disregarding modulated light data detected from other light emitting objects. Thus, light sources that are not authenticated may be ignored and disregarded, preventing the use of unknown, unwanted, unverified, unauthorized, or rogue data.
[0018] As discussed herein, optical camera communication authentication techniques may include the identification and selection of a modulated light data source, performed using either human input or automated object recognition upon image data of the light emitting object. The use of image data for authentication enables proper verification of modulated light data from the desired source, because the image data obtained by a camera sensor captures light to visually recognize the object as it also captures the light used to transmit the modulated data. Accordingly, the optical camera communication
authentication techniques discussed herein provide significant operational and security benefits over existing approaches that choose to consume and process all available modulated light data sources without authentication.
[0019] FIG. 1 illustrates an example operational environment for processing and authenticating light communication sources with components of a motor vehicle. The following examples of FIGS. 1 to 3 specifically describe use cases involving the capture of image data and modulated light data from a camera positioned at the perspective of a motor vehicle occupant, such as may occur when the occupant operates the motor vehicle on a roadway. The integration of the following example features may be provided in a motor vehicle with a factory-integrated telematics and infotainment system, or with an add-on telematics and infotainment device. However, it will be understood that the following optical camera communication authentication features may also be applicable to other forms of mobile computing devices that operate independently from a motor vehicle, such as with image and data processing capability provided in smartphones, wearable devices, tablets, portable personal computers, and like user-interactive/client devices embedded in other operational systems.
[0020] As shown in FIG. 1, a motor vehicle 110 includes a camera device 112, which is positioned outward facing with respect to the motor vehicle 110 and the surrounding environment to detect and capture a scene in a field of view. The camera device 112 is shown as obtaining an optical image of the field of view from the forward direction of the motor vehicle 110, which includes visible light communication 120 being transmitted to the motor vehicle 110 from a light emitting object (such as LED signage). The lights in the light emitting object are modulated rapidly to indicate data in a fashion that the human eye typically cannot see or observe (e.g., with rapidly blinking lights that are not perceivable to a human). The camera device 112 includes at least one sensor to capture image data of the scene, and the camera device 112 may include or be operably coupled to processing circuitry to detect that at least one light of the light emitting object is modulated with data (e.g., is emitting the visible light communication 120).
[0021] The motor vehicle 110 includes a number of processing components 130 to obtain, process, and evaluate a scene in the field of view observed in front of the motor vehicle. Such processing capabilities operate to capture image data for real-world objects (such as still RGB images of the LED signage) and the modulated light data provided in the visible light communication 120 (such as the modulated light data provided from operation of the LED signage). For example, the processing components 130 may include: a camera sensor 132 (e.g., CMOS/CCD sensor) to capture image data of a scene; camera data processing components 134 (e.g., implemented with programmed circuitry) to process, store, and extract data from the captured image data; and visible light communication processing components 136 (e.g., implemented with
programmed circuitry) to detect and interpret modulated light data emitted from an object in the scene.
[0022] The processing components 130 may also include: authentication data processing components 138 (e.g., implemented with programmed circuitry) to implement user-interactive or automated authentication of light modulation data from a light emitting source (an object); user interface display processing components 140 (e.g., implemented with programmed circuitry) to receive user-interactive controls, including the generation of an augmented display of the image data; and an interactive display unit 142 (e.g., a touchscreen display hardware) to output a display of the image data and receive user input and commands for the display of the image data.
[0023] The processing components 130 or another component integrated with the motor vehicle 110 may also be used to access an external network source 150 (e.g., via the Internet), to obtain supplemental data 160 for use in the authentication or processing of data with the visible light communication 120. For example, the external network source 150 may provide a network-connected data processing server 152 (e.g., a web server) and data-hosting system 154 (e.g., a database) to serve the supplemental data in response to a request or a query from the processing components 130. For example, the visible light
communication 120 may include data indicating a uniform resource locator (URL) of the external network source 150, with the data processing server 152 and data-hosting system 154 adapted to serve the supplemental data 160 in response to the request or query.
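As a minimal sketch of the supplemental-data lookup described above, assume the decoded light payload is UTF-8 text carrying a plain URL and that the referenced endpoint returns JSON; the payload format, the example address, and the function name are illustrative assumptions rather than details defined by this disclosure.

```python
import json
import urllib.request


def fetch_supplemental_data(decoded_payload: bytes, timeout_s: float = 2.0) -> dict:
    """Parse a URL out of a decoded light payload and fetch the data it points to.

    Assumes the payload is UTF-8 text containing an http(s) URL, such as the
    address of a menu or advertisement hosted by an external data system.
    """
    url = decoded_payload.decode("utf-8").strip()
    if not url.startswith(("http://", "https://")):
        raise ValueError("payload does not carry a usable URL: %r" % url)
    # Access the URL over whatever wireless data connection is available (Wi-Fi, LTE, ...).
    with urllib.request.urlopen(url, timeout=timeout_s) as response:
        return json.loads(response.read().decode("utf-8"))


# Hypothetical usage, as if a sign had broadcast this payload via its LEDs:
# menu = fetch_supplemental_data(b"https://example.com/shops/ice-cream/menu.json")
```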
[0024] FIG. 2A illustrates a stylized representation 200A of an example camera-captured scene observed from a motor vehicle, indicating multiple light communication sources. The stylized representation 200A illustrates an output of image data including an image of three illuminated signs in a real-world environment: an ice cream shop sign 202, a coffee shop sign 204, and a traffic sign 206. Each illuminated sign includes LEDs that modulate light data in a specific pattern, to send respective sets of visible light communication data to be received and demodulated via a camera.
[0025] Thus, in the stylized representation 200A of FIG. 2A, each of the three illuminated signs 202, 204, 206 provides light output that is modulated in a pattern to signal data. As an initial processing step, a signal processor associated with the camera determines (e.g., locates, observes) which objects in the captured scene are transmitting optical camera communication data. This is important because in some examples only a few of the available LED light emitters in the scene actually transmit usable data. In an example, the identification of a light emitting source is performed using a specialized camera communications waveform, such as with use of a start frame delimiter. In other examples, the lights may use a specialized signaling output to indicate that they are a modulated light data source. In response to this determination, a signal processor associated with the camera identifies available light sources that are transmitting (e.g., broadcasting) data to an available observer.
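One way to approximate this determination in software is to test whether the mean brightness of each candidate region changes appreciably across buffered frames; the sketch below assumes grayscale frames and an empirical variation threshold, and it omits the start frame delimiter search that a full receiver would also perform.

```python
import numpy as np


def find_modulating_regions(frames: np.ndarray,
                            regions: list[tuple[int, int, int, int]],
                            min_variation: float = 8.0) -> list[tuple[int, int, int, int]]:
    """Return the candidate regions whose mean brightness varies between frames.

    `frames` is a (num_frames, height, width) array of grayscale captures and
    `regions` holds (x, y, w, h) boxes around bright objects found in the scene.
    A steady lamp keeps a near-constant mean brightness; a modulated emitter flickers.
    """
    modulating = []
    for (x, y, w, h) in regions:
        patch = frames[:, y:y + h, x:x + w].astype(np.float32)
        per_frame_mean = patch.mean(axis=(1, 2))   # one brightness value per frame
        if per_frame_mean.std() >= min_variation:  # large swing suggests a modulated source
            modulating.append((x, y, w, h))
    return modulating
```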
[0026] The information on identified light sources is used in the
authentication process, to determine which of the identified light sources provide a data stream available to be consumed by an associated processing system. A manual or automated authentication process then may be performed to select data from an available (identified) light source. For example, as shown in FIG. 2A, the image processor may generate a solid box (e.g., a colored box) around each light source (e.g., signs 202, 204, 206) that is transmitting modulated data. In an example, the image processor provides this indication as an overlay on the image data to highlight or emphasize real-world locations of an identified modulated light data source. The identification operates to highlight or mark an available data source to a human user or to an automated mechanism (with such automated mechanisms including an image recognition technique or image processing algorithm). Other methods and mechanisms for marking, listing, or identifying light emitting sources may also be utilized.
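A minimal sketch of such an overlay, assuming the frame is held as an OpenCV-style BGR array and that each detected source has already been reduced to an (x, y, w, h) box:

```python
import cv2
import numpy as np


def draw_source_overlay(frame_bgr: np.ndarray,
                        sources: list[tuple[int, int, int, int]],
                        color: tuple[int, int, int] = (0, 255, 0)) -> np.ndarray:
    """Return a copy of the frame with a colored box drawn around each detected
    modulated-light source, so a user or downstream logic can see which sources
    are available for authentication."""
    annotated = frame_bgr.copy()
    for (x, y, w, h) in sources:
        cv2.rectangle(annotated, (x, y), (x + w, y + h), color, thickness=3)
    return annotated
```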
[0027] In an example, the information being sent by the modulated light data may include encoded information in the form of graphical, textual, or other software-interpretable content. As discussed above for FIG. 1, the information being sent by the modulated light data also may include a URL address that will be used by a processing system to access supplemental data (e.g., via a radio access network such as Wi-Fi or a 3G/4G data connection). After capturing and decoding the data, the stylized representation 200A may be updated to display the graphical, textual, or software-interpreted content.
[0028] FIG. 2B illustrates a stylized representation 200B of an example camera-captured scene observed from a motor vehicle, indicating an
authentication of a particular light communication source from among the multiple light communication sources. In the stylized representation 200B, authentication to select modulated light data from the ice cream shop sign 202 results in the processing and receipt of information used to display a contextual menu 212. The contextual menu 212 is provided as a message overlaid on the display output, in the form of an augmented reality output, next to the image display of the ice cream shop sign 202.
[0029] FIG. 2B thus illustrates an output on a graphical display, in the form of an overlay of content, which is output in response to authentication of the particular light communication source (the ice cream shop sign 202) and the processing of the information from this particular light communication source. In an example, authentication of the light communication source may occur using a manual, user-initiated process; in another example, authentication of the light communication source may occur using an automated process. After authentication is conducted, the image processing algorithms are then authorized to ingest data from the selected light source.
[0030] In a manual authentication operation, a human user may provide an indication, such as through an input into a graphical user interface, to indicate which data source the user wishes to authenticate with and download data from. For example, the user may provide touch input 220 at a representation of the light emitting source (the display of the ice cream shop sign 202) to trigger a user interface command for authentication, as shown in the stylized
representation 200B. In response to the touch input 220, the modulated light data from the ice cream shop sign 202 may be parsed and interpreted, to obtain content. In this scenario, a set of content to populate an available contextual menu (a food menu) of the ice cream shop establishment is received from optical camera communications, and is overlaid on the image data (as a contextual message 212) next to the representation of the object that transmitted the data. Thus, the content obtained from a light emitting source may be displayed and overlaid to a user in the form of augmented reality in the stylized representation 200B; it will be understood that the content obtained from the light emitting source may be output with other types of devices and output formats in response to authentication.
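One simple way to realize this manual selection is to hit-test the touch coordinate against the overlay boxes of the identified sources; the source identifiers and box coordinates below are hypothetical values used only to illustrate the mapping.

```python
def source_at_touch(touch_xy: tuple[int, int],
                    sources: dict[str, tuple[int, int, int, int]]) -> str | None:
    """Map a touch coordinate on the displayed image to the light source whose
    overlay box contains it; returns the source identifier, or None when the
    touch falls outside every box (no source is authenticated)."""
    tx, ty = touch_xy
    for source_id, (x, y, w, h) in sources.items():
        if x <= tx < x + w and y <= ty < y + h:
            return source_id
    return None


# Hypothetical boxes for the signs of FIG. 2A/2B:
sources = {"ice_cream_sign": (40, 120, 200, 90), "coffee_sign": (300, 110, 180, 80)}
selected = source_at_touch((95, 150), sources)   # -> "ice_cream_sign"
```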
[0031] In an automatic authentication operation, the authentication may be automatically conducted to access and parse data from a particular data source. Such automatic authentication may occur through an image recognition algorithm that selects the data source for the user, on the basis of the shape, classification, characteristics, or identification of an object or type of object (such as a particular sign, type of business associated with the sign, etc.). For example, in a controlled mode, image recognition algorithms may be used to only allow data to be downloaded and processed from objects that are previously known, such as a pedestrian control light or a traffic signal. As another example, an automatic mode to authenticate with and process data from all identified sources (referred to as a "promiscuous mode") may be used to obtain a larger set of data from available sources. However, the selection of data from all available sources may be further limited based on the location of the objects in the field of view (such as is further described below with reference to FIG. 3.)
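The controlled and promiscuous behaviors can be expressed as a small selection policy over recognized object classes; the class names and trusted set below are illustrative assumptions rather than classes defined by this disclosure.

```python
from enum import Enum


class AuthMode(Enum):
    CONTROLLED = "controlled"     # only previously known object classes are trusted
    PROMISCUOUS = "promiscuous"   # every identified source is accepted


# Object classes the controlled mode is willing to trust; illustrative only.
TRUSTED_CLASSES = {"traffic_signal", "pedestrian_control_light"}


def auto_authenticate(detected: dict[str, str], mode: AuthMode) -> set[str]:
    """Select which detected sources to authenticate automatically.

    `detected` maps a source identifier to the object class assigned to it by an
    image recognition step (e.g., "sign", "traffic_signal", "tail_light").
    """
    if mode is AuthMode.PROMISCUOUS:
        return set(detected)
    return {sid for sid, obj_class in detected.items() if obj_class in TRUSTED_CLASSES}
```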
[0032] In certain examples, the type, format, or characteristics of the content that is overlaid in a graphical display may be adapted based on the perspective of the field of view captured by an image. This change to the graphical display may occur when the size and observable characteristics of respective light sources varies, especially when the image of the scene is captured from various distances. In an example, the generation of the overlaid content for graphical display may be adapted to handle scenarios where a light emitting object such as signage is in the field of view but is mixed with other light sources (e.g., when observed at a long distance); when a light emitting object such as signage is visible and separated from other objects in the field of view (e.g., as depicted in FIG. 2A and 2B); or when a light emitting object such as signage is only partially visible in the captured field of view (e.g., when observed at a close distance).
[0033] For example, as a motor vehicle travels on a roadway and is a large distance from a light source, an image of a scene may depict multiple light sources to be overlapping and concentrated in an area of the image. (The modulated light data may be detected and processed from these different sources, however.) At a closer location, the respective lights are distinguishable and separated from one another in the field of view. At an even closer location, when an observer is very close or has partially passed the light emitting object, the object may become distorted or not be fully visible. In cases where the light source is obscured, the graphical display may provide alternative graphics, a listing of detected light sources, contextual menus, and other forms of augmented views to allow obscured light sources and objects to be identified and distinguished.
[0034] FIG. 3 illustrates a stylized representation 300 of a camera-captured scene from a motor vehicle, indicating an example of authentication of multiple light communication sources in a restricted field of view. FIG. 3 specifically illustrates the results of an approach in which only light sources in roughly the same plane as the camera are automatically authenticated (and in which other lights are ignored for authentication).
[0035] The stylized representation 300 depicts the selection of desired sources based upon the elevation angle of a camera field of view, as shown in respective areas of view 310, 320, 330. In the camera field of view, a first area of view 310 is adapted to identify an elevation that is too high, and a second area of view 330 is adapted to identify an elevation that is too low; whereas a third area of view 320 is adapted to identify an elevation of objects most likely to provide modulated light data. For example, the third area of view 320 may be the area that is most likely to provide modulated light data that the vehicle is interested in (such as brake system data or other vehicle-to-vehicle communication). In other examples, other elevations or areas of view may also provide modulated light data. In the scenario depicted by the stylized representation 300, lights from other motor vehicles in the field of view in front of the camera (e.g., lights 322A, 322B, 322C, 322D, 322E, 322F, 322G, 322H) convey modulated light data using the respective vehicles' rear-facing lights (tail lights), with the modulated light data indicating data such as motor vehicle speeds, system events, roadway conditions, and the like.
[0036] In an example, authentication of respective light communication sources is based upon angle of arrival. In this fashion, the camera may automatically authenticate with lights that are within +/- 5 degrees elevation, relative to the camera position. For example, in a field of view captured while driving a motor vehicle, this narrowed area eliminates many overhead street lights and reflections from the field of view. Thus, in the area of view 310, the overhead lights 312A, 312B, 312C, 312D, 312E are disregarded; likewise, in the area of view 330, the light reflections 332A, 332B, 332C, 332D, 332E are disregarded.
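Assuming a simple pinhole camera model, this elevation band can be applied by converting a source's pixel row into an approximate elevation angle and keeping only sources near the camera's horizon; the field-of-view value below is an assumed camera parameter, not something carried in the image data.

```python
import math


def elevation_angle_deg(row: int, image_height: int, vertical_fov_deg: float) -> float:
    """Approximate elevation angle of a pixel row relative to the camera's optical axis.

    Assumes a pinhole model with the optical axis at the image center; rows above
    the center map to positive elevation angles.
    """
    focal_px = (image_height / 2.0) / math.tan(math.radians(vertical_fov_deg / 2.0))
    return math.degrees(math.atan((image_height / 2.0 - row) / focal_px))


def within_authentication_band(row: int, image_height: int,
                               vertical_fov_deg: float = 40.0,
                               band_deg: float = 5.0) -> bool:
    """True when a source sits within +/- band_deg of the camera's horizon, which
    excludes overhead street lights above the band and roadway reflections below it."""
    return abs(elevation_angle_deg(row, image_height, vertical_fov_deg)) <= band_deg
```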
[0037] In still further examples, the field of view, the observed elevation angle, and the area used for automatic authentication may be modified based on the distance, clarity, and observation characteristics of respective light sources. For example, if a light source is obscured or not fully visible because the observer is too far away, too close, or past an observation angle for light emitting objects, the field of view may be modified to include or exclude additional areas of observation.
[0038] Although the preceding examples of FIGS. 1 to 3 were provided with reference to an infotainment or telematics system display in a motor vehicle, it will be understood that the techniques may be used for other variations of electronic image capture by personal electronic devices including mobile communication devices, wearables, and the like. For example, head-worn glasses that include a camera and projected display may operate to provide an augmented reality display using the techniques discussed above. Likewise, a smartphone including a camera and touchscreen display may provide an augmented reality or simulated reality display for browsing nearby information sources that are proximate to the user. Further, in addition to the commercial and advertising use cases suggested above, modulated light sources may be used to communicate information for games, entertainment, public safety, among many other use cases.
[0039] FIG. 4 illustrates a sequence diagram of example operations for selecting and interpreting optically communicated data among components of an optical camera communications system. As shown, the optical camera communications system includes a light display 402 (e.g., a LED light emitting device); a camera 404; a processing system 406 (e.g., an electronic processing system); a user interface device 408 (e.g., a display output with an in-car infotainment system or mobile computing device); and a third party data source 410 (e.g., a remote web service).
[0040] As shown, the sequence diagram includes the transmission of a data message in modulated light (operation 411), from the light display 402 to the camera 404. The camera 404 operates to receive, detect, and store the modulated light data (operation 412), such as through the buffering of image data. The camera 404 further operates to provide the image data of the captured scene (operation 413) to the processing system 406, and also to provide an indication of the modulated light (operation 414) to the processing system 406.
[0041] The processing system 406 operates to generate an output of the image data to include an indication of the light display 402 as an overlay of the image data (e.g., an augmented reality display) (operation 415). From this overlaid image data, a user interface of the image data is generated for output with the user interface device 408 (operation 416). This user interface includes an indication that identifies the location of respective data sources of modulated light to a human user, such as may be highlighted or outlined directly on the user interface screen. The user interface device 408 then receives a user input selection in the user interface to authenticate a light display located at the user input location (operation 417), which causes the processing system 406 to process data corresponding to the user input location (operation 418) (e.g., the modulated light obtained from the light display 402).
[0042] In some examples, the data indicated from the user input location (e.g., the modulated light obtained from the light display 402) includes an indication of supplemental data at another source, such as the third party data source 410. In response, the processing system 406 may transmit a request to obtain supplemental data from the third party data source 410 (operation 419), and receive the supplemental data from the third party data source 410 in response to this request (operation 420).
[0043] Based on the processed modulated light data obtained from the light display 402, and any supplemental data obtained from the third party data source 410, the processing system operates to generate an updated user interface of the image data for output on the user interface device 408 (operation 421). As discussed above, this may include an augmented reality display of the processed content as an overlay over the image data; other types of data outputs including simulated content, graphical content, multimedia and interactive content, may also be output via the user interface device 408.
[0044] FIG. 5 is a flowchart 500 illustrating an example method of obtaining and processing modulated light data in an optical camera communications system using a user authentication technique. The following operations of the flowchart 500 may be conducted by an electronic processing system (including a specialized computing system) adapted to process optical camera
communications. It will be understood that the operations of the flowchart 500 may also be performed by other devices, with the sequence and type of
operations of the flowchart 500 potentially modified based on the other examples of authentication provided above.
[0045] The operations of the flowchart 500 include the optional operation to activate the image sensor or other operational components of a camera
(operation 510); in other examples, the image sensor is already activated or activated by another system component. The camera system is operated to capture image data of a scene with the camera (operation 520), with this image data including the capture of modulated light data. Modulated light data is detected from the image data (operation 530), and locations (e.g., sources) of the modulated light data are identified in the image data (operation 540).
[0046] Respective indications of the locations of the modulated light data are generated (operation 550), and a display of the image data and the indication of the locations of the modulated light data is output (operation 560). The user authentication may be received in the user interface, through a user selection of the location of the modulated light data (operation 570). In response to the user authentication, the modulated light data that is communicated from the selected location may be processed (operation 580) (e.g., parsed and interpreted), such as through re-processing of the image data, or re-capturing modulated light data from the selected location. The processing of the modulated light data may result in the obtaining of additional content, information, or other data provided from the modulated light data at the selected location, and the display of the image data and the indication of the locations of the modulated light data may be updated to reflect this additional content, information, or data (operation 590).
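A deliberately simplified sketch of processing the modulated light from the selected location (operation 580) appears below, treating the signal as one on-off-keyed bit per frame behind an illustrative start frame delimiter; actual optical camera communication waveforms, including those under discussion for IEEE 802.15.7, are considerably more involved.

```python
import numpy as np

# Illustrative start frame delimiter; not the pattern defined by any standard.
SFD = [1, 0, 1, 0, 1, 1, 0, 0]


def demodulate_selected_region(frames: np.ndarray,
                               box: tuple[int, int, int, int],
                               threshold: float | None = None) -> list[int]:
    """Recover a bit sequence from the user-selected region of buffered frames.

    Simplification: one bit per frame, on-off keying, mean brightness above a
    threshold counts as 1. Everything up to and including the start frame
    delimiter is discarded; an empty list means no frame was found in the buffer.
    """
    x, y, w, h = box
    levels = frames[:, y:y + h, x:x + w].astype(np.float32).mean(axis=(1, 2))
    if threshold is None:
        threshold = float(levels.mean())           # split around the midpoint level
    bits = [int(level > threshold) for level in levels]
    for i in range(len(bits) - len(SFD) + 1):
        if bits[i:i + len(SFD)] == SFD:
            return bits[i + len(SFD):]
    return []
```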
[0047] FIG. 6 is a flowchart 600 illustrating an example method of obtaining and processing modulated light data in an optical camera communications system using an automatic authentication technique. Similar to FIG. 5, the operations of the flowchart 600 may be conducted by an electronic processing system (including a specialized computing system) adapted to process optical camera communications. Although the flowchart 600 depicts automated operations, it will be understood that the operations of the flowchart 600 may be modified based on additional user authentication and interaction operations discussed herein.
[0048] The operations of the flowchart 600 include the use of a camera system to capture image data of a scene with the camera (operation 610), with this image data including the capture of modulated light data. In an optional example, a narrowed area of evaluation is determined, based on the elevation angle of the imaged area (operation 620). This narrowed area of evaluation may be used, for example, to disregard areas in the image data that are unlikely to include (or cannot include) relevant light emitting sources.
[0049] Within the area of evaluation, modulated light data is detected in the image data (operation 630), and locations of the modulated light data in the image data are detected (operation 640). The processing system then operates to perform an automatic authentication of one or more locations of modulated light data (operation 650), such as may be based on an image recognition of a particular object, type of object, or the detection of a data signal (e.g., signature, command) communicated from a particular object. The modulated light data from the one or more authenticated locations is then processed (operation 660), and information obtained from the modulated light data of the one or more authenticated locations is communicated to another control subsystem (operation 670). This may include the communication of relevant data to a vehicle control subsystem, or the generation of information for output on a display system.
[0050] FIG. 7 illustrates a block diagram of components in an example system for processing and authenticating modulated light data using optical camera communications. As shown, the block diagram depicts an electronic processing system 710 (e.g., a computing system), an external data system 750, and a light source system 740. The electronic processing system 710 includes circuitry (described below) operably coupled to an optical image capture system 720 and an authentication data processing component 730.
[0051] The electronic processing system 710 is depicted as including:
circuitry to implement a user interface 712 (e.g., to output a display with a user interface hardware device); a communication bus 713 to communicate data among the optical image capture system 720 and other components of the electronic processing system 710; data storage 714 to store image data, authentication data, and control instructions for operation of the electronic processing system; a wireless transceiver 715 to wirelessly communicate with an external network or devices; and processing circuitry 716 (e.g., a CPU) and a memory 717 (e.g., volatile or non-volatile memory) used to host and process the image data, authentication data, and control instructions for operation of the electronic processing system. In an example, the authentication data processing component 730 may be provided from specialized hardware operating independent from the processing circuitry 716 and the memory 717; in other examples, the authentication data processing component 730 may be software-configured hardware that is implemented with use of the processing circuitry 716 and the memory 717 (e.g., by instructions executed by the processing circuitry 716 and the memory 717).
[0052] In the electronic processing system 710, the user interface 712 may be used to output a command and control interface for selection and receipt of user input for authentication, such as to authenticate a particular data source. The input of user authentication from the user interface 712 may be used to control operations and initiate actions with the authentication data processing component 730. The authentication data processing component 730 is depicted as including image data processing 732 to perform detection and analysis of image data; automated authentication processing 734 to perform an automatic recognition of modulated light data sources and content operations; user authentication processing 736 to generate the user-controlled interfaces and inputs to perform a manual authentication of image sources identified in images; and image recognition processing 738 to perform automatic
identification of particular objects, types of objects, light sources and light types, and the like. The authentication data processing component 730 and the electronic processing system may also include other components, not depicted, for implementation of other forms of authentication and user interaction operations, such as input control components (e.g., buttons, touchscreen input, external peripheral devices), and output components (e.g., a touchscreen display screen, video or audio output, etc.).
[0053] The optical image capture system 720 is depicted as including: an image sensor 722 to capture image data of a scene (including modulated light data emitted from respective objects in a scene); storage memory 724 to buffer and store the image data of the scene; processing circuitry 726 to perform image processing of image data for a scene and identify modulated light data in the scene; and communication circuitry 728 to communicate the image data to another location. In an example, the optical image capture system 720 is adapted to capture human-visible light; in some examples, the optical image capture system 720 is additionally adapted to capture aspects of infrared and near-infrared light.
[0054] The light source system 740 is depicted as including: a data storage 742 to store commands and content for communication via modulated light output; processing circuitry 744 to control the modulated light output; and a light emitter 746 (e.g., a LED or LED array) to generate the modulated light output.
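On the transmit side, the stored content can be expanded into a drive schedule for the light emitter; the sketch below uses the same illustrative on-off keying and start frame delimiter as the receive-side sketch above, which is a simplification rather than the modulation actually specified for optical camera communications, and the set_led/sleep driver calls in the usage comment are hypothetical.

```python
def payload_to_ook_schedule(payload: bytes,
                            bit_period_s: float = 0.001) -> list[tuple[int, float]]:
    """Expand a stored payload into an (led_state, duration) schedule.

    led_state is 1 (on) or 0 (off); each bit is held for bit_period_s. The frame
    is prefixed with the same illustrative start frame delimiter used on the
    receive side so a camera can locate the start of the data.
    """
    sfd = [1, 0, 1, 0, 1, 1, 0, 0]
    bits = list(sfd)
    for byte in payload:
        bits.extend((byte >> shift) & 1 for shift in range(7, -1, -1))  # MSB first
    return [(bit, bit_period_s) for bit in bits]


# The processing circuitry would walk this schedule and drive the LED emitter:
# for state, duration in payload_to_ook_schedule(b"MENU:https://example.com/menu"):
#     set_led(state); sleep(duration)   # set_led / sleep are hypothetical driver calls
```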
[0055] The external data system 750 is depicted as including: data storage 752 to host supplemental content for access by the electronic processing system 710; a processor 754 and memory 756 to execute software instructions to host and serve the supplemental content in response to a request from the electronic processing system 710; and communication circuitry 758 to transmit the supplemental data in response to the request from the electronic processing system 710.
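A minimal sketch of such an external data system, serving fixed JSON content over HTTP with the Python standard library; the hosted path, port, and menu contents are hypothetical placeholders.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative supplemental content; the path and menu values are placeholders.
SUPPLEMENTAL = {"/shops/ice-cream/menu.json": {"flavors": ["vanilla", "mint"], "cone_price": 3.5}}


class SupplementalDataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        item = SUPPLEMENTAL.get(self.path)
        if item is None:
            self.send_error(404, "no supplemental data at this identifier")
            return
        body = json.dumps(item).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Serve supplemental data requests from processing systems that decoded a URL
    # out of the modulated light data.
    HTTPServer(("0.0.0.0", 8080), SupplementalDataHandler).serve_forever()
```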
[0056] FIG. 8 is a block diagram illustrating a machine in the example form of an electronic processing system 800, within which a set or sequence of instructions may be executed to cause the machine to perform any one of the methodologies discussed herein, according to an example embodiment. The machine may be a vehicle information or entertainment system, a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile telephone or smartphone, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. Similarly, the term "processor-based system" shall be taken to include any set of one or more machines that are controlled by or operated by a processor (e.g., a computer) to individually or jointly execute instructions to perform any one or more of the methodologies discussed herein.
[0057] Example electronic processing system 800 includes at least one processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both, processor cores, compute nodes, etc.), a main memory 804 and a static memory 806, which communicate with each other via an interconnect 808 (e.g., a link, a bus, etc.). The electronic processing system 800 may further include a video display unit 810, an input device 812 (e.g., an alphanumeric keyboard), and a user interface (UI) control device 814 (e.g., a mouse, button controls, etc.). In one embodiment, the video display unit 810, input device 812 and UI navigation device 814 are incorporated into a touch screen display. The electronic processing system 800 may additionally include a storage device 816 (e.g., a drive unit), a signal generation device 818 (e.g., a speaker), an output controller 832 (e.g., for control of actuators, motors, and the like), a network interface device 820 (which may include or operably communicate with one or more antennas 830, transceivers, or other wireless communications hardware), and one or more sensors 826 (e.g., cameras), such as a global positioning system (GPS) sensor, compass, accelerometer, location sensor, or other sensor.
[0058] The storage device 816 includes a machine-readable medium 822 on which is stored one or more sets of data structures and instructions 824 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 824 may also reside, completely or at least partially, within the main memory 804, static memory 806, and/or within the processor 802 during execution thereof by the electronic processing system 800, with the main memory 804, static memory 806, and the processor 802 also constituting machine-readable media.
[0059] While the machine-readable medium 822 is illustrated in an example embodiment to be a single medium, the term "machine-readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 824. The term "machine-readable medium" shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term "machine-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include nonvolatile memory, including but not limited to, by way of example,
semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
[0060] The instructions 824 may further be transmitted or received over a communications network 828 using a transmission medium via the network interface device 820 utilizing any one of a number of transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone (POTS) networks, and wireless data networks (e.g., Wi-Fi, 2G/3G, and 4G LTE/LTE-A or WiMAX networks). The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
[0061] Embodiments used to facilitate and perform the techniques described herein may be implemented in one or a combination of hardware, firmware, and software. Embodiments may also be implemented as instructions stored on a machine-readable storage device, which may be read and executed by at least one processor to perform the operations described herein. A machine-readable storage device may include any non-transitory mechanism for storing information in a form readable by a machine (e.g., a computer). For example, a machine-readable storage device may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media. [0062] It should be understood that the functional units or capabilities described in this specification may have been referred to or labeled as components or modules, in order to more particularly emphasize their implementation independence. Such components may be embodied by any number of software or hardware forms. For example, a component or module may be implemented as a hardware circuit comprising custom very-large-scale integration (VLSI) circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. A component or module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like. Components or modules may also be implemented in software for execution by various types of processors. An identified component or module of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified component or module need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the component or module and achieve the stated purpose for the component or module.
[0063] Indeed, a component or module of executable code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices or processing systems. In particular, some aspects of the described process (such as code rewriting and code analysis) may take place on a different processing system (e.g., in a computer in a data center), than that in which the code is deployed (e.g., in a computer embedded in a sensor or robot). Similarly, operational data may be identified and illustrated herein within components or modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network. The components or modules may be passive or active, including agents operable to perform desired functions. [0064] Additional examples of the presently described method, system, and device embodiments include the following, non-limiting configurations. Each of the following non-limiting examples may stand on its own, or may be combined in any permutation or combination with any one or more of the other examples provided below or throughout the present disclosure.
[0065] Example 1 is a device for performing authentication of optical camera communications from a light emitting object, the device comprising: processing circuitry to: detect, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor; identify, from the image data, the light emitting object as a source of the modulated light data; receive an indication to select the light emitting object as an authenticated source of the modulated light data; and perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
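As an editorial illustration of the Example 1 flow (detect, identify, receive a selection indication, then process), the following Python sketch is not part of the disclosed device: the names DetectedSource, detect_modulated_sources, and process_modulated_data, and the temporal-variance detector and on/off slicer they use, are assumptions chosen only to make the sequence concrete.

```python
# Illustrative sketch only: the class/function names and the variance-based
# detector are assumptions for demonstration, not the claimed device logic.
from dataclasses import dataclass
import numpy as np

@dataclass
class DetectedSource:
    region: tuple          # (row_min, row_max, col_min, col_max) in the image
    authenticated: bool = False

def detect_modulated_sources(frames: np.ndarray, threshold: float = 0.05):
    """Flag image regions whose intensity varies over time (candidate OCC emitters).

    For brevity this groups all flickering pixels into a single candidate region.
    """
    variance = frames.var(axis=0)              # per-pixel temporal variance
    mask = variance > threshold
    if not mask.any():
        return []
    rows, cols = np.where(mask)
    return [DetectedSource(region=(rows.min(), rows.max(), cols.min(), cols.max()))]

def process_modulated_data(frames: np.ndarray, source: DetectedSource):
    """Decode only if the source was selected as authenticated (Example 1)."""
    if not source.authenticated:
        return None
    r0, r1, c0, c1 = source.region
    roi = frames[:, r0:r1 + 1, c0:c1 + 1].mean(axis=(1, 2))   # brightness per frame
    return (roi > roi.mean()).astype(int)       # naive on/off symbol slicing

# Usage: 30 synthetic frames with one region blinking every other frame
frames = np.zeros((30, 64, 64))
frames[::2, 10:14, 20:24] = 1.0
sources = detect_modulated_sources(frames)
sources[0].authenticated = True                 # the "indication to select" the source
print(process_modulated_data(frames, sources[0]))
```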
[0066] In Example 2, the subject matter of Example 1 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein operations to identify the source of the modulated light data are performed with operations to detect the authenticated source as a first source of a first set of available modulated light data and detect the another source as a second source of a second set of available modulated light data.
[0067] In Example 3, the subject matter of Example 2 optionally includes wherein operations that perform the command to process the modulated light data, include operations to decode the first set of available modulated light data, and to not decode the second set of available modulated light data.
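The selective decoding of Example 3 can be pictured with the short, hedged sketch below; decode_ook, the synthetic brightness traces, and the mean-threshold slicer are illustrative assumptions rather than the claimed operations.

```python
# Hedged sketch of Example 3: only the authenticated source's symbol stream is
# decoded; the other detected source is left undecoded.
import numpy as np

def decode_ook(samples: np.ndarray) -> str:
    """Turn a per-frame brightness trace into a bit string (simple OOK slicing)."""
    bits = (samples > samples.mean()).astype(int)
    return "".join(str(b) for b in bits)

detected = {
    "authenticated_source": np.tile([1.0, 0.0], 8),    # first set of modulated light data
    "other_source": np.tile([1.0, 1.0, 0.0, 0.0], 4),  # second set, not decoded
}
selected = "authenticated_source"

decoded = {
    name: decode_ook(trace) if name == selected else None   # skip unselected sources
    for name, trace in detected.items()
}
print(decoded)
```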
[0068] In Example 4, the subject matter of Example 3 optionally includes the processing circuitry further to enable user authentication of the authenticated source of the modulated light data, with operations to: generate a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and receive the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; wherein the operations to identify the light emitting object include a generation of the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
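A minimal sketch of the Example 4 interaction follows, under the assumption that the overlay is a set of labeled boxes drawn over the image output and that the selection indication arrives as a tap coordinate; SourceOverlay, build_overlay, and select_source_at are illustrative names only.

```python
# Illustrative user-authentication flow: draw one labeled box per detected
# source, then map a tap location back to the source being authenticated.
from dataclasses import dataclass

@dataclass
class SourceOverlay:
    source_id: str
    box: tuple            # (x, y, width, height) in display coordinates

def build_overlay(detected_sources: dict) -> list:
    """One labeled box per detected modulated-light source, drawn over the image."""
    return [SourceOverlay(source_id=sid, box=box) for sid, box in detected_sources.items()]

def select_source_at(overlay: list, tap_x: int, tap_y: int):
    """Return the source whose overlay box contains the user's tap, if any."""
    for item in overlay:
        x, y, w, h = item.box
        if x <= tap_x <= x + w and y <= tap_y <= y + h:
            return item.source_id
    return None

# Usage: two detected emitters; the user taps inside the first box
overlay = build_overlay({"billboard_led": (40, 30, 20, 20), "streetlight": (100, 10, 10, 10)})
print(select_source_at(overlay, tap_x=50, tap_y=40))   # -> "billboard_led"
```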
[0069] In Example 5, the subject matter of Example 4 optionally includes the processing circuitry further to output data selected with the user authentication of the authenticated source of the modulated light data, with operations to: decode and interpret content from the modulated light data obtained from the authenticated source; and update the graphical user interface display to output the decoded and interpreted content from the modulated light data.
[0070] In Example 6, the subject matter of any one or more of Examples 3-5 optionally include the processing circuitry further to enable automatic authentication of the authenticated source of the modulated light data, with operations to: perform image recognition of the image data; wherein the operations to identify the light emitting object include image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
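For the automatic path in Example 6, the sketch below stands in for a real image-recognition technique with a toy color-ratio classifier; TRUSTED_CLASSES, recognize_object, and auto_authenticate are assumptions, and a production system would presumably use a trained detector instead.

```python
# Toy stand-in for image recognition: classify the object behind the emitter
# and auto-authenticate only when the recognized class is on a whitelist.
import numpy as np

TRUSTED_CLASSES = {"traffic_signal", "road_sign"}

def recognize_object(patch: np.ndarray) -> str:
    """Placeholder classifier: mostly-red patches are treated as traffic signals."""
    red_ratio = patch[..., 0].mean() / (patch.mean() + 1e-9)
    return "traffic_signal" if red_ratio > 1.5 else "unknown"

def auto_authenticate(patch: np.ndarray) -> bool:
    """Select the source as authenticated when its object class is trusted."""
    return recognize_object(patch) in TRUSTED_CLASSES

# Usage: a synthetic, strongly red patch around the detected light source
patch = np.zeros((8, 8, 3))
patch[..., 0] = 0.9
print(auto_authenticate(patch))          # -> True
```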
[0071] In Example 7, the subject matter of any one or more of Examples 1-6 optionally include the processing circuitry further to obtain supplemental data indicated in the modulated light data, with operations to: decode and parse information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtain the supplemental data from the another data source, using the identifier of the supplemental data. [0072] In Example 8, the subject matter of Example 7 optionally includes wherein the identifier is a uniform resource locator (URL), and wherein operations to obtain the supplemental data from the another data source includes access of the URL using a wireless communication network.
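Examples 7 and 8 can be read as "decode, find a URL identifier, fetch over the network"; the following sketch assumes a simple space-delimited payload format, an illustrative example.com URL, and available connectivity, none of which are specified by the examples.

```python
# Hedged sketch: the payload layout and the URL are hypothetical; fetching the
# supplemental data obviously requires a working wireless/network connection.
from typing import Optional
from urllib.parse import urlparse
from urllib.request import urlopen

def extract_identifier(decoded_payload: str) -> Optional[str]:
    """Parse the decoded light data and return a URL identifier, if present."""
    for token in decoded_payload.split():
        if urlparse(token).scheme in ("http", "https"):
            return token
    return None

def fetch_supplemental(url: str, timeout: float = 2.0) -> bytes:
    """Retrieve supplemental data from the other data source named by the URL."""
    with urlopen(url, timeout=timeout) as response:
        return response.read()

payload = "OCC/1.0 id=42 https://example.com/offers/today.json"   # hypothetical payload
url = extract_identifier(payload)
print("identifier:", url)
if url is not None:
    try:
        supplemental = fetch_supplemental(url)     # requires connectivity
        print("fetched", len(supplemental), "bytes")
    except OSError:
        print("no network available in this sketch")
```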
[0073] In Example 9, the subject matter of any one or more of Examples 1-8 optionally include wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an automated reality display of information obtained from the modulated light data that overlays the image of the scene.
[0074] In Example 10, the subject matter of Example 9 optionally includes the processing circuitry further to identify a limited area of evaluation from the image data for automatically authenticating the authenticated source, with operations to: identify the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein operations to detect the modulated light data are performed on the limited area of evaluation, and wherein operations to identify the modulated light data are performed on the limited area of evaluation.
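One way to picture the limited area of evaluation in Example 10 is the pinhole-camera geometry below, which keeps only the image rows whose elevation angle exceeds a cutoff; the function name, the field-of-view and pitch values, and the pinhole assumption are all illustrative.

```python
# Illustrative geometry: restrict detection to image rows above an elevation
# cutoff (e.g., overhead signals), given the camera pitch and vertical FOV.
import math

def rows_above_elevation(image_height: int, vertical_fov_deg: float,
                         camera_pitch_deg: float, min_elevation_deg: float) -> range:
    """Return the range of row indices (0 = top) whose elevation >= the cutoff."""
    f = (image_height / 2) / math.tan(math.radians(vertical_fov_deg) / 2)  # focal length, pixels
    cy = image_height / 2                         # principal row
    last_row = 0
    for r in range(image_height):
        # Elevation of row r: camera pitch plus the angle off the optical axis.
        elev = camera_pitch_deg + math.degrees(math.atan((cy - r) / f))
        if elev >= min_elevation_deg:
            last_row = r + 1
    return range(0, last_row)

roi_rows = rows_above_elevation(image_height=480, vertical_fov_deg=60,
                                camera_pitch_deg=0, min_elevation_deg=10)
print(roi_rows)      # e.g., range(0, 167): only the upper part of the frame is evaluated
```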
[0075] Example 11 is at least one machine readable storage medium, comprising a plurality of instructions adapted for performing authentication of optical camera communications from a light emitting object, wherein the instructions, responsive to being executed with processor circuitry of a machine, cause the machine to perform operations that: detect, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor; identify, from the image data, the light emitting object as a source of the modulated light data; receive an indication to select the light emitting object as an authenticated source of the modulated light data; and perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data. [0076] In Example 12, the subject matter of Example 11 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein operations to identify the source of the modulated light data are performed with operations to detect the authenticated source as a first source of a first set of available modulated light data and detect the another source as a second source of a second set of available modulated light data.
[0077] In Example 13, the subject matter of Example 12 optionally includes wherein operations that perform the command to process the modulated light data, include operations to decode the first set of available modulated light data, and to not decode the second set of available modulated light data.
[0078] In Example 14, the subject matter of Example 13 optionally includes wherein the instructions further cause the machine to enable user authentication of the authenticated source of the modulated light data, with operations that: generate a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and receive the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; wherein the operations to identify the light emitting object include a generation of the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
[0079] In Example 15, the subject matter of Example 14 optionally includes wherein the instructions further cause the machine to output data selected with the user authentication of the authenticated source of the modulated light data, with operations that: decode and interpret content from the modulated light data obtained from the authenticated source; and update the graphical user interface display to output the decoded and interpreted content from the modulated light data. [0080] In Example 16, the subject matter of any one or more of Examples 13-15 optionally include wherein the instructions further cause the machine to enable automatic authentication of the authenticated source of the modulated light data, with operations that: perform image recognition of the image data; wherein the operations to identify the light emitting object include image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
[0081] In Example 17, the subject matter of any one or more of Examples 11-16 optionally include wherein the instructions further cause the machine to obtain supplemental data indicated in the modulated light data, with operations that: decode and parse information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtain the supplemental data from the another data source, using the identifier of the supplemental data.
[0082] In Example 18, the subject matter of Example 17 optionally includes wherein the identifier is a uniform resource locator (URL), and wherein operations to obtain the supplemental data from the another data source includes access of the URL using a wireless communication network.
[0083] In Example 19, the subject matter of any one or more of Examples 11-18 optionally include wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an automated reality display of information obtained from the modulated light data that overlays the image of the scene.
[0084] In Example 20, the subject matter of Example 19 optionally includes wherein the instructions further cause the machine to identify a limited area of evaluation from the image data for automatically authenticating the authenticated source, with operations that: identify the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein operations to detect the modulated light data are performed on the limited area of evaluation, and wherein operations to identify the modulated light data are performed on the limited area of evaluation.
[0085] Example 21 is a method of performing authentication of optical camera communications from a light emitting object, the method comprising electronic operations including: detecting, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor; identifying, from the image data, the light emitting object as a source of the modulated light data; receiving an indication to select the light emitting object as an authenticated source of the modulated light data; and performing a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
[0086] In Example 22, the subject matter of Example 21 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein identifying the source of the modulated light data is performed by detecting the authenticated source as a first source of a first set of available modulated light data and detecting the another source as a second source of a second set of available modulated light data.
[0087] In Example 23, the subject matter of Example 22 optionally includes wherein performing the command to process the modulated light data, includes decoding the first set of available modulated light data, and not decoding the second set of available modulated light data.
[0088] In Example 24, the subject matter of Example 23 optionally includes the electronic operations further including enabling user authentication of the authenticated source of the modulated light data, by: generating a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and receiving the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; wherein identifying the light emitting object includes generating the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
[0089] In Example 25, the subject matter of Example 24 optionally includes the electronic operations further including outputting data selected with the user authentication of the authenticated source of the modulated light data, by:
decoding and interpreting content from the modulated light data obtained from the authenticated source; and updating the graphical user interface display to output the decoded and interpreted content from the modulated light data.
[0090] In Example 26, the subject matter of any one or more of Examples 23-25 optionally include the electronic operations further including enabling automatic authentication of the authenticated source of the modulated light data, by: performing image recognition of the image data; wherein identifying the light emitting object includes image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
[0091] In Example 27, the subject matter of any one or more of Examples 21-26 optionally include the electronic operations further including obtaining supplemental data indicated in the modulated light data, by: decoding and parsing information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtaining the supplemental data from the another data source, using the identifier of the supplemental data.
[0092] In Example 28, the subject matter of Example 27 optionally includes wherein the identifier is a uniform resource locator (URL), and wherein obtaining the supplemental data from the another data source includes access of the URL using a wireless communication network.
[0093] In Example 29, the subject matter of any one or more of Examples 21-28 optionally include wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an automated reality display of information obtained from the modulated light data that overlays the image of the scene.
[0094] In Example 30, the subject matter of Example 29 optionally includes the electronic operations further including identifying a limited area of evaluation from the image data for automatically authenticating the authenticated source, by: identifying the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein detecting the modulated light data is performed on the limited area of evaluation, and wherein identifying the modulated light data is performed on the limited area of evaluation.
[0095] Example 31 is an apparatus comprising means for performing any of the methods of Examples 21-30.
[0096] Example 32 is at least one machine readable medium including instructions, which when executed by a computing system, cause the computing system to perform any of the methods of Examples 21-30.
[0097] Example 33 is a system for processing and authenticating modulated light data using optical camera communications, comprising: an optical image capture system; a processing system, comprising: processing circuitry; image data processing circuitry to evaluate image data, the image data including an indication of modulated light data from a light source, wherein the image data is captured with an image sensor; authentication data processing circuitry to: detect, from image data, modulated light data emitted from the light source; identify, from the image data, the light source as a source of the modulated light data; receive an indication to select the light source as an authenticated source of the modulated light data; and perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light source as the authenticated source of the modulated light data. [0098] In Example 34, the subject matter of Example 33 optionally includes a light source system, comprising: data storage to store data to be transmitted with a modulated light output; a light emitter to output the data with the modulated light output; and processing circuitry coupled to the data storage and the light emitter, the processing circuitry to control emission of the data with the modulated light output via the light emitter.
[0099] In Example 35, the subject matter of any one or more of Examples 33-34 optionally include an external data system, accessible via a network connection, the external data system comprising: data storage to store data; communication circuitry to receive a request for supplemental data; and a processor and memory to process the request to serve the supplemental data and transmit the supplemental data in response to the request; wherein the request for supplemental data is provided from the processing system, in response to reading the modulated light data from the light source, wherein the modulated light data indicates details of the request for supplemental data.
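To make the light source system of Examples 34-35 concrete, the sketch below Manchester-encodes a stored payload into an on/off schedule that emitter driver circuitry could replay; the encoding choice, the 1 kHz symbol rate, and the function names are assumptions, not part of the claimed system.

```python
# Illustrative transmitter-side sketch: turn stored payload bytes into a
# Manchester-coded on/off timeline for a light emitter to replay.
def manchester_encode(payload: bytes) -> list:
    """Each bit becomes two half-symbols: 1 -> (on, off), 0 -> (off, on)."""
    schedule = []
    for byte in payload:
        for i in range(7, -1, -1):
            bit = (byte >> i) & 1
            schedule.extend([True, False] if bit else [False, True])
    return schedule

def to_driver_timeline(schedule: list, symbol_rate_hz: float = 1000.0) -> list:
    """(led_on, duration_seconds) pairs for the emitter driver to replay."""
    half_symbol = 0.5 / symbol_rate_hz
    return [(state, half_symbol) for state in schedule]

payload = b"ID42"                       # data stored for transmission
timeline = to_driver_timeline(manchester_encode(payload))
print(len(timeline), timeline[:4])      # 64 half-symbols for the 4-byte payload
```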
[0100] Example 36 is an apparatus, comprising: means for capturing image data; means for detecting, from the image data, modulated light data emitted from a light emitting object; means for identifying, from the image data, the light emitting object as a source of the modulated light data; means for receiving an indication to select the light emitting object as an authenticated source of the modulated light data; and means for performing a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
[0101] In Example 37, the subject matter of Example 36 optionally includes wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, the apparatus further comprising: means for detecting the authenticated source as a first source of a first set of available modulated light data and detecting the another source as a second source of a second set of available modulated light data.
[0102] In Example 38, the subject matter of Example 37 optionally includes means for performing the command to process the modulated light data by decoding the first set of available modulated light data, and not decoding the second set of available modulated light data.
[0103] In Example 39, the subject matter of Example 38 optionally includes means for enabling user authentication of the authenticated source of the modulated light data, including: means for generating a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and means for receiving the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display; means for identifying the light emitting object by generating a graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
[0104] In Example 40, the subject matter of Example 39 optionally includes means for outputting data selected with the user authentication of the authenticated source of the modulated light data, including: means for decoding and interpreting content from the modulated light data obtained from the authenticated source; and means for updating the graphical user interface display to output the decoded and interpreted content from the modulated light data.
[0105] In Example 41, the subject matter of any one or more of Examples 38-40 optionally include means for enabling automatic authentication of the authenticated source of the modulated light data, including: means for performing image recognition of the image data; means for identifying the light emitting object by image recognition of the image data to indicate the authenticated source and the another source; and means for obtaining the indication to select the light emitting object as the authenticated source from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
[0106] In Example 42, the subject matter of any one or more of Examples 36-41 optionally include means for obtaining supplemental data indicated in the modulated light data, including: means for decoding and parsing information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and means for obtaining the supplemental data from the another data source, using the identifier of the supplemental data.
[0107] In Example 43, the subject matter of Example 42 optionally includes means for obtaining the supplemental data from the another data source by access of a uniform resource locator (URL) using a wireless communication network, wherein the identifier indicates the URL.
[0108] In Example 44, the subject matter of any one or more of Examples 36-43 optionally include means for obtaining the image data to capture an image of a scene in a direction away from the apparatus; and means for generating an automated reality display of information obtained from the modulated light data that overlays the image of the scene, using the modulated light data.
[0109] In Example 45, the subject matter of Example 44 optionally includes means for identifying a limited area of evaluation from the image data for automatically authenticating the authenticated source, including: means for identifying the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the apparatus, as captured from a position of the apparatus; means for detecting the modulated light data on the limited area of evaluation, and wherein identifying the modulated light data is performed on the limited area of evaluation.
[0110] In the above Detailed Description, various features may be grouped together to streamline the disclosure. However, the claims may not set forth every feature disclosed herein as embodiments may feature a subset of said features. Further, embodiments may include fewer features than those disclosed in a particular example. Thus, the following claims are hereby incorporated into the Detailed Description, with a claim standing on its own as a separate embodiment.

Claims

What is claimed is:
1. A device adapted for performing authentication of optical camera communications from a light emitting object, the device comprising:
processing circuitry to:
detect, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor;
identify, from the image data, the light emitting object as a source of the modulated light data;
receive an indication to select the light emitting object as an authenticated source of the modulated light data; and
perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
2. The device of claim 1, wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein operations to identify the source of the modulated light data are performed with operations to detect the authenticated source as a first source of a first set of available modulated light data and detect the another source as a second source of a second set of available modulated light data.
3. The device of claim 2, wherein operations that perform the command to process the modulated light data, include operations to decode the first set of available modulated light data, and to not decode the second set of available modulated light data.
4. The device of claim 3, the processing circuitry further to enable user authentication of the authenticated source of the modulated light data, with operations to:
generate a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and
receive the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display;
wherein the operations to identify the light emitting object include a generation of the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
5. The device of claim 4, the processing circuitry further to output data selected with the user authentication of the authenticated source of the modulated light data, with operations to:
decode and interpret content from the modulated light data obtained from the authenticated source; and
update the graphical user interface display to output the decoded and interpreted content from the modulated light data.
6. The device of claim 3, the processing circuitry further to enable automatic authentication of the authenticated source of the modulated light data, with operations to:
perform image recognition of the image data;
wherein the operations to identify the light emitting object include image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
7. The device of claim 1, the processing circuitry further to obtain supplemental data indicated in the modulated light data, with operations to: decode and parse information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtain the supplemental data from the another data source, using the identifier of the supplemental data.
8. The device of claim 7, wherein the identifier is a uniform resource locator (URL), and wherein operations to obtain the supplemental data from the another data source includes access of the URL using a wireless communication network.
9. The device of claim 1, wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an automated reality display of information obtained from the modulated light data that overlays the image of the scene.
10. The device of claim 9, the processing circuitry further to identify a limited area of evaluation from the image data for automatically authenticating the authenticated source, with operations to:
identify the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein operations to detect the modulated light data are performed on the limited area of evaluation, and wherein operations to identify the modulated light data are performed on the limited area of evaluation.
11. A method of performing authentication of optical camera communications from a light emitting object, the method comprising electronic operations including:
detecting, from image data, modulated light data emitted from the light emitting object, wherein the image data depicts the light emitting object, and wherein the image data is captured with an image sensor;
identifying, from the image data, the light emitting object as a source of the modulated light data;
receiving an indication to select the light emitting object as an authenticated source of the modulated light data; and
performing a command to process the modulated light data from the authenticated source, in response to the indication to select the light emitting object as the authenticated source of the modulated light data.
12. The method of claim 11, wherein the image data indicates multiple sources of available modulated light data, wherein the multiple sources include the authenticated source and another source, and wherein identifying the source of the modulated light data is performed by detecting the authenticated source as a first source of a first set of available modulated light data and detecting the another source as a second source of a second set of available modulated light data.
13. The method of claim 12, wherein performing the command to process the modulated light data, includes decoding the first set of available modulated light data, and not decoding the second set of available modulated light data.
14. The method of claim 13, the electronic operations further including enabling user authentication of the authenticated source of the modulated light data, by:
generating a graphical user interface display, the graphical user interface display including an overlay on output of the image data that provides an identification of the multiple sources of available modulated light data; and
receiving the indication to select the authenticated source of the modulated light data from user input received in the graphical user interface display, the user input received upon the overlay of the output of the image data in the graphical user interface display;
wherein identifying the light emitting object includes generating the graphical user interface display to indicate the authenticated source and the another source, the indication of the authenticated source and the another source provided as an overlay of an output of the image data in the graphical user interface display.
15. The method of claim 14, the electronic operations further including outputting data selected with the user authentication of the authenticated source of the modulated light data, by:
decoding and interpreting content from the modulated light data obtained from the authenticated source; and
updating the graphical user interface display to output the decoded and interpreted content from the modulated light data.
16. The method of claim 13, the electronic operations further including enabling automatic authentication of the authenticated source of the modulated light data, by:
performing image recognition of the image data;
wherein identifying the light emitting object includes image recognition of the image data to indicate the authenticated source and the another source; and wherein the indication to select the light emitting object as the authenticated source is provided from an image recognition technique, the image recognition technique automatically performed on an object representing the source of the modulated light data in the image data.
17. The method of claim 11, the electronic operations further including obtaining supplemental data indicated in the modulated light data, by:
decoding and parsing information obtained from the modulated light data from the authenticated source, wherein the information obtained from the modulated light data indicates an identifier of the supplemental data from another data source; and obtaining the supplemental data from the another data source, using the identifier of the supplemental data.
18. The method of claim 17, wherein the identifier is a uniform resource locator (URL), and wherein obtaining the supplemental data from the another data source includes access of the URL using a wireless communication network.
19. The method of claim 11, wherein the image data is obtained from a camera positioned in a motor vehicle to capture an image of a scene in a direction away from the motor vehicle, and wherein the modulated light data is used to generate an automated reality display of information obtained from the modulated light data that overlays the image of the scene.
20. The method of claim 19, the electronic operations further including identifying a limited area of evaluation from the image data for automatically authenticating the authenticated source, by:
identifying the limited area of evaluation of the image data based on an elevation angle of the scene in the direction away from the motor vehicle, as captured from a position of the camera; wherein detecting the modulated light data is performed on the limited area of evaluation, and wherein identifying the modulated light data is performed on the limited area of evaluation.
21. An apparatus comprising means for performing any of the methods of claims 11-20.
22. At least one machine readable medium including instructions, which when executed by a computing system, cause the computing system to perform any of the methods of claims 11-20.
23. A system for processing and authenticating modulated light data using optical camera communications, comprising:
an optical image capture system;
a processing system, comprising:
processing circuitry;
data processing circuitry to evaluate image data, the image data including an indication of modulated light data from a light source, wherein the image data is captured with an image sensor;
authentication data processing circuitry to:
detect, from image data, modulated light data emitted from the light source;
identify, from the image data, the light source as a source of the modulated light data;
receive an indication to select the light source as an authenticated source of the modulated light data; and
perform a command to process the modulated light data from the authenticated source, in response to the indication to select the light source as the authenticated source of the modulated light data.
24. The system of claim 23, further comprising: a light source system, comprising:
data storage to store data to be transmitted with a modulated light output;
a light emitter to output the data with the modulated light output; and
processing circuitry coupled to the data storage and the light emitter, the processing circuitry to control emission of the data with the modulated light output via the light emitter.
25. The system of claim 23, further comprising:
an external data system, accessible via a network connection, the external data system comprising:
data storage to store data;
communication circuitry to receive a request for supplemental data; and
a processor and memory to process the request to serve the supplemental data and transmit the supplemental data in response to the request;
wherein the request for supplemental data is provided from the processing system, in response to reading the modulated light data from the light source, wherein the modulated light data indicates details of the request for supplemental data.