WO2018091349A1 - Device for identifying a second device and associated method - Google Patents

Device for identifying a second device and associated method

Info

Publication number
WO2018091349A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
embedded code
light
light source
user information
Prior art date
Application number
PCT/EP2017/078754
Other languages
English (en)
Inventor
Bartel Marinus Van De Sluis
Dirk Valentinus René ENGELEN
Dzmitry Viktorovich Aliakseyeu
Jerome Eduard MAES
Original Assignee
Philips Lighting Holding B.V.
Priority date
Filing date
Publication date
Application filed by Philips Lighting Holding B.V.
Publication of WO2018091349A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114Indoor or close-range type systems
    • H04B10/116Visible light communication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145Illumination specially adapted for pattern recognition, e.g. using gratings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes

Definitions

  • the invention relates to a first device for identifying a second device.
  • the invention further relates to a system for identifying a second device.
  • the invention further relates to a method of identifying a device.
  • the invention further relates to a computer program product for performing the method.
  • Smart devices are omnipresent in our daily lives. Users of these smart devices tend to pay more attention to their smart devices than to other people, because the smart devices enable users to communicate with friends, family and strangers via multiple social connection services. Examples of such social connection services include social media networks, online dating services, online messaging services, etc. Typically, users of such services broadcast a user identifier via a radio frequency (RF) protocol.
  • A smart device may scan the RF network or scan its vicinity for RF signals in order to determine which other devices are in its vicinity, and the smart device may send a message via the RF protocol and request a connection with another device.
  • a problem with such an RF-based social connection service is that the user operating the first smart device may not know which (physical) device might receive the broadcasted user identifier.
  • Another problem that occurs with such an RF-based social connection service is that it can be difficult for the user operating the second smart device to associate the correct (physical) person with the user profile.
  • Document D1 WO 2016102336 A1 discloses a communication system comprising wearable devices arranged for wireless communication in an operating environment.
  • a first head wearable device has a camera for capturing an environment image and a first processor arranged for detecting a further head wearable device based on a viewing direction of a wearer of the first head wearable device.
  • the further head wearable device is identified at an operational distance based on a remotely detectable visual property and a communication session is established between the first head wearable device and the further head wearable device.
  • the further head wearable device may be provided with a light source that emanates coded light for identifying the head wearable device at a distance, or with some other label, graphical symbol or visual code. Once a visual marker is detected within an image coming from the outward camera of the wearable device, information about or extracted from the marker is used to retrieve information about the detected head wearable device from an earlier stored configuration information.
  • the object is achieved by a first device for identifying a second device, the first device comprising:
  • a camera for capturing a stream of images comprising light emitted by a light source of the second device, which light comprises an embedded code representative of user information of a user operating the second device
  • an image analysis module arranged for retrieving the embedded code from the plurality of images and for determining a first location of the light source in the stream of images relative to the field of view of the camera
  • a processor for decoding the embedded code, for retrieving the user information from the embedded code, for rendering a virtual indicator having a second location that corresponds to the first location on the display, wherein the virtual indicator is representative of the user information.
  • the user information (e.g. a user name or a user profile) is rendered at a location on the display that corresponds to a location of the second device in the field of view of the camera.
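  • As a minimal sketch (Python, with illustrative names; the patent does not prescribe any particular implementation), the following shows how a light-source position detected in the camera frame could be mapped to a display location so that a label carrying the decoded user information is drawn next to the second device:

```python
# Minimal sketch, not the patented implementation: map the light-source
# position found in a camera frame to display coordinates and build a
# labelled virtual indicator there. All names are illustrative.

from dataclasses import dataclass

@dataclass
class VirtualIndicator:
    text: str   # user information, e.g. a user name or status
    x: int      # display x coordinate (pixels)
    y: int      # display y coordinate (pixels)

def map_to_display(src_xy, cam_size, disp_size):
    """Scale a position in the camera image to display pixels, assuming the
    display shows the camera stream full-screen."""
    sx, sy = src_xy
    cw, ch = cam_size
    dw, dh = disp_size
    return int(sx * dw / cw), int(sy * dh / ch)

def make_indicator(user_info, light_xy, cam_size=(1920, 1080), disp_size=(960, 540)):
    x, y = map_to_display(light_xy, cam_size, disp_size)
    # Place the label slightly above the light source so it does not occlude it.
    return VirtualIndicator(text=user_info, x=x, y=max(0, y - 40))

if __name__ == "__main__":
    # Light source detected near the top-left corner of the camera frame.
    print(make_indicator("Alice (available)", light_xy=(200, 150)))
```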
  • the embedded code further comprises a connection request message
  • the first device further comprises a user input element for receiving user input indicative of an acceptance of a connection request.
  • the processor may further be arranged for retrieving the connection request message from the embedded code and for generating a connection response message if the user input indicative of the acceptance of the connection request has been received.
  • the first device may further comprise a communication unit for communicating the connection response message to the second device.
  • the first device is able to establish a connection with the second device based on a received coded light signal. This requires a user of the first device to point the camera of the first device towards a light source of the second device, which light source is arranged for emitting the light comprising the embedded code.
  • the camera of the first device and the light source of the second device need to share a line-of-sight.
  • the user can determine whether or not to accept the connection request with the second device (and therewith with the user operating the second device).
  • Upon receiving the user input indicative of the acceptance of the connection request, the first device transmits a connection response message to the second device in order to establish the connection.
  • the communication unit comprises a light source arranged for emitting light, and a driver for driving the light source such that the emitted light comprises a second embedded code comprising the connection response message.
  • the light source of the first device may be arranged for emitting light comprising the second embedded code.
  • the camera is arranged for receiving messages embedded in the light emitted by the second device after communicating the connection response message to the second device, and the light source is arranged for communicating messages embedded in the light to the second device after the connection response message has been communicated to the second device.
  • Such light-based communication is also referred to as visible light communication (VLC), free-space optical communication (FSO) or optical wireless communication (OWC).
  • high data transfer speeds can be achieved, which may be beneficial for purposes wherein, for example, audio or video messages are streamed between the first device and the second device.
  • the communication unit is arranged for receiving messages from and transmitting messages to the second device via a radio frequency protocol after communicating the connection response message to the second device.
  • The radio frequency protocol may, for example, be Bluetooth, Wi-Fi, ZigBee, etc.
  • the communication unit comprises a radio frequency transceiver, and the communication unit is arranged for communicating the connection response via a radio frequency protocol, and arranged for receiving messages from and transmitting messages to the second device via the radio frequency protocol after communicating the connection response message to the second device.
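  • As a rough sketch of the handshake described above (all class and function names are assumptions made for illustration), the fragment below models the accept/reject decision and the sending of a connection response over an abstract transport, which could be the coded-light channel or an RF link such as Bluetooth or Wi-Fi:

```python
# Hedged sketch of the connection handshake: decode a request, let the user
# decide, then send a connection response over whichever transport is used
# (coded light or RF). Names and message format are illustrative only.

from dataclasses import dataclass

@dataclass
class ConnectionRequest:
    user_info: str       # decoded from the embedded code
    device_id: str

@dataclass
class ConnectionResponse:
    accepted: bool
    responder_info: str  # optional user info of the first device's user

class Transport:
    """Stand-in for a coded-light or RF link."""
    def send(self, payload: bytes) -> None:
        print("TX:", payload)

def handle_request(request: ConnectionRequest, user_accepts: bool,
                   transport: Transport, my_info: str = "") -> bool:
    """Generate and send a connection response once the user has decided."""
    response = ConnectionResponse(accepted=user_accepts, responder_info=my_info)
    transport.send(f"{response.accepted}|{response.responder_info}".encode())
    return response.accepted  # True -> further message exchange may follow

if __name__ == "__main__":
    req = ConnectionRequest(user_info="Bob (available)", device_id="dev-212")
    handle_request(req, user_accepts=True, transport=Transport(), my_info="Alice")
```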
  • the user input element is arranged for receiving user input indicative of a rejection of the connection request, and the connection response message is indicative of the rejection if the user input indicative of the rejection of the connection request has been received.
  • the user operating the first device may inform the user operating the second device that he or she does not wish to establish a connection with him or her.
  • the processor is arranged for embedding user information about a user operating the first device in the connection response message. This embodiment is advantageous, because it enables a user operating the second device to see who has accepted the connection request.
  • the processor is arranged for accessing a key for decoding the embedded code.
  • the key may, for example, be provided by a software application running on the first device.
  • the second device may run the same software application. This allows the first device to decode the embedded code only when the same software applications are running on both the first and the second device.
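  • A toy illustration of such key-gated decoding (not a prescribed or secure scheme; the keystream construction below is purely an assumption for the example): the application on both devices provides the same key, and the payload carried in the embedded code is only readable when that key is used:

```python
# Illustrative sketch only: the application provides a shared key, so that
# only devices running the same application can read the embedded payload.
# The XOR keystream below is a toy stand-in for a real scheme.

import hashlib
from itertools import count

def _keystream(key: bytes):
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(4, "big")).digest()

def encode_payload(plaintext: bytes, app_key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(plaintext, _keystream(app_key)))

def decode_payload(ciphertext: bytes, app_key: bytes) -> bytes:
    return encode_payload(ciphertext, app_key)   # XOR is its own inverse

if __name__ == "__main__":
    key = b"shared-app-key"                      # provided by the application
    coded = encode_payload(b"user=Bob;status=available", key)
    print(decode_payload(coded, key))            # readable only with same key
    print(decode_payload(coded, b"other-key"))   # garbage without the key
```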
  • The first device may be smart glasses.
  • the display may be at least one of the glasses of the smart glasses.
  • the display may be a projector arranged for projecting the virtual indicator on the glasses or on the eye of the user of the first device.
  • a user wearing the smart glasses may see the second device (and also the user operating the second device) physically through the glasses, whereupon the user information of the user operating the second device may be rendered on the glasses, projected on the glasses or on the user's eye.
  • the user information is rendered as a virtual overlay on top of the physical world.
  • the processor is further arranged for rendering at least a part of one or more images of the stream of images on the display.
  • a user can see the stream of images together with the virtual indicator, which further enables the user to see which user corresponds to which second device.
  • The user information is representative of an identifier of the device. This enables the first device to identify the device, which may be beneficial if the first device is arranged for establishing connections with a plurality of devices.
  • the user information is representative of a status of a user operating the device. This is beneficial, because it provides information about the user of the device to a user operating a first device. This enables the user of the first device to determine whether to accept or reject the connection request based on the status of the user.
  • the user information is representative of a user profile. This is beneficial, because it provides information about the user of the device to a user operating a first device. This enables the user of the first device to determine whether to accept or reject the connection request based on the user profile.
  • the object is achieved by a system for identifying a second device, the system comprising:
  • the second device comprising a light source arranged for emitting light, and a driver for driving the light source such that the emitted light comprises an embedded code representative of user information of a user operating the second device.
  • the object is achieved by a method of identifying a device, the method comprising:
  • the object is achieved by a computer program product comprising computer program code to perform the method of claim 14 when the computer program product is run on a processing unit of the computing device.
  • FIG. 1 shows schematically an embodiment of a system according to the invention for identifying a device
  • Fig. 2 shows schematically an embodiment of a system according to the invention for identifying a device
  • Fig. 3 shows schematically an embodiment of a system according to the invention for establishing a connection between a first device and a second device, wherein user information of a plurality of second user devices is presented on a display;
  • Fig. 4 shows schematically an embodiment of a wearable device according to the invention for establishing a connection with a second device
  • Fig. 5a shows schematically a top view of a wearable device according to the invention for determining a location of a second device relative to a user
  • Fig. 5b shows schematically an enlarged view of the wearable device of Fig. 5a
  • Fig. 6 shows schematically steps of a method according to the invention of establishing a connection with a device.
  • Figs. 1 and 2 show schematically embodiments of systems 100, 200 according to the invention for identifying a second device 112, 212 by a first device 102, 202.
  • the first device 102, 202 comprises a camera 106, 206 (which in the example in Fig. 1 is located at the backside of the first device 102, 202) for capturing a stream of images comprising light emitted by a light source 114, 214 of the second device 112, 212, which light comprises an embedded code 118, 218 representative of user information of a user operating the second device 112, 212.
  • the first device 102, 202 further comprises a display 104.
  • The first device further comprises an image analysis module 108 for retrieving the embedded code from the plurality of images and for determining a first location of the light source in the stream of images relative to the field of view of the camera.
  • the first device 102, 202 further comprises a processor 204 for decoding the embedded code, for retrieving the user information from the embedded code, for rendering a virtual indicator 120 having a second location that corresponds to the first location on the display 104, wherein the virtual indicator 120 is representative of the user information.
  • The embedded code 118, 218 emitted by the second device 112, 212 may further comprise a connection request message.
  • the first device 102, 202 may further comprise a user input element for receiving user input indicative of an acceptance of a connection request.
  • the processor may be further arranged for retrieving the connection request message from the embedded code 118, 218 and for generating a connection response message 230 if the user input indicative of the acceptance of the connection request has been received.
  • the first device 102, 202 may further comprise a communication unit 208 for communicating the connection response message 230 to the second device 112, 212.
  • Upon receiving the connection response message 230 at the second device 112, 212, the connection may be established and the first device 102, 202 and the second device 112, 212 may 'freely' communicate. This may enable, for example, video conferencing, texting, sharing content such as audio, photos and videos, playing games, accessing social media profiles, etc. between the first device 102, 202 and the second device 112, 212.
  • the first device 102, 202 may be any type of smart device comprising a camera for detecting light comprising an embedded code 118, 218.
  • Examples of first devices 102, 202 include but are not limited to portable smart devices, such as smartphones, laptops and tablet PCs, and wearable smart devices such as smart watches and smart goggles.
  • the camera 106, 206 of the first device 102, 202 may be any type of camera arranged for detecting light and the code 118, 218 embedded in the light.
  • the camera may, for example, be a rolling shutter camera or a global shutter camera.
  • The embedded code 118, 218 emitted by the second device 112, 212 may be created by controlling a time-varying, modulated current to one or more light sources 114, 214 of the second device 112, 212 to produce variations in the light output that are detectable by the camera 106, 206 comprised in the first device 102, 202.
  • The second device 112, 212 comprises one or more light sources 114, 214 arranged for emitting light, and a driver (not shown) for driving the one or more light sources 114, 214 such that the emitted light comprises the embedded code 118, 218 representative of information about the user operating the second device.
  • the driver is arranged for driving the one or more light sources 114, 214, such as an LED light source, to embed the code 118, 218 in the light.
  • the embedded code may further comprise the connection request message.
  • The embedded code 118, 218 may be created by any known principle of embedding a code in light, for example by controlling a time-varying, modulated current to the one or more light sources to produce variations in the light output, by modulating the amplitude and/or the duty-cycle of the light pulses, etc.
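  • As one illustrative possibility, since the patent leaves the modulation scheme open, the sketch below generates Manchester-coded on-off-keying intensity levels that a driver could apply to the LED current; keeping the modulation depth small is a common way to make such a code camera-detectable without being noticeable to the eye:

```python
# A minimal sketch of one way a driver could embed a code in the light
# output: Manchester-coded on-off keying, where each bit becomes a low->high
# or high->low transition within a symbol period. This is only an
# illustration of modulating the drive current to create camera-detectable
# variations; the patent does not prescribe a particular scheme.

def bytes_to_bits(payload: bytes):
    for byte in payload:
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def manchester_levels(payload: bytes, high=1.0, low=0.8):
    """Return a list of intensity levels (one per half symbol period).

    Keeping 'low' close to 'high' keeps the modulation hard to notice for
    the human eye while still being detectable by a camera.
    """
    levels = []
    for bit in bytes_to_bits(payload):
        levels += [low, high] if bit else [high, low]
    return levels

if __name__ == "__main__":
    # The driver would apply these levels to the LED current at a fixed
    # rate, e.g. a few kHz for a rolling-shutter camera receiver.
    print(manchester_levels(b"ID42")[:16])
```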
  • the one or more light sources 114, 214 may, for example, be the flash light source of a camera comprised in a smart phone. Using the flash light source to emit light comprising the embedded code 118, 218 is beneficial because most smart devices are already equipped with a flash light source.
  • the one or more light sources 114, 214 may, for example, be indicator light sources. Such indicator light sources may be arranged for providing an indicator of the status of the second device 112, 212.
  • the one or more light sources 114, 214 may be arranged as a display arranged for displaying content to a user.
  • the light sources of the display may be controlled collectively or individually so as to emit light comprising the embedded code 118, 218.
  • The one or more light sources 114, 214 may be arranged as a display backlight.
  • the light sources of the backlight may be controlled collectively or individually so as to emit light comprising the embedded code 118, 218.
  • the light source may be comprised in a peripheral device (such as a bracelet, a watch, a necklace, a wearable lighting device, etc.), which may be connected to a smart device.
  • the smart device and the peripheral device together form the second device.
  • The (one or more) light source(s) 114, 214 of the second device 112, 212 may be arranged for emitting invisible light (such as infrared or ultraviolet).
  • the processor 204 of the first device 102, 202 will render the virtual indicator 120 of the second device 112, 212 on the display 104, which virtual indicator 120 has a location on the display that corresponds to the actual location of the second device 112, 212.
  • the (one or more) light source(s) 114, 214 of the second device 112, 212 may be arranged for emitting visible light.
  • the second device 112, 212 may comprise a first light source for emitting invisible light comprising the code and a second light source for emitting visible light.
  • An advantage of visible light is that a user operating the first device 102, 202 may be able to see that the user operating the second device 112, 212 is looking for a connection.
  • the visible light may have a specific color or dynamic pattern (e.g. a blinking sequence) indicative of a connection request, a user status, a device status, etc.
  • the (one or more) light source(s) 114, 214 of the second device 112, 212 may be further arranged for emitting visible light having a color.
  • the color of the light may be selected based on the status of the user. This is beneficial, because it enables a user operating the first device 102, 202 to see the status of the user operating the second device 112, 212, simply by looking at the (one or more) light source(s) 114, 214.
  • The color of the light may, for example, be red when the user is not available and green when the user is available. Additionally or alternatively, the color may be indicative of the mood of the user.
  • the (one or more) light source(s) of the second device 112, 212 may be controlled dynamically over time to indicate the status of the user.
  • the intensity of the light may for example change based on a physiological parameter (e.g. the heart beat) of the user.
  • the image analysis module of the first device 102, 202 may be comprised in for example a microcontroller, or the image analysis module may be a software program which may be executed by the processor 204.
  • The image analysis module is arranged for retrieving the embedded code from the plurality of images and for determining a first location of the light source 114, 214 in the stream of images relative to the field of view of the camera. Any known image analysis technique for retrieving a code from a stream of images may be used. Also, any known image analysis technique may be used for determining a location (e.g. X-Y coordinates in the image) of a light source.
  • The processor 204 may render the virtual indicator 120 on the display 104.
  • the processor 204 of the first device 102 is arranged for rendering the stream of images on the display 104 and thereby show the second device 112' on the display 104. Based on the location of the light source 114 in the stream of images (top-left corner), the processor may render the virtual indicator 120 at a location that corresponds to the location of the second device in the stream of images.
  • The location of the virtual indicator 120 may be in close proximity to the light source, for example next to, above or below the location of the light source.
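  • One simple way (illustrative, not prescribed by the patent) for the image analysis module to determine the first location is to look for the pixel whose brightness varies most across consecutive frames, since the coded-light source flickers while the rest of the scene stays largely static:

```python
# Hedged sketch of the localization step: find the (x, y) position of the
# modulated light source as the pixel with the highest temporal brightness
# variance over a short window of frames. Illustrative approach only.

import numpy as np

def locate_modulated_source(frames: np.ndarray):
    """frames: array of shape (num_frames, height, width), grayscale.

    Returns (x, y) of the pixel with the highest temporal brightness variance.
    """
    variance = frames.astype(np.float32).var(axis=0)   # per-pixel variance
    y, x = np.unravel_index(np.argmax(variance), variance.shape)
    return int(x), int(y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.normal(100, 2, size=(30, 120, 160))    # mostly static scene
    frames[:, 40, 60] += 50 * (np.arange(30) % 2)       # blinking source
    print(locate_modulated_source(frames))              # -> (60, 40)
```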
  • the processor 204 of the first device 102, 202 may be further arranged for retrieving the connection request message (and therewith the user information about the user operating the second device) from the embedded code 118, 218. Upon detecting the connection request message, the processor 204 may provide, via the display, an indication of the connection request. This indication may, for example, be a simple display of the virtual indicator 120. A user may select the connection request by providing the user input indicative of an acceptance of the connection request at the user input element.
  • the user input element may be any type of element arranged for receiving user input.
  • the user input element may for example comprise a touch-sensitive device such as a touchpad, a touchscreen, one or more buttons and/or one or more sliders for receiving touch input.
  • the user input element may comprise a microphone arranged for receiving voice commands from the user operating the first device, which voice commands may be indicative of the acceptance (or rejection) of the connection request. Examples of such voice commands may be spoken words like 'accept connection with user X' or 'reject connection with user X'.
  • The user input element may comprise a gesture/motion detection means, such as a gyroscope and/or an accelerometer arranged for detecting gestures made with the first device, which gestures may be indicative of the acceptance (or rejection) of the connection request. Examples of such gestures are shaking the first device 102, 202 or changing the orientation of the first device 102, 202.
  • the first device 102, 202 may, for example, be smart glasses, and the shaking of a person's head may be the gesture (e.g. vertical shaking for an acceptance and horizontal shaking for a rejection).
  • the user input element may comprise a camera (e.g. a front facing camera on a smartphone) arranged for detecting user gestures indicative of the acceptance (or rejection) of the connection request. Examples of such gestures are hand movements, eye-blinking, head shaking, etc.
  • the user input element may comprise a means (e.g. a camera) for gaze detection and an acceptance of the connection request may be based on a time period of gazing towards the location of the second device 112, 212.
  • the communication unit 208 of the first device 102, 202 is arranged for communicating the connection response message 230 to the second device 112, 212 in order to accept (or reject) a connection request.
  • the connection response message 230 may be representative of an acceptance (or a rejection) of the connection request.
  • the communication unit 208 may be arranged for communicating directly or indirectly with the second device 112, 212 (i.e. transmitting and receiving messages/data/packets to and from the second device 112, 212).
  • The connection request message 218 and/or the connection response message 230 may comprise a key for setting up the connection between the first device 102, 202 and the second device 112, 212.
  • the processor 204 of the first device may retrieve the key upon decoding the embedded code 118, 218, and use the key to gain access to a communication unit 216 of the second device 112, 212.
  • the second device 112, 212 may comprise a communication unit 216 for receiving the connection response message 230 from the first device 102, 202.
  • the communication unit 216 may be further arranged for communicating messages/data/packets 232 to the first device 102, 202, whereupon the communication unit 208 of the first device 102, 202 may receive the messages/data/packets 232.
  • The communication unit 216 of the second device 112, 212 may be similar to the communication unit 208 of the first device 102, 202.
  • the communication unit 208 of the first device 102, 202 may comprise a light source (not shown), such as an LED light source, and a driver for driving the light source.
  • the driver may drive the light source such that it emits visible light comprising a second embedded code.
  • the second embedded code may comprise the connection response message 230.
  • the second device 112, 212 is arranged for generating the connection request message.
  • the connection request message may comprise information indicative of a request for a connection.
  • the second device 112, 212 may, for example, be arranged for running a software application which enables the user to communicate with other users.
  • the user operating the second device 112, 212 may request a connection with a first device 102, 202 operated by another user, which first device 102, 202 may be running the same application.
  • the user operating the second device 112, 212 may position the second device 112, 212 such that its light source 114, 214 is in line-of-sight of the camera 106, 206 of the first device 102, 202, or the user of the first device 102, 202 may position the first device 102, 202 such that its camera 106, 206 is in line-of-sight of the light source 114, 214 of the second device 112, 212 enabling the first device 102, 202 to detect the connection request.
  • the embedded code 118, 218 is representative of information about a user operating the second device 112, 212.
  • the embedded code 118, 218 may comprise the user information (e.g. a user profile, user status, user name, user preferences, etc.).
  • the embedded code 118, 218 may comprise a link to the user information, which user information may be stored on a remote server which, for example, may be accessible via the internet.
  • the user information about the user operating the second device 112, 212 may be further representative of an identifier of the second device 112, 212, which enables the first device 102, 202 to identify the second device 112, 212 and to distinguish between multiple second devices 112, 212.
  • the identifier may, for example, further comprise information about the type of second device 112, 212 (smartphone, laptop, smart goggles, etc.) or about the status of the second device (on, off, stand-by, (un)locked, battery level, etc.).
  • The user information about the user operating the second device 112, 212 may be representative of a user profile. This enables the user operating the first device 102, 202 to identify the user operating the second device 112, 212.
  • the user profile may comprise characteristics about the user of the second device 112, 212, such as the user's name, age, gender, interests, image(s) of the user, etc.
  • the user operating the second device 112, 212 may position the second device 112, 212 such that its light source 114, 214 is in line-of-sight of the camera 106, 206 of the first device 102, 202 or the user of the first device 102, 202 may position the first device 102, 202 such that its camera 106, 206 is in line-of-sight of the light source 114, 214 of the second device 112, 212, enabling the first device 102, 202 to receive the user profile.
  • The user of the first device 102, 202 may then determine whether to accept the connection request based on the information provided by the user profile. Additionally or alternatively, the user profile may comprise a promotional message, such as an advertisement.
  • the second device 112, 212 may, for example, be arranged for running a software application related to a service (e.g. a ticket swap service, or a taxi service) related to the promotional message.
  • the user operating the second device 112, 212 may request a connection with a first device 102, 202 operated by another user, which first device 102, 202 is running the same software application.
  • the user operating the second device 112, 212 may position the second device 112, 212 such that its light source 114, 214 is in line-of-sight of the camera 106, 206 of the first device 102, 202 or the user of the first device 102, 202 may position the first device 102, 202 such that its camera 106, 206 is in line-of-sight of the light source 114, 214 of the second device 112, 212, enabling the first device 102, 202 to receive the promotional message.
  • A user of the first device 102, 202 may then accept the connection request if he or she is interested in the promotional offer. After the acceptance, the first device 102, 202 and the second device 112, 212 establish a connection, which enables the users to communicate with each other.
  • the user information about the user operating the second device 112, 212 may be representative of a status of a user operating the second device 112, 212.
  • the status may, for example, be indicative of an availability status of the user operating the second device 112, 212 (e.g. available, busy, do not disturb, etc.). This is beneficial because it enables a user operating the first device 102, 202 to see the availability status of the user operating the second device 112, 212 when the connection request message 218 has been received, and to determine whether or not to accept the connection request based on the availability status.
  • the status may be representative of a state of a software application running on the second device 112, 212 (the status may be representative of a game level of a game software application, a (type of) song played by a music application, a video played by a video application, etc.). This is beneficial because it enables a user operating the first device 102, 202 to see what the user operating the second device 112, 212 is doing on the second device 112, 212 when the connection request message has been received, and to determine whether or not to accept the connection request based on what he or she is doing. Additionally or alternatively, the status may be representative of the mood of the user operating the second device 112, 212 (e.g. happy, sad, excited, tired, etc.).
  • the status may be representative of a physiological parameter of the user operating the second device 112, 212 (e.g. a heart rate, breathing rate, skin conductivity, etc.).
  • This is beneficial because it enables a user operating the first device 102, 202 to see the physical condition of the user operating the second device 112, 212 when the connection request message has been received, and to determine whether or not to accept the connection request based on the physical condition of the user operating the second device 112, 212.
  • The user information may be information related to a current activity of a user, for example typing a message, taking a picture, walking, running, etc.
  • the user of the second device 112, 212 may be an actor on a theatre stage wearing a light emitting device for emitting light comprising the embedded code representative of the user information.
  • the user information may, for example, be related to a script of the actor, or the speech of the actor may be analyzed and the user information may relate to the words spoken by the actor.
  • The first device may receive (a link to) the spoken/scripted words, whereupon the processor of the first device 102, 202 may render the spoken words/scripted words on a display of the first device 102, 202, enabling a user of the first device to read along with the actor.
  • the connection request message may be transmitted by the second device 112, 212 continuously. This enables a user operating the first device 102, 202 to always receive the connection request message from the second device 112, 212.
  • the connection request message may be transmitted by the second device 112, 212 every predetermined time period. This may be beneficial because it may reduce the energy consumption required for driving the light source 114, 214.
  • the connection request message may be transmitted by the second device 112, 212 upon receiving a user input (e.g. a user input dedicated to triggering the transmission of the connection request message, a user input unlocking the second device 112, 212, etc.). This is beneficial because it enables a user operating the second device 112, 212 to determine when the connection request will be transmitted.
  • The connection request message may be transmitted by the second device 112, 212 only when a (specific) software application is running on the second device 112, 212, or only when an action occurs in the software application (e.g. when a new song is played by a music application, when an availability status of the user in a social media application is changed, when an image is captured by an image capture application, etc.).
  • The processor 204 may be further arranged for determining, based on the user information, whether or not to render the virtual indicator 120 representative of the user information. If, for example, a user profile of a user operating a second device 112, 212 does not match the preferences of a user operating the first device 102, 202, the processor 204 may not render the virtual indicator 120 received from that second device 112, 212.
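  • A small sketch of this optional filtering step, with assumed field names, deciding from the decoded user information and the local user's preferences whether the virtual indicator should be rendered at all:

```python
# Illustrative sketch (field names are assumptions): render the indicator
# only when the decoded user information satisfies the local preferences.

def should_render(user_info: dict, preferences: dict) -> bool:
    """Render only if every stated preference is satisfied by the user info."""
    for key, wanted in preferences.items():
        if user_info.get(key) != wanted:
            return False
    return True

if __name__ == "__main__":
    decoded = {"status": "available", "interest": "chess"}
    print(should_render(decoded, {"status": "available"}))   # True -> render
    print(should_render(decoded, {"interest": "running"}))   # False -> skip
```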
  • Fig. 3 shows schematically an embodiment of a system 300 according to the invention for establishing a connection between a first device 302 and one or more second devices 312, 322, wherein user information of a plurality of second devices 312, 322 is presented on a display 304 of the first device 302.
  • a first light source 314 of a primary second device 312 may emit light comprising a first code and a second light source 324 of a secondary second device 322 may emit light comprising a second code.
  • a camera (not shown) of the first device 302 may capture a stream of images of a first user 310, the primary second device 312, a second user 320 and the secondary second device 322.
  • The processor (not shown) of the first device 302 may be arranged for rendering at least a part of one or more images of the plurality of images on the display 304, such that the first 310' and second 320' users and their respective second devices are visible for a user operating the first device 302.
  • An image analysis module may determine the locations of the light sources 314' and 324' in the stream of images and retrieve the embedded codes from the light emitted by the light sources 314, 324.
  • the processor may be further arranged for decoding the embedded code, for retrieving the user information (and, optionally, the connection request message) from the embedded code and for rendering virtual indicators 316 and 326 having locations on the display 304 that correspond to the locations of the light sources 314' and 324'.
  • the processor may further render indicators of acceptance/rejection of a connection request.
  • a user may, for example, select the checkmark or the cross via the touch-sensitive display 304 to respectively accept or reject the connection request.
  • Fig. 4 shows schematically an embodiment of one glass of a pair of smart glasses (i.e. the first device) 402 for establishing a connection with a second (wearable) device 412.
  • the smart glasses 402 comprise a camera 404 for capturing a stream of images comprising light emitted by a light source 414 of the second wearable device 412, which light comprises an embedded code 418 representative of user information of the user 410 operating/wearing the second device 412.
  • the embedded code may further comprise a connection request message.
  • An image analysis module may retrieve the embedded code from the plurality of images and determine a first location of the light source 414 in the stream of images relative to the field of view of the camera 404.
  • a processor may decode the embedded code and retrieve the user information from the embedded code (and, optionally, the connection request message), whereupon the processor may render a virtual indicator having a second location on a display that corresponds to the first location, wherein the virtual indicator is representative of the user information.
  • the user wearing the smart glasses 402 may see the second wearable device 412 (and also the user 410 operating the second wearable device 412) physically through the glasses and the user information 406 of the user 410 operating the second wearable device 412 may be rendered on the glasses, may be projected on the glasses or projected directly onto the eye/retina of the wearer of the glasses 402. As a result, the user information 406 is rendered as a virtual overlay on top of the physical world.
  • the user wearing the glasses 402 may, for example, accept or reject the connection request by nodding/shaking his or her head, blinking according to a predefined blinking pattern, providing a tap input at the glasses 402, providing auditory input, etc., whereupon the processor may generate a connection response message and transmit this to the second wearable device 412 in order to establish a connection between the first 402 and the second 412 device.
  • the second device 112, 212 may be any user device comprising a light source for emitting light comprising an embedded code representative of user information.
  • Examples of second devices 112, 212 include but are not limited to portable smart devices, such as smartphones, laptops and tablet PCs, wearable smart devices such as smart watches and smart goggles, light emitting clothing or jewelry, etc.
  • the user operating the second device 112, 212 may be an animal.
  • the user information may be representative of information about the animal.
  • the animal may be equipped with a lighting device (i.e. the second device 112, 212) comprising the light source 114, 214 arranged for emitting light comprising the embedded code representative of the information about the animal.
  • A cat can, for example, be equipped with a band with a lighting device, enabling other people to read out the home address of the cat.
  • An advantage over other technologies such as NFC or RFID is that this can be read out from a distance using a smartphone. In this way, users do not need to catch or approach the cat in order to read out the information.
  • Fig. 5a shows schematically a top view of a wearable device 500 for determining a location of a light source 520 relative to a user 510.
  • Fig. 5b shows schematically an enlarged view of the wearable device of Fig. 5a.
  • the wearable device 500, 500' comprises a camera 504, 504' arranged for capturing a stream of images of the light source 520 (comprised in a second device).
  • An image analysis module may be arranged for determining a first location of the light source 520 in the stream of images relative to the field of view 508 of the camera 504. Based on the location of the light source 520 in the images, a first angle indicating the direction of the light source relative to the camera 504 may be determined.
  • To position a virtual indicator correctly for the user 510, a field of view 518 of the user relative to the field of view 508 of the camera needs to be determined.
  • This relative field of view can be determined by calculating the angles α and β.
  • Angle α can be calculated as the inverse tangent of distance 'b' (between the centerline of the eye and an edge of the glass 502) divided by distance 'a' (between the glass 502 and the user's pupil).
  • Angle β can be calculated in a similar fashion.
  • From these angles, the field of view 518 of the user relative to the field of view 508 of the camera 504 can be calculated.
  • This enables the processor to calculate the second location 520' by compensating for the difference between the positions and angles of the fields of view 508 and 518, and to position a virtual indicator at or nearby the second location 520'.
  • Figs. 5a and 5b illustrate the calculation of the second location 520' based on the first location in a horizontal plane, but it should be understood that the second location 520' can be calculated in a similar fashion in a vertical plane.
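  • The sketch below (with assumed camera-offset parameters) illustrates the compensation described for Figs. 5a and 5b: the user's half field of view follows from the inverse tangent of distance 'b' over distance 'a', and the angle at which the camera sees the light source is re-expressed in the user's field of view to place the indicator on the glass; the vertical direction would be handled analogously:

```python
# Hedged sketch of the geometric compensation in Figs. 5a/5b. Distances
# 'a' (glass to pupil) and 'b' (eye centerline to glass edge) give the user's
# half field-of-view angle via an inverse tangent; the light-source angle
# measured in the camera's field of view is then re-expressed in the user's
# field of view. The camera offset value is an illustrative assumption.

import math

def user_half_fov(a_mm: float, b_mm: float) -> float:
    """Angle (radians) from the eye's centerline to the edge of the glass."""
    return math.atan(b_mm / a_mm)

def indicator_position_on_glass(source_angle_cam: float,
                                cam_yaw_offset: float,
                                a_mm: float, b_mm: float) -> float:
    """Return a horizontal position in [-1, 1] across the glass.

    source_angle_cam: angle of the light source relative to the camera axis
    cam_yaw_offset:   angle between the camera axis and the eye's centerline
    """
    angle_in_user_fov = source_angle_cam + cam_yaw_offset   # compensate offset
    half_fov = user_half_fov(a_mm, b_mm)
    return max(-1.0, min(1.0, angle_in_user_fov / half_fov))

if __name__ == "__main__":
    pos = indicator_position_on_glass(source_angle_cam=math.radians(10),
                                      cam_yaw_offset=math.radians(-4),
                                      a_mm=20.0, b_mm=25.0)
    print(round(pos, 2))   # fraction of the way toward the glass edge
```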
  • Fig. 6 shows schematically steps of a method 600 according to the invention of identifying a device.
  • the method comprises the steps of:
  • capturing 602 a stream of images comprising light emitted by a light source of the device, which light comprises an embedded code representative of user information of a user operating the device,
  • the method 600 may be executed by computer program code of a computer program product when the computer program product is run on a processing unit of a computing device, such as the processor 204 of the first device 202.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb “comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • aspects of the invention may be implemented in a computer program product, which may be a collection of computer program instructions stored on a computer readable storage device which may be executed by a computer.
  • the instructions of the present invention may be in any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs) or Java classes.
  • the instructions can be provided as complete executable programs, partial executable programs, as modifications to existing programs (e.g. updates) or extensions for existing programs (e.g. plugins).
  • parts of the processing of the present invention may be distributed over multiple computers or processors.
  • Storage media suitable for storing computer program instructions include all forms of nonvolatile memory, including but not limited to EPROM, EEPROM and flash memory devices, magnetic disks such as the internal and external hard disk drives, removable disks and CD-ROM disks.
  • the computer program product may be distributed on such a storage medium, or may be offered for download through HTTP, FTP, email or through a server connected to a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A first device 102, 202 for identifying a second device 112, 212. The first device 102, 202 comprises a camera 106, 206 for capturing a stream of images comprising light emitted by a light source 114, 214 of the second device 112, 212, which light comprises an embedded code 118, 218 representative of user information of a user operating the second device 112, 212; a display 104; an image analysis module arranged for retrieving the embedded code 118, 218 from the plurality of images and for determining a first location of the light source 114, 214 in the stream of images relative to the field of view of the camera 106, 206; and a processor for decoding the embedded code 118, 218, for retrieving the user information from the embedded code 118, 218, and for rendering a virtual indicator 120 having a second location that corresponds to the first location on the display 104, wherein the virtual indicator 120 is representative of the user information.
PCT/EP2017/078754 2016-11-17 2017-11-09 Dispositif d'identification d'un second dispositif et procédé associé WO2018091349A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP16199233 2016-11-17
EP16199233.4 2016-11-17

Publications (1)

Publication Number Publication Date
WO2018091349A1 true WO2018091349A1 (fr) 2018-05-24

Family

ID=57396270

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/078754 WO2018091349A1 (fr) 2016-11-17 2017-11-09 Dispositif d'identification d'un second dispositif et procédé associé

Country Status (1)

Country Link
WO (1) WO2018091349A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140147122A1 (en) * 2012-11-29 2014-05-29 Symbol Technologies, Inc. System and method for automatically activating a function of a mobile device with another device
WO2016102336A1 (fr) 2014-12-22 2016-06-30 Koninklijke Philips N.V. Système de communication comprenant des dispositifs pouvant être portés sur la tête

Similar Documents

Publication Publication Date Title
US11887016B2 (en) Actionable suggestions for activities
US9418481B2 (en) Visual overlay for augmenting reality
US9952881B2 (en) Virtual assistant system to enable actionable messaging
US9936127B2 (en) Systems and methods for providing attention directing functions in an image capturing device
US20150177939A1 (en) User interface based on wearable device interaction
US20170206721A1 (en) Light-Based Security Systems
US20170359456A1 (en) Communication system comprising head wearable devices
US10243597B2 (en) Methods and apparatus for communicating with a receiving unit
TW201633226A (zh) 社交提醒
KR20160064337A (ko) 컨텐트 제공 방법 및 장치
US9270627B1 (en) Light-based messaging systems
KR20160079664A (ko) 웨어러블 기기를 제어하는 디바이스 및 그 제어 방법
JP6728863B2 (ja) 情報処理システム
US10621624B2 (en) Live auction advertisements for smart signs
KR102452314B1 (ko) 컨텐츠 재생 방법 및 이를 지원하는 전자 장치
US11463533B1 (en) Action-based content filtering
JP2017174339A (ja) 情報提示装置及び情報処理システム
US10395453B1 (en) Light-based vehicle messaging systems
US20210152910A1 (en) OVER-THE-EAR HEADPHONE DEVICE WITH A CIRCLE-OF-SIGHT (CoS) SIGNALING ELEMENT
WO2018091349A1 (fr) Dispositif d'identification d'un second dispositif et procédé associé
US20210105353A1 (en) METHOD AND SYSTEM FOR AN EAR BUD WITH A CIRCLE-OF-SIGHT (CoS) SIGNALING ELEMENT
US20200059603A1 (en) A method of providing information about an object
US20230388576A1 (en) Engagement and synchronization using received audio or visual cues
KR20230040676A (ko) 전자장치 및 키트
KR20230040678A (ko) 비밀신호를 이용한 통신 방법 및 그 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17793980

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17793980

Country of ref document: EP

Kind code of ref document: A1