EP3581960A1 - Dual-mode optical devices for time-of-flight sensing and information transfer, and apparatus, systems and methods using the same - Google Patents

Dual-mode optical devices for time-of-flight sensing and information transfer, and apparatus, systems and methods using the same

Info

Publication number
EP3581960A1
EP3581960A1 (application EP19179806.5A)
Authority
EP
European Patent Office
Prior art keywords
optical
optical signal
signal
information
wavelength
Prior art date
Legal status
Pending
Application number
EP19179806.5A
Other languages
German (de)
English (en)
Inventor
Hannes Plank
Norbert Druml
Current Assignee
Infineon Technologies AG
Original Assignee
Infineon Technologies AG
Application filed by Infineon Technologies AG filed Critical Infineon Technologies AG
Publication of EP3581960A1

Classifications

    • H04B 10/07953: Performance monitoring; monitoring or measuring OSNR, BER or Q
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • H04W 4/023: Services making use of mutual or relative location information between multiple location-based services [LBS] targets or of distance thresholds
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/08: Systems determining position data of a target, for measuring distance only
    • G01S 7/4865: Time delay measurement, e.g. time-of-flight measurement, time-of-arrival measurement or determining the exact position of a peak
    • H04B 10/071: Monitoring or testing transmission systems using a reflected signal, e.g. using optical time-domain reflectometers [OTDR]
    • H04B 10/1143: Indoor or close-range free-space systems; bidirectional transmission
    • H04B 10/116: Visible light communication
    • H04B 10/5561: Digital phase modulation
    • H04J 14/08: Time-division multiplex systems
    • H04W 64/006: Locating users or terminals for network management purposes, with additional information processing, e.g. for direction or speed determination
    • G06V 20/20: Scene-specific elements in augmented reality scenes
    • H04B 10/40: Transceivers
    • H04B 10/43: Transceivers using a single component as both light source and receiver, e.g. using a photoemitter as a photoreceiver
    • H04B 10/50: Transmitters
    • H04B 10/60: Receivers

Definitions

  • the present disclosure relates generally to methods, systems, and devices usable for localizing (e.g., determining a position of), and transferring information from, other devices using a plurality of optical signals (e.g., infrared signals) having substantially similar wavelengths.
  • certain exemplary embodiments of the present disclosure include optical devices that are capable of reflecting first optical signals emitted by compatible optical sensing systems (e.g., ToF measurement systems), and transmitting second optical signals bearing information related to the optical device or an associated object (e.g., a device or object identity, a user interface (UI) to the object, etc.).
  • the first and second optical signals can have a substantially similar wavelength (e.g., infrared), and the optical device can reflect and transmit the optical signals in a time-multiplexed (e.g., time-synchronized) manner, such that the second (information-bearing) optical signals do not interfere with the first optical signal used for ToF measurements.
  • these arrangements facilitate improved interaction with persons, devices, objects, etc. in a user's environment. These improvements can be particularly advantageous in AR and VR applications, as explained further below.
  • Exemplary embodiments of the present disclosure include methods and/or procedures for receiving, in a time-multiplexed manner, a plurality of optical signals having substantially similar wavelengths.
  • the exemplary methods and/or procedures can include emitting a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations.
  • the exemplary methods and/or procedures can also include receiving, during one or more of the active durations, a reflected version of the first optical signal and performing a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal.
  • the exemplary methods and/or procedures can also include receiving, only during one or more of the inactive durations of the first optical signal, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal.
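The active/inactive scheme above can be sketched as a simple slot schedule. This is an illustrative sketch, not from the patent text: the alternating even/odd frame layout and slot granularity are assumptions, chosen only to show that the two signals occupy disjoint time intervals.

```python
# Sketch of the time-multiplexing idea: the first (ToF) signal occupies
# "active" slots; the second (information-bearing) signal is confined to
# "inactive" slots. The even/odd layout is an illustrative assumption.

def build_schedule(num_slots):
    """Even slots: first (ToF) signal is active; odd slots: inactive,
    so the second (data) signal may be transmitted."""
    tof_slots = [t for t in range(num_slots) if t % 2 == 0]
    data_slots = [t for t in range(num_slots) if t % 2 == 1]
    return tof_slots, data_slots

tof, data = build_schedule(8)
assert not set(tof) & set(data)  # the two signals never overlap in time
```

Any frame layout works, provided the data slots fall strictly within the ToF signal's inactive durations.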
  • exemplary methods and/or procedures can be performed by a time-of-flight (ToF) measurement device.
  • the reflected version of the first optical signal and the second optical signal can be received from a single optical device.
  • the first and second wavelengths can be associated with infrared light.
  • the exemplary methods and/or procedures can also include determining a location of the optical device based on the ToF measurement, and retrieving, from the received second optical signal, information encoded in said received second optical signal.
  • the second optical signal can include information encoded based on phase-shift keying (PSK) modulation, and retrieving the information can include demodulating the PSK-encoded information.
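The PSK encode/decode step can be sketched as follows. The patent text says only "PSK modulation"; binary PSK (two phases, 0 and π) and the symbol representation below are assumptions made for illustration.

```python
import math

# Illustrative BPSK sketch: bits map to carrier phases 0 and pi, and the
# receiver recovers a bit from the sign of the cosine of the received phase.
def psk_modulate(bits):
    return [0.0 if b == 0 else math.pi for b in bits]

def psk_demodulate(phases):
    return [0 if math.cos(p) > 0 else 1 for p in phases]

bits = [1, 0, 1, 1, 0]
assert psk_demodulate(psk_modulate(bits)) == bits  # round-trip recovery
```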
  • the exemplary methods and/or procedures can also include determining a relative signal strength of at least one of: the received reflections of the first optical signal and the received second optical signal; and determining the location of the optical device further based on the determined relative signal strength and a correlation function between the first optical signal and the reflected version of the first optical signal.
  • the exemplary methods and/or procedures can also include transmitting information identifying one or more of: at least one of an active duration and an inactive duration of the first optical signal; and an identifier of the optical device.
  • the optical sensing system can include an optical transmitter configured to emit a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations.
  • the optical sensing system can also include an optical sensor configured to: receive, during one or more of the active durations, a reflected version of the first optical signal; and receive, only during one or more of the inactive durations of the first optical signal, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal.
  • the optical sensing system can also include one or more processing circuits configured to perform a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal.
  • the optical sensor and the one or more processing circuits comprise a time-of-flight (ToF) measurement device.
  • the reflected version of the first optical signal and the second optical signal can be received from a single optical device.
  • the first and second wavelengths can be associated with infrared light.
  • the one or more processing circuits and the optical sensor can be further configured to cooperatively: determine a location of the optical device based on the ToF measurement; and retrieve, from the received second optical signal, information encoded in said received second optical signal.
  • the second optical signal can include information encoded based on phase-shift keying (PSK) modulation, and retrieving the information can include demodulating the PSK-encoded information.
  • the one or more processing circuits and the optical sensor can be further configured to cooperatively determine a relative signal strength of at least one of: the received reflections of the first optical signal, and the received second optical signal.
  • the one or more processing circuits can be configured to determine the location of the optical device further based on the determined relative signal strength and a correlation function between the first optical signal and the reflected version of the first optical signal.
  • the optical sensing system can also include a transmitter configured to transmit information identifying one or more of: at least one of an active duration and an inactive duration of the first optical signal; and an identifier of the optical device.
  • the transmitter can be one of an optical transmitter and a radio frequency (RF) transmitter.
  • exemplary embodiments of the present disclosure include a dual-mode optical device comprising an optical reflector configured to reflect an incident first optical signal having a first wavelength, wherein the first optical signal comprises a plurality of intermittent active and inactive durations.
  • the optical device also includes an optical transmitter configured to emit a second optical signal having a second wavelength substantially similar to the first wavelength.
  • the optical device also includes a detector configured to determine the active and inactive durations of the first optical signal, and inhibit the optical transmitter from emitting the second optical signal during the detected active durations of the first optical signal.
  • the first and second wavelengths can be associated with infrared light.
  • the detector can comprise an optical detector configured to detect at least one of: energy of the first optical signal; and a transmission pattern of the first optical signal.
  • the detector can be further configured to enable the optical transmitter to emit the second optical signal during the detected inactive durations of the first optical signal.
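The inhibit/enable behavior of the detector can be sketched as a simple energy gate. This is a sketch under stated assumptions: the patent does not specify a detection mechanism, so the energy threshold and sample values below are invented for illustration.

```python
# Sketch of the tag-side gating: the data transmitter is enabled only while
# no energy of the first (ToF) signal is detected. Threshold is an assumption.

ENERGY_THRESHOLD = 0.5

def transmitter_enabled(detected_energy):
    """Inhibit during detected active durations, enable during inactive ones."""
    return detected_energy < ENERGY_THRESHOLD

samples = [0.9, 0.8, 0.1, 0.05, 0.7]   # detected ToF-signal energy over time
gates = [transmitter_enabled(e) for e in samples]
```

A real device could equally key off the transmission pattern of the first signal rather than its raw energy, as the bullet above notes.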
  • the optical transmitter can be further configured to encode information in the second optical signal.
  • the information can include one or more of: an identifier (ID) of the optical device; information pertaining to an operating parameter of the optical device; and information pertaining to a user interface (UI) of an object associated with the optical device.
  • the optical transmitter can be configured to encode information using phase-shift keying (PSK) modulation.
  • the dual-mode optical device can also include a receiver configured to receive information identifying one or more of: at least one of an active duration and an inactive duration of the first optical signal; and an identifier of the dual-mode optical device.
  • the receiver can be one of an optical receiver and a radio frequency (RF) receiver.
  • exemplary embodiments of the present disclosure include methods and/or procedures for facilitating user interaction with an object associated with an optical device.
  • the exemplary methods and/or procedures can include emitting a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations.
  • the exemplary methods and/or procedures can also include receiving, during one or more of the active durations, a reflected version of the first optical signal.
  • the exemplary methods and/or procedures can also include receiving, only during one or more of the inactive durations of the first optical signal, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal.
  • the exemplary methods and/or procedures can also include determining a location of at least one of the optical device and the object, based on the first optical signal and the received reflected version of the first optical signal.
  • determining the location can include performing a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal; and determining the location based on the ToF measurement.
  • determining the location of at least one of the optical device and the object can include determining the location relative to an optical sensing system that performs the emitting and receiving operations.
  • determining the location relative to the optical sensing system can be further based on a correlation function between the first optical signal and the reflected version of the first optical signal, and a relative signal strength of at least one of: the received reflected version of the first optical signal; and the received second optical signal.
  • the exemplary methods and/or procedures can also include determining a user interface associated with the object based on the location and information encoded in the second optical signal. In some embodiments, determining the user interface associated with the object can be based on the location relative to the optical sensing system. In some embodiments, determining the user interface associated with the object can include retrieving user-interface information from a record in a database, the record being associated with the information encoded in the second optical signal.
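The database-lookup step can be sketched as keying a record by the identifier decoded from the second optical signal. The record contents and identifier below are invented for illustration; the patent does not specify a schema.

```python
# Sketch of retrieving user-interface information from a database record
# associated with the identifier decoded from the second optical signal.
# The schema and sample entry are illustrative assumptions.

UI_DB = {
    "lamp-01": {"label": "Desk lamp", "controls": ["on", "off", "dim"]},
}

def lookup_ui(device_id):
    """Return the UI record for a decoded device identifier, if any."""
    return UI_DB.get(device_id)

ui = lookup_ui("lamp-01")
```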
  • the exemplary methods and/or procedures can also include capturing images of a field of view using a camera, the field of view including the object.
  • the exemplary methods and/or procedures can also include rendering the user-interface information associated with the object on a display.
  • the user-interface information can be rendered together with the captured images in a region of the display that corresponds to a position of the object in the field of view.
  • the exemplary methods and/or procedures can also include receiving user input associated with the object.
  • the reflected version of the first optical signal and the second optical signal can be received from a single optical device.
  • the exemplary methods and/or procedures can be performed by a time-of-flight (ToF) measurement device.
  • the first and second wavelengths can be associated with infrared light.
  • Figure 1 illustrates a system 100 usable for position determination of one or more optical devices (e.g., reflective tags) based on time-of-flight (ToF) measurements.
  • system 100 comprises an active illumination unit 110 (e.g., a light source) configured to emit modulated light signals.
  • the light signals can have a first wavelength, e.g., in an infrared band.
  • the light signals can be emitted over a range of azimuths and elevations such that they are incident on devices 191, 192, and 193, each of which has a different 3-D position (x, y, z) in a coordinate system 160.
  • Coordinate system 160 can be a global coordinate system, or a coordinate system established relative to a particular object or landmark (e.g., a building or a room).
  • Each of devices 191-193 can reflect the emitted light signals, such that reflected versions of the light signals pass through an optical apparatus 140, which can comprise one or more lenses.
  • Optical apparatus 140 can be configured to project the reflected light signals on an image sensor 130.
  • the combination of the optical apparatus 140 and image sensor 130 can be referred to as a "ToF camera", which has an associated camera coordinate system 150.
  • the origin of camera coordinate system 150 is established with respect to optical apparatus 140, such that each of the reflected light signals is incident on the optical apparatus 140 at a different 3-D position (x, y, z) in camera coordinate system 150. If, alternately, the origin were established with respect to the image sensor 130, the respective 3-D positions (x, y, z) of the reflected light signals would be with respect to that established origin.
  • Image sensor 130 can comprise an array of photosensitive pixels configured to receive modulated light signals and convert them into electrical signals. These electrical signals can be read from the image sensor by a ToF measurement and position determination circuit 120.
  • Circuit 120 can comprise, e.g., one or more processing circuits that can be any combination of analog/digital hardware, processors, and non-transitory media (e.g., memories) storing processor-executable instructions. As shown in Figure 1, circuit 120 also controls light source 110, more specifically the timing and the phase of the light signal emissions from source 110.
  • circuit 120 can determine (e.g., estimate) the respective 3-D positions (x, y, z), in camera coordinate system 150, of the light signals reflected from devices 191-193.
  • circuit 120 can be configured to determine a distance from system 100 (e.g., from image sensor 130 and/or optical apparatus 140) to the device 191 (and, optionally, also devices 192-193) based on a measured time-of-flight (ToF) of the emitted signal.
  • the measured ToF is based on the known characteristics of the emitted light signal and the reflection.
  • processing circuit 120 can be configured to determine an angle of arrival of the reflected signals, and thereby estimate the respective positions of device 191 in camera coordinate system 150.
  • circuit 120 can also be configured to determine the position and the orientation of apparatus 100 in the coordinate system 160.
  • this functionality of circuit 120 can comprise one or more of a 2-D camera, orientation and/or motion sensors, a global navigation satellite system (GNSS, e.g., GPS), positioning based on communication with other devices (e.g., based on cellular base station and/or WiFi Access Point ID), etc.
  • system 100 can be stationary, such that the stationary position and/or orientation of system 100 can be stored in a memory of circuit 120.
  • circuit 120 is further configured to determine the position of device 191 (and, optionally, devices 192-193) in coordinate system 160 based on the determined position of device 191 in camera coordinate system 150 and the determined position and/or orientation of system 100 in coordinate system 160. This can be performed, e.g., using one or more rotations to account for the orientation of system 100 in coordinate system 160, as well as one or more translations to account for the position of system 100 in coordinate system 160. Such translations and rotations can be performed, e.g., using matrix operations known to skilled persons.
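The rotation-plus-translation step can be sketched for the simple case of a yaw-only camera orientation. This is a minimal sketch: the yaw-only rotation and the numeric pose are illustrative assumptions, not values from the patent, which leaves the full 3-D rotation to standard matrix operations.

```python
import math

# Sketch: map a point from camera coordinate system 150 into coordinate
# system 160 by rotating for the camera's orientation, then translating
# by the camera's position. Yaw-only rotation is an assumption.

def camera_to_world(p_cam, yaw_rad, t_world):
    """Rotate p_cam about the z-axis by yaw_rad, then translate by t_world."""
    x, y, z = p_cam
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x_r = c * x - s * y
    y_r = s * x + c * y
    return (x_r + t_world[0], y_r + t_world[1], z + t_world[2])

# A point 2 m ahead of a camera rotated 90 degrees and located at (1, 0, 0):
p = camera_to_world((2.0, 0.0, 0.0), math.pi / 2, (1.0, 0.0, 0.0))
```

In the general case the yaw rotation is replaced by a full 3x3 rotation matrix built from the sensed orientation.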
  • Figure 2 illustrates another optical system usable for position determination of one or more devices (e.g., reflective tags) based on ToF measurements.
  • the system shown in Figure 2 can be seen as another view of the system shown in Figure 1, so that similar elements are given similar numbers.
  • Figure 2 shows certain elements in more detail, using numbers different than used in Figure 1 .
  • the optical system includes an optical sensing system 200 and a plurality of devices 291-293.
  • Optical sensing system 200 includes an illumination unit 210 that emits a series of light pulses 215. These are emitted over a range of azimuths and elevations, denoted with the shaded cone in Figure 2 , such that they illuminate various reflective devices (or tags) 291-293.
  • Each of devices 291-293 reflects light pulses 215 such that the reflections are incident on lens 240, which is configured to project the reflected light pulses onto an image sensor 230. Since the devices 291-293 are at different positions relative to light pulses 215, the reflected light pulses will appear at different pixels within the pixel array of image sensor 230.
  • System 200 also includes ToF measurement and position determination circuit 220, which itself comprises various circuits and/or functional units.
  • circuit 220 includes image acquisition circuit 221, which reads the pixel values after image sensor 230 has converted the received optical energy into per-pixel electrical signals.
  • Image acquisition circuit 221 also controls a phase-shifting unit (PSU) 225, by which it can control the phase of the light pulses 215 emitted by illumination unit 210.
  • image acquisition circuit 221 can configure, via PSU 225, illumination unit 210 to emit a series of four groups of light pulses, each group having one or more pulses that are shifted 90 degrees relative to the pulses of the immediately preceding group.
  • the four groups of pulses can have phases of 0, 90, 180, and 270 degrees.
  • the timing and/or phases of the various groups of pulses can be further controlled by modulation clock supply circuit 223, which also supplies the timing (e.g., clock signal) used for reading the pixel values from image sensor 230.
  • image acquisition circuit 221 can capture a series of four images, each corresponding to one of the four emitted light phases of 0, 90, 180, and 270 degrees. Further exemplary principles of operation are described as follows. Due to the travel time, the phase of the reflected pulsed light signal is shifted. This phase shift can be measured by a photonic mixer device (PMD) located on each pixel of the sensor 230. For example, when photons arrive at the pixel, the PMD transfers the generated charges into either of two capacitors - A or B - associated with each pixel. Whether the charges are stored in A or B is controlled by the signal Fmod generated by clock supply circuit 223, which determines the frequency of the emitted light pulse.
  • This signal is also provided to each pixel comprising sensor 230, and controls the reading of the voltages (A and B) associated with each pixel.
  • the difference between the two voltages is related to the per-pixel phase value, p, which is read for each pixel by image acquisition circuit 221.
  • the phase offset φ can be determined using the four-phase approach discussed above, thereby capturing four different images of the same scene, based on emitted light pulses shifted by 0, 90, 180, and 270 degrees. These images are provided to a processing system 227, which combines the four images into a depth image 228, from which the actual phase offset φ can be determined. Since the actual phase offset φ is proportional to the ToF of the emitted light signal, the distance between the device and the sensor can be determined using the light modulation frequency f and the speed of light c.
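The phase-to-distance step can be sketched numerically. The atan2 combination of the four samples follows the common PMD four-phase treatment; the patent text does not give the exact formula, so the sample values and the 20 MHz modulation frequency are illustrative assumptions.

```python
import math

# Four-phase ToF sketch: per-pixel differential samples taken at 0, 90, 180
# and 270 degrees are combined into a phase offset, then into a distance
# d = c * phi / (4 * pi * f_mod), per the common four-phase PMD treatment.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(q0, q90, q180, q270, f_mod):
    phi = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return (C * phi) / (4 * math.pi * f_mod)

# Example: samples implying a phase offset of pi/2 at 20 MHz modulation.
d = tof_distance(0.0, 1.0, 0.0, -1.0, 20e6)
```

Note the unambiguous range is bounded: at 20 MHz, phases wrap every c / (2 f) ≈ 7.5 m.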
  • circuit 229 can determine the position of the reflecting device in a world coordinate system according to the methods discussed above in relation to Figure 1 .
  • the phase values are directly influenced only by pulsed infrared light within a certain frequency interval.
  • Continuous light sources such as lamps or sunlight typically do not produce a signature in either the phase or the depth images.
  • AR devices such as smartphones or head-mounted devices (HMDs) are becoming increasingly available to the public. Some of these include ToF measurement circuitry operating according to the principles described above with respect to Figures 1-2. In such applications, ToF measurement circuitry is often used for user input (e.g., gesture) detection and 3-D scanning. If an AR device today wants to provide augmented information at a certain location, it needs to know its own position and orientation. This information is only available at locations that feature an indoor positioning system or reference data.
  • AR devices are also limited in their connectivity to other electronics. With traditional methods such as Wi-Fi or Bluetooth, the location of a communication partner is generally known only to an accuracy and/or resolution of 1-100 meters. This coarse location information prevents embedding and/or overlaying information and interaction possibilities with other devices and/or objects that appear in the AR field of view.
  • Exemplary embodiments of the present disclosure further extend the principles discussed above to provide devices that are capable of reflecting first optical signals emitted by sensing devices (e.g., ToF measurement devices), and transmitting second optical signals comprising information associated with the device, e.g., a device identity, a user interface (UI) to an object associated with the device, etc.
  • Such information can facilitate user interaction with the object, e.g ., in AR and VR applications.
  • the first and second optical signals can have substantially similar wavelengths; for example, both signals can be infrared signals.
  • the device can be configured to reflect the first optical signal and transmit the second optical signal in a time-multiplexed manner, such that the second optical signal does not interfere with ToF measurements on the first optical signal.
  • the device can control the time-multiplexing by detecting active and/or inactive durations of the emitted first optical signal, and inhibiting and enabling the transmission of the second optical signal, respectively.
  • the device can receive, from the ToF measurement device, information identifying the active and/or inactive durations.
  • Exemplary embodiments of the present disclosure also include optical sensing systems that are capable of emitting first optical signals usable for ToF measurement, receiving versions of the first optical signals reflected by a device, performing ToF measurements based on characteristics of the first optical signal and the reflected versions, and receiving second optical signals comprising information associated with the device, e.g., a device identity, a user interface (UI) of an object associated with the device, etc.
  • the first and second optical signals can have substantially similar wavelengths; for example, both signals can be infrared signals.
  • the optical sensing system can be configured to transmit the first optical signal and receive the second optical signal in a time-multiplexed manner, such that the second optical signal does not interfere with ToF measurements on the first optical signal.
  • the optical sensing system can control the time-multiplexing by sending, to the device, information identifying the active and/or inactive durations of the first optical signal.
  • the device can transmit in a time-multiplexed manner based on detecting active and/or inactive durations of the emitted first optical signal.
  • Figure 3 illustrates an exemplary optical system configurable for ToF measurement and information transfer, according to various exemplary embodiments of the present disclosure.
  • the system shown in Figure 3 can be seen as an augmented version of the system shown in Figure 2, so that similar elements are given similar numbers (e.g., last two digits are the same). However, Figure 3 shows certain additional elements, which are given numbers different from those used in Figure 2.
  • the optical system includes an optical sensing system 300 and a plurality of devices 391-393.
  • Optical sensing system 300 includes an illumination unit 310 that emits a series of light pulses 315. These are emitted over a range of azimuths and elevations, denoted with the shaded cone in Figure 3 , such that they illuminate various devices 391-393.
  • the light pulses 315 can comprise active durations (or periods), during which illumination unit 310 emits optical energy (e.g ., at an infrared wavelength). Two active durations 315a and 315c are shown.
  • the light pulses 315 can also include one or more inactive durations, intermittent between the active durations, during which illumination unit 310 does not emit optical energy.
  • One inactive duration 315b is shown for purposes of illustration. As denoted by the bracket underlying the pulses, Figure 3 illustrates the operation of the optical system during active period 315a, e.g ., for ToF measurement.
  • each of devices 391-393 includes an optical reflector capable of reflecting at least a portion of all incident optical signals. As such, each of devices 391-393 reflects light pulses 315, during active periods 315a and 315c, such that the reflections are incident on lens 340, which is configured to project the reflected light pulses on an image sensor 330. Since the devices 391-393 are at different positions relative to light pulses 315, the reflected light pulses will appear at different pixels within the pixel array comprising image sensor 330.
  • the timing of the active and inactive durations of the light pulses is under control of ToF measurement and position determination circuit 320 (e.g., via PSU 325), which operates during active periods in a manner substantially similar to circuit 220 described above with respect to Figure 2. As such, the ToF measurement and position determination operations of circuit 320 during active periods 315a and 315c will not be described further.
  • each of devices 391-393 comprises an optical transceiver, labeled 391a-393a, respectively.
  • Each optical transceiver can include an optical transmitter configured to emit an optical signal having a wavelength substantially similar to the wavelength of light pulses 315 ( e.g ., an infrared wavelength).
  • Each optical transceiver 391a-393a can also include an optical detector (e.g ., a photodiode) configured to determine and/or detect the active and inactive durations of light pulses 315.
  • each optical detector can be configured to inhibit its associated optical transmitter from emitting an optical signal during the detected active durations of light pulses 315 ( e.g ., 315a and 315c).
  • each optical detector can be configured to enable its associated optical transmitter to emit an optical signal during the detected inactive durations of light pulses 315 ( e.g ., 315b).
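The inhibit/enable gating described in the bullets above can be sketched as a simple threshold rule; this is a hypothetical illustration, and the threshold and power samples are assumed values, not taken from the disclosure.

```python
# Sketch of the device-side time-multiplexing logic: the optical detector
# samples incident power from the first (ToF) signal and gates the device's
# own transmitter so it only emits during inactive durations.
THRESHOLD = 0.5  # assumed detector decision threshold (arbitrary units)

def transmitter_enabled(detected_power: float) -> bool:
    """Enable the second-signal transmitter only while the first signal
    is inactive (incident power below threshold)."""
    return detected_power < THRESHOLD

# Simulated power samples: active burst, inactive gap, active burst
samples = [1.0, 0.9, 0.0, 0.1, 1.0]
gates = [transmitter_enabled(p) for p in samples]
print(gates)  # transmission allowed only during the gap
```

The same rule run in reverse (power above threshold) implements the inhibit behavior of the preceding bullet.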
  • each optical transceiver 391a-393a can also include an optical receiver configured to receive information transmitted by the optical sensing system 300.
  • the information can be related to the active and/or inactive durations of the light pulses 315, and/or can include an identifier of the dual-mode optical device.
  • each optical device 391-393 can detect the onset of inactive duration 315b and enable its optical transmitter to transmit an information-bearing optical signal that can be received by optical sensing system 300 via lens 340 and image sensor 330.
  • In some embodiments, the information-bearing optical signal can be encoded based on phase-shift keying (PSK) modulation.
  • the optical sensing system can detect one or more of signals 371-373 transmitted during inactive period 315b.
  • the received optical beam widths of signals 371-373 can be narrow enough that they can be differentiated on the pixel array comprising image sensor 330.
  • image acquisition circuit 321 captures the pixel signals from image sensor 330 during inactive period 315b in a similar manner as during active period 315a, and sends these images to processing circuit 327 and, ultimately, to device information determination circuit 340.
  • This circuit can be configured to retrieve, for instance by demodulating, the information encoded in one or more of optical signals 371-373.
  • image acquisition circuit 321 can control illumination unit 310 and image sensor 330 to capture a plurality of images, such that the information can be retrieved from the captured plurality of images by the cooperative operation of processing system 327 and device information determination circuit 340.
  • multiple images can be captured and used to retrieve each information bit encoded within signals 371-373. The number of required images per bit can depend on the type of modulation used.
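As a concrete illustration of the PSK encoding mentioned above, here is a minimal binary-PSK round trip; the bit-to-phase mapping and the one-measurement-per-bit assumption are hypothetical choices, since the disclosure specifies only PSK generally.

```python
import math

# Minimal BPSK sketch (an assumed variant of PSK): each information bit
# sets the phase of the device's emission, and the sensing system recovers
# the bit from the corresponding captured phase image(s).

def encode_bits(bits):
    """Map each bit to a carrier phase: 0 -> 0 rad, 1 -> pi rad."""
    return [0.0 if b == 0 else math.pi for b in bits]

def decode_phases(phases):
    """Recover bits by thresholding the measured phase at pi/2."""
    return [0 if abs(p) < math.pi / 2 else 1 for p in phases]

bits = [1, 0, 1, 1, 0]
recovered = decode_phases(encode_bits(bits))
print(recovered == bits)
```

With higher-order PSK (e.g., QPSK), each phase measurement would carry more bits, changing the number of required images per bit noted in the text.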
  • Figure 5 is a flow diagram that further illustrates the operations of optical devices 391-393 and optical sensing system 300 described above in relation to Figures 3-4 .
  • Each block in Figure 5 is associated with a particular operation performed by one of these peers.
  • Although Figure 5 shows blocks in a particular order, this order is merely exemplary; the operations of one or more of the blocks associated with a particular peer (e.g., the optical device) can be performed in a different order than shown, and can be combined and/or divided into blocks having different functionality. Blocks comprising optional operations are shown using dashed lines.
  • the operations include block 505, in which the optical sensing system (OSS for short) can send information identifying an inactive duration of the signal emitted for the purposes of ToF measurement (referred to as "first signal" or "first optical signal").
  • the OSS can send such information in an optical signal or an RF signal.
  • the operations of block 510 can include the optical device receiving the signal comprising such information, from which it can detect that the first signal is inactive and, consequently, enable transmission of the optical signal used to convey information (referred to as "second signal" or "second optical signal").
  • the information sent in block 505 can include an identifier of a particular optical device, e.g ., to indicate that the particular optical device is allowed to transmit during the inactive period.
  • the operations of block 510 can include the optical device (e.g., a detector thereof) detecting the onset of the inactive duration based on a lack of energy corresponding to the first optical signal. In either case, the optical device enables the transmission of the second optical signal.
  • the optical device emits the second optical signal with encoded information, e.g ., an information-bearing signal.
  • the information can include an identifier (ID) of the optical device encoded therein.
  • the second optical signal can include encoded information pertaining to a user interface (UI) of an object associated with the optical device.
  • Such information can include, e.g., a uniform resource locator (URL) or uniform resource identifier (URI) that points to a location (e.g., on a server) where the UI can be obtained.
  • the information can include information pertaining to an operating parameter of the optical device, such as an intensity setting.
  • the information can be encoded based on phase-shift keying (PSK) modulation.
  • This signal is received by the OSS and, in block 520, the OSS captures one or more images and determines the position of the signal on the pixel array of its image sensor. Subsequently, in block 525, the OSS retrieves the information from the captured image(s). In some embodiments, such as when the operations of block 515 include encoding using PSK modulation, retrieving the information can include demodulating the PSK-encoded information. Such information can subsequently be used by the OSS to identify the device, one or more of its operating parameters, and/or a UI of the object associated with the device, or transmitted to another system to facilitate the same.
  • the OSS operations also include block 530, where the OSS can send information identifying an active duration of the first optical signal.
  • block 530 can be included if corresponding block 505 (identifying an inactive duration) is also included.
  • the OSS emits the first optical signal (i.e. , the signal used for ToF measurement) during an active period.
  • the optical device detects that the first optical signal is in an active period and inhibits the emission of the second optical signal. If the operations of block 530 are included, the detection operation in block 540 can be based on receiving the information transmitted in block 530. In such case, the optical device can inhibit the emissions until subsequently receiving information identifying the next inactive duration. Otherwise, the detection operation in block 540 can be based on the presence of energy corresponding to the first optical signal, and the optical device can inhibit the emissions until subsequently detecting a lack of energy corresponding to the first optical signal.
  • the optical device reflects the first optical signal, and the reflections are received by the OSS.
  • the OSS can emit a plurality of light pulses (or groups of light pulses), each of which can be at a different phase.
  • the operations of blocks 535 and 545 can include all of these emissions.
  • the OSS captures a plurality of images and determines a relative phase offset and pixel position of the reflected first optical signals.
  • the OSS measures the ToF of the first optical signal, which corresponds to the distance to the device.
  • the OSS determines the position of the optical device in a coordinate system. This can include a position in a local coordinate system ( e.g ., of the optical device) and, optionally, a position in a world coordinate system.
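The phase-offset and distance determination of blocks 545-550 can be illustrated with the standard four-phase ToF demodulation; this is a common formulation consistent with the four phase-shifted pulses described above, not a method claimed verbatim by the disclosure, and the 20 MHz modulation frequency is an assumed value.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(a0, a1, a2, a3, f_mod):
    """Recover the phase offset from four correlation samples captured at
    0/90/180/270 degree offsets, and convert it to distance (the factor
    2*f_mod accounts for the round trip)."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    return phase / (2 * math.pi) * C / (2 * f_mod)

# Synthetic samples corresponding to a 90-degree correlation phase shift:
d = tof_distance(0.0, -1.0, 0.0, 1.0, f_mod=20e6)
print(round(d, 2))  # one quarter of the ~7.49 m unambiguous range
```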
  • the OSS can optionally send information identifying the next inactive duration, e.g., similar to the optional operation of block 505.
  • the optical device can detect the onset of an inactive duration of the first optical signal, either by receiving the information sent by the OSS in block 555 or by detecting a lack of energy corresponding to the first optical signal. Based on this detection, the optical device can enable its optical transmitter to emit the second signal. Subsequently, and if desired and/or necessary, in block 575 the optical device can emit the second optical signal with encoded information, in a manner similar as described above with respect to block 515.
  • Figure 6 further illustrates the time-multiplexed transmission and reception of the two optical signals between the optical sensing system (or sensor) and optical device, according to various exemplary embodiments of the present disclosure.
  • Figure 6(a) illustrates an operation similar to that shown in Figures 3-4 , in which the OSS emits a series of four phase-shifted light pulses ("first signal") during an active period (denoted as "2"), which are reflected by the optical device. The reflections are then used by the OSS to determine ToF/distance and device position, as discussed above.
  • the device emits the information-bearing (“second”) signal.
  • the OSS captures phase images of this signal and uses these to retrieve ( e.g ., decode and/or demodulate) the encoded information.
  • Figure 6(b) illustrates a scenario that can be used for long distance measurements, such as 3D scanning.
  • eight (8) phase-images are used to create a single depth image.
  • These eight images consist of two sets of four images, with each set at a different modulation frequency.
  • each set of images contains the complete information necessary to create a single depth image.
  • the optical device can transmit the information-bearing signal in intermittent inactive periods between the two four-image active periods.
  • Figure 6(c) illustrates a scenario in which the OSS captures multiple depth-image sets concurrently. For example, this can be used when distance measurements are not required to be very precise.
  • two light pulses with 90-degree offset are transmitted during each active period, and the optical device can transmit the information-bearing signal in intermittent inactive periods between the two-image active periods.
  • Figure 6(d) illustrates a scenario usable for capturing multiple depth images concurrently, which can be particularly useful for AR/VR applications.
  • For example, one stream of depth images can have a higher update-rate (e.g., four-phase) for device position determination, while another stream of depth images can have a lower update-rate (e.g., eight-phase) and longer range for room scanning or obstacle detection.
  • the optical device can detect first signals corresponding to both streams by either using the exposure times or the number of phase-images per depth image. Based on this detection, the device can transmit information intermittent with one stream, and remain silent for distance measurements in the other stream.
  • the OSS captures a depth image.
  • the depth image can be derived from any number of phase images between 1 and 8. These phase images can be captured using a single modulation frequency (such as in Figure 6(a) ) or using multiple modulation frequencies (such as in Figure 6(b) ).
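The Figure 6 schedules can be summarized with a simple timeline encoding, where "P" marks a phase-image capture (first signal active) and "D" marks a slot in which the device may emit the information-bearing second signal; the slot counts follow the text, while the string encoding itself is an illustrative assumption.

```python
# Hypothetical timeline encodings of the Figure 6(a)-(c) schedules:
schedule_a = "PPPP" + "D"            # four-phase depth frame, then a data slot
schedule_b = "PPPP" + "D" + "PPPP"   # two four-image sets, data slot between
schedule_c = "PP" + "D" + "PP"       # two-phase frames with a data gap

for name, s in [("a", schedule_a), ("b", schedule_b), ("c", schedule_c)]:
    print(name, s.count("P"), "phase images,", s.count("D"), "data slots")
```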
  • ToF-based distance measurements are based on measuring the phase-shift of a "correlation function" which approximately represents a correlation between the phases of the emitted and reflected optical signals.
  • the phase-shift of this function is proportional to the distance due to the travel time of the light pulses.
  • the phase shift repeats at distances corresponding to multiples of 2 ⁇ radians, such that single-frequency ToF measurements can be ambiguous with respect to the actual distance.
  • Figure 7 shows the correlation function having the same value for two devices ("tags") at different distances from the sensor.
  • One possible solution, mentioned above, is to capture eight phase images, e.g ., four each at two different modulation frequencies. As also mentioned above, however, this lowers the position update rate and may not be suitable for many applications.
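The 2π ambiguity of Figure 7 can be made concrete with a worked example: with a single modulation frequency f, a measured phase determines distance only modulo the unambiguous range c / (2f). The 20 MHz frequency and the search range below are assumed values for illustration.

```python
import math

C = 299_792_458.0
F_MOD = 20e6                       # assumed modulation frequency
UNAMBIGUOUS = C / (2 * F_MOD)      # ~7.49 m unambiguous range

def candidate_distances(phase_rad, max_range):
    """All distances up to max_range consistent with a measured phase."""
    d = phase_rad / (2 * math.pi) * UNAMBIGUOUS
    out = []
    while d <= max_range:
        out.append(round(d, 2))
        d += UNAMBIGUOUS
    return out

# Two "tags" at ~1.87 m and ~9.37 m yield the same correlation phase:
print(candidate_distances(math.pi / 2, 20.0))  # [1.87, 9.37, 16.86]
```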
  • Other embodiments address this by the OSS determining a relative signal strength (e.g., intensity) of an incident signal from the optical device, and using it in the position determination.
  • the signal can be one or both of the information-bearing signal transmitted by the optical device ("second optical signal") or the signal transmitted by the OSS and reflected by the optical device ("first optical signal”). If the OSS is aware of the device's reflective and/or emissive properties, the OSS can use the measured relative signal strength(s) to improve the position determination by eliminating, or at least reducing the likelihood of, ambiguities in the correlation function between the transmitted and the reflected versions of the first optical signal.
  • the OSS can be aware of the device's reflective and/or emissive properties in various ways.
  • an operating parameter of the optical device, such as an intensity setting, can be included in the information-bearing second optical signal.
  • This operating parameter can represent, or can be used by the OSS to determine, the device's reflective and/or emissive properties.
  • the OSS can determine the device's properties from a device ID included in the information-bearing second optical signal. For example, after receiving the device ID, the OSS can use the received ID to reference stored information about the device's properties.
  • the stored information can relate to the particular device transmitting the ID, or to a generic device type ( e.g ., model) that represents the particular device.
  • the information can be stored locally ( e.g ., within the OSS) or remotely.
  • the OSS can use information about its own emissive properties together with the device's reflective properties to determine the relative signal strength. For example, the OSS can learn the device's reflective properties by conducting one or more eight-phase reference measurements, then use that information in subsequent relative signal strength determinations. In other embodiments, a combination of the two above approaches can be used.
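One way to use the known emissive/reflective strength, as described above, is to predict the received intensity at each ambiguous candidate distance and select the best match; the inverse-square falloff and all numeric values below are illustrative assumptions, not specifics from the disclosure.

```python
# Sketch of intensity-based disambiguation of the correlation function.
def pick_distance(candidates, measured_intensity, source_strength):
    """Choose the candidate distance whose predicted intensity (assumed
    inverse-square falloff) best matches the measured intensity."""
    def predicted(d):
        return source_strength / (d * d)
    return min(candidates, key=lambda d: abs(predicted(d) - measured_intensity))

candidates = [1.87, 9.37, 16.86]      # ambiguous single-frequency solutions
source_strength = 100.0               # known device property (assumed units)
measured = 100.0 / (9.37 ** 2)        # intensity actually observed
print(pick_distance(candidates, measured, source_strength))
```

This avoids the update-rate penalty of capturing a second four-image set at another modulation frequency.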
  • various exemplary embodiments of the combined optical sensing system (OSS) and dual-mode (i.e ., reflective and emissive) optical device can be utilized to facilitate user interaction in AR and VR applications.
  • when the optical device is associated with (e.g., attached to and/or co-located with) an object, various exemplary embodiments can facilitate interaction with the object via a user interface associated with the object.
  • Figure 8 is a flow diagram of an exemplary method and/or procedure for facilitating user interaction with an object associated with an optical device, according to various embodiments of the present disclosure.
  • Figure 8 shows blocks in a particular order, this order is merely exemplary, and the operations can be performed in a different order than shown in Figure 8 and can be combined and/or divided into blocks having different functionality. Blocks comprising optional operations are shown using dashed lines.
  • the exemplary method and/or procedure shown in Figure 8 can include the operation of block 810, in which the OSS emits a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations.
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 820, in which the OSS receives, during one or more of the active durations, a reflected version of the first optical signal.
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 830, in which the OSS receives, during one or more of the inactive durations, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal.
  • the first and second optical signals can have infrared wavelengths.
  • the second optical signal can be an information-bearing signal, with the information encoded by the transmitting optical device using, e.g ., PSK modulation.
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 840, in which the OSS determines a location of at least one of the optical device and the object, based on the first optical signal and the received reflected version of the first optical signal. For example, this can be done by performing a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal, and determining the location based on the ToF measurement.
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 850, in which the OSS determines a user interface (UI) associated with the object, based on information included in the second optical signal. For example, this can be done by retrieving (e.g., demodulating and/or decoding) the information encoded within the second optical signal by the transmitting optical device. In some embodiments, retrieving the information can be done by demodulating information that was encoded using PSK modulation. This information can comprise elements of the UI itself (e.g., JavaScript code), or an identifier for the optical device, the object, and/or the UI that can facilitate retrieval of the UI for the object.
  • the identifier can be a URL or a URI that facilitates retrieving the UI information from a record in a database that can be stored locally (e.g ., on the OSS or the AR/VR device) or remotely (e.g ., on a server).
  • the UI associated with the object can also be determined based on the location of the object relative to the OSS. For example, there can be multiple UIs associated with an object, with the selection among the options based on the distance to the object. As such, a ToF measurement performed in block 840 can be used in determination of the UI associated with the object. Likewise, relative signal strength information can also be used in this determination, e.g ., to reduce and/or eliminate the distance ambiguity in the correlation function described above in relation to Figure 7 .
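The distance-dependent UI selection described above can be sketched as a simple lookup; the thresholds and UI variant names here are hypothetical, chosen only to illustrate selecting among multiple UIs based on the ToF-measured distance.

```python
# Hypothetical distance-based UI selection for an object with multiple UIs.
def select_ui(distance_m: float) -> str:
    """Pick a UI variant based on the ToF-measured distance to the object."""
    if distance_m < 1.0:
        return "full-controls"   # close enough for detailed interaction
    if distance_m < 5.0:
        return "basic-toggle"    # mid-range: only the primary action
    return "icon-only"           # far away: just indicate the object

print(select_ui(0.5), select_ui(3.0), select_ui(10.0))
```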
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 860, in which the AR/VR device comprising the OSS captures images of a field of view using a camera, with the field of view including the object.
  • An exemplary system usable in this manner is illustrated in Figure 9. More specifically, Figure 9 shows an exemplary AR device (i.e., a smartphone or tablet device incorporating AR functionality) comprising a color camera, a display (e.g., color LCD or OLED), and an OSS (labelled "ToF") according to any of the various exemplary embodiments described above.
  • the AR device is also capable of communicating with an optical device (associated with the light bulb shown in Figure 9 ) for the purposes of ToF measurements and receiving information, as described above.
  • the AR device is capable of providing viewfinder functionality in which an image stream from the color camera is shown on the display. For example, if the object is within the captured field of view, the AR device can show the object at a corresponding location on the display.
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 870, in which the AR/VR device can render the UI information associated with the object on the display.
  • the UI information can be rendered together with the captured images on the display, e.g., in a region of the display corresponding to a position of the object in the field of view.
  • the UI information can be shown as overlapping with, or adjacent to, the object on the display.
  • rendering of the object and/or the UI information on the display can involve and/or require a geometrical transformation of the OSS-centric position determined for the optical device into a coordinate system for the AR or VR device, here denoted C_AR or C_VR.
  • the particular transformation into C_AR or C_VR varies with the type of AR/VR device.
  • a VR device can utilize a 3D coordinate system by which a rendering engine places virtual objects over an augmented view.
  • C_AR is the 2D coordinate space of the color camera.
  • Figure 10 illustrates an exemplary coordinate transformation between an object location determined via ToF measurement in OSS-centric coordinate system and a camera coordinate system, denoted RGB.
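A minimal pinhole-projection sketch of the OSS-to-camera transformation illustrated in Figure 10; the extrinsics (a pure translation between the ToF sensor and the color camera, with no rotation), focal length, and principal point are assumed values for illustration.

```python
# Map a 3D point in OSS coordinates to 2D RGB-camera pixel coordinates.
def project_to_camera(p_oss, t_cam=(0.02, 0.0, 0.0), f_px=800.0,
                      cx=320.0, cy=240.0):
    """Apply the (assumed) sensor-to-camera translation, then a pinhole
    projection into the camera image plane."""
    x = p_oss[0] - t_cam[0]
    y = p_oss[1] - t_cam[1]
    z = p_oss[2] - t_cam[2]
    u = f_px * x / z + cx
    v = f_px * y / z + cy
    return (u, v)

# Object 2 m in front of the sensor, offset to the right:
print(project_to_camera((0.52, 0.0, 2.0)))
```

A real device would additionally apply a rotation and lens-distortion model obtained from calibration.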
  • the exemplary method and/or procedure shown in Figure 8 can also include the operation of block 880, in which the AR/VR device can receive user input associated with the object.
  • the AR/VR device can also perform an action associated with the user input.
  • Figure 11 illustrates various exemplary embodiments in which a smartphone-type AR device, such as shown in Figure 9 , is used to receive user input associated with an object.
  • Figure 11(a) shows the case where the object (e.g ., the lightbulb) and a UI associated with the object are shown together on the display.
  • Figure 11(b) shows an embodiment in which the user touches a region of the display (e.g ., a touchscreen display) in which the user-interface information was rendered.
  • the user input comprises a signal indicating the user touch in this region of the display.
  • Figure 11(c) shows an embodiment in which the user occludes a portion of the field of view of the camera (e.g., a portion corresponding to the region of the display in which the user-interface information was rendered). This occlusion can occur, e.g., by the user placing a finger within the field of view.
  • the user input can comprise a signal indicating detection of this occlusion.
  • In other embodiments, the user performs a gesture within the field of view of the camera, and the user input can comprise a signal indicating detection of this gesture.
  • an action associated with the user input can include selecting a particular UI element that is touched on the display or occluded in the image stream shown on the display. Such selection can then trigger an action on the object, such as turning on or off the exemplary light bulb shown in Figure 11 .
  • This can be done, for example, by sending a command to a light controller.
  • If the light controller is communicable with an optical device that includes an optical receiver, the command can be sent from the OSS to the optical device via an optical signal modulated with information representing the command. Alternately, such a command can be sent via a secondary communication link (e.g., an RF link) between the AR device and the light controller.
  • Figure 12 illustrates various exemplary embodiments in which an exemplary head-mounted AR device (e.g ., glasses) is used to receive user input associated with an object.
  • the exemplary head-mounted AR device shown in Figure 12 can incorporate any of the optical sensing system (OSS) embodiments described above.
  • the UI information does not augment a video stream from a camera. Instead, the UI information can be rendered or shown, on one or more of the lenses, in a region corresponding to a position of the object in the user's field of view. Due to the stereoscopic nature of such devices, the object UI can be displayed in 3D. As such, the 3D position of the object can also be used to adjust the focus of the displayed UI information on devices which feature light field displays.
  • Various embodiments can also be used with so-called "swipe glasses," which generally refers to head-mounted devices that do not provide an AR overlay on the lenses (such as illustrated in Figure 12), but rather include a small display on the side of the field of view.
  • the UI information associated with the object can be shown on this small display, and the user can use inputs (e.g., occlusion and/or gestures) to interact with the object via the UI elements.
  • Various embodiments described above can also be incorporated into various VR devices, such as VR goggles. For example, objects associated with an optical device at a location determined (e.g., by ToF measurements) in a physical coordinate system (e.g., an OSS-centric coordinate system and/or a world coordinate system) can be mapped to a corresponding location in the VR coordinate system. Various transformations known to skilled persons can be used for this mapping between coordinate systems.
  • Embodiments of optical devices described herein can be used to mark other people, physical obstacles, computer input devices (e.g., mouse or keyboard), etc. In this manner, such marked items can be located (e.g., by ToF measurements) and represented to the user in the AR or VR environment.
  • Example 1 is a method of receiving, in a time-multiplexed manner, a plurality of optical signals having substantially similar wavelengths, comprising: emitting a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations; receiving, during one or more of the active durations, a reflected version of the first optical signal and performing a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal; and receiving, only during one or more of the inactive durations of the first optical signal, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal.
  • the reflected version of the first optical signal and the second optical signal are received from a single optical device.
  • the method is performed by a time-of-flight (TOF) measurement device.
  • the first and second wavelengths are associated with infrared light.
  • one of examples 2 to 4 further comprises determining a location of the optical device based on the ToF measurement; and retrieving, from the received second optical signal, information encoded in said received second optical signal.
  • the method of example 5 further comprises determining a relative signal strength of at least one of: the received reflections of the first optical signal, and the received second optical signal; and determining the location of the optical device further based on the determined relative signal strength and a correlation function between the first optical signal and the reflected version of the first optical signal.
  • the method of example 5 or 6 further comprises transmitting information identifying one or more of: at least one of an active duration and an inactive duration of the first optical signal; and an identifier of the optical device.
  • the second optical signal comprises information encoded based on phase-shift keying (PSK) modulation; and retrieving the information comprises demodulating the PSK-encoded information.
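A minimal BPSK sketch of the PSK encoding and demodulation in example 8, assuming the simplest mapping (bit 0 → phase 0, bit 1 → phase π); the patent does not specify the PSK order or mapping, so this is illustrative only:

```python
import math

def psk_modulate(bits, samples_per_bit=8):
    """Map each bit to one sampled carrier cycle with a 0 or pi phase shift."""
    signal = []
    for bit in bits:
        phase = math.pi if bit else 0.0
        for n in range(samples_per_bit):
            signal.append(math.cos(2 * math.pi * n / samples_per_bit + phase))
    return signal

def psk_demodulate(signal, samples_per_bit=8):
    """Correlate each symbol against the reference carrier; a negative
    correlation indicates the pi-shifted symbol (bit 1)."""
    bits = []
    for k in range(0, len(signal), samples_per_bit):
        corr = sum(signal[k + n] * math.cos(2 * math.pi * n / samples_per_bit)
                   for n in range(samples_per_bit))
        bits.append(1 if corr < 0 else 0)
    return bits
```

In the optical setting, the sample values would correspond to the intensity modulation of the second optical signal during an inactive duration.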
  • Example 9 is an optical sensing system arranged to receive, in a time-multiplexed manner, a plurality of optical signals having substantially similar wavelengths, comprising an optical transmitter configured to emit a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations; an optical sensor configured to receive, during one or more of the active durations, a reflected version of the first optical signal, and to receive, only during one or more of the inactive durations of the first optical signal, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal; and one or more processing circuits configured to perform a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal.
  • the one or more processing circuits and the optical sensor are further configured to cooperatively determine a location of the optical device based on the ToF measurement; and retrieve, from the received second optical signal, information encoded in said received second optical signal.
  • Example 17 is a dual-mode optical device comprising an optical reflector configured to reflect an incident first optical signal having a first wavelength, wherein the first optical signal comprises a plurality of intermittent active and inactive durations; an optical transmitter configured to emit a second optical signal having a second wavelength substantially similar to the first wavelength; and a detector configured to determine the active and inactive durations of the first optical signal; and inhibit the optical transmitter from emitting the second optical signal during the detected active durations of the first optical signal.
  • the optical transmitter of example 17 is configured to encode information using phase-shift keying (PSK) modulation.
  • the first and second wavelengths are associated with infrared light.
  • the detector is further configured to enable the optical transmitter to emit the second optical signal during the detected inactive durations of the first optical signal.
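The gating behavior of the dual-mode device described above (detector classifies incident power into active and inactive durations and enables or inhibits the device's own transmitter) can be sketched as follows; the class, thresholds, and method names are hypothetical:

```python
# Hypothetical sketch of the dual-mode device: the detector watches the
# incident ToF illumination and gates the data transmitter so it emits
# only while the first signal is inactive, avoiding interference at the
# shared wavelength. The reflector is passive and always reflects.
class DualModeDevice:
    def __init__(self):
        self.tof_signal_active = False   # detector's current estimate
        self.transmissions = []          # data symbols actually emitted

    def detect(self, incident_power, threshold=0.1):
        """Detector: classify the current window as active or inactive."""
        self.tof_signal_active = incident_power > threshold

    def try_transmit(self, symbol):
        """Transmitter: inhibited during active durations, enabled otherwise."""
        if self.tof_signal_active:
            return False  # inhibited; only the passive reflection occurs
        self.transmissions.append(symbol)
        return True
```

A real device would additionally track the timing pattern of the active durations so it can transmit through an entire inactive window rather than sampling instant by instant.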
  • Example 26 is a method of facilitating user interaction with an object associated with an optical device, the method comprising emitting a first optical signal having a first wavelength and comprising a plurality of intermittent active and inactive durations; receiving, during one or more of the active durations, a reflected version of the first optical signal; receiving, only during one or more of the inactive durations of the first optical signal, a second optical signal having a second wavelength substantially similar to the first wavelength of the first optical signal; determining a location of at least one of the optical device and the object, based on the first optical signal and the received reflected version of the first optical signal; and determining a user interface associated with the object based on the location and information encoded in the second optical signal.
  • determining the location in example 26 comprises performing a time-of-flight (ToF) measurement based on the first optical signal and the reflected version of the first optical signal; and determining the location based on the ToF measurement.
  • determining the user interface associated with the object in example 26 or 27 comprises retrieving user-interface information from a record in a database, the record being associated with the information encoded in the second optical signal.
  • the method of one of examples 26 to 28 further comprises capturing images of a field of view using a camera, the field of view including the object; and rendering the user-interface information together with the captured images on a display, wherein the user-interface information is rendered in a region of the display corresponding to a position of the object in the field of view.
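One way to realize the database lookup and display rendering described above; the record schema, identifier, and pinhole-style projection below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical UI database keyed by the identifier decoded from the
# second optical signal (examples 26-29).
UI_DATABASE = {
    "device-42": {"label": "Thermostat", "controls": ["+", "-"]},
}

def lookup_user_interface(decoded_id):
    """Fetch the UI record associated with the decoded identifier."""
    return UI_DATABASE.get(decoded_id)

def to_display_region(x, y, z, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Project a 3D object position (camera coordinates, z forward, from
    the ToF measurement) onto display pixel coordinates using assumed
    pinhole intrinsics, giving the region where the UI is rendered."""
    return (cx + fx * x / z, cy + fy * y / z)
```

With these two pieces, the overlay for `device-42` would be drawn at roughly `to_display_region(0.2, -0.1, 2.0)` when the object sits 2 m in front of the camera.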
  • determining the location of at least one of the optical device and the object comprises determining the location relative to an optical sensing system that performs the emitting and receiving operations; and determining the user interface associated with the object is further based on the location relative to the optical sensing system.
  • determining the location relative to the optical sensing system is further based on: a correlation function between the first optical signal and the reflected version of the first optical signal; and a relative signal strength of at least one of the received reflected version of the first optical signal and the received second optical signal.
  • the reflected version of the first optical signal and the second optical signal are received from a single optical device.
  • in example 33, the method of example 32 is performed by a time-of-flight (ToF) measurement device.
  • the first and second wavelengths are associated with infrared light.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)
EP19179806.5A 2018-06-13 2019-06-12 Dual-mode optical devices for time-of-flight sensing and information transfer, and apparatus, systems, and methods utilizing same Pending EP3581960A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/007,086 US10630384B2 (en) 2018-06-13 2018-06-13 Dual-mode optical devices for time-of-flight sensing and information transfer, and apparatus, systems, and methods utilizing same

Publications (1)

Publication Number Publication Date
EP3581960A1 true EP3581960A1 (fr) 2019-12-18

Family

ID=66857682

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19179806.5A Pending EP3581960A1 (fr) Dual-mode optical devices for time-of-flight sensing and information transfer, and apparatus, systems, and methods utilizing same

Country Status (3)

Country Link
US (2) US10630384B2 (fr)
EP (1) EP3581960A1 (fr)
CN (1) CN110662162B (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11272156B2 (en) * 2019-02-15 2022-03-08 Analog Devices International Unlimited Company Spatial correlation sampling in time-of-flight imaging
CN112153184B (zh) * 2019-06-28 2022-03-22 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Mobile terminal
US11483070B2 (en) * 2020-12-04 2022-10-25 Eric Clifton Roberts Systems, methods, and devices for infrared communications
US12057919B2 (en) * 2021-01-14 2024-08-06 Qualcomm Incorporated Reporting angular offsets across a frequency range

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140313376A1 (en) * 2012-01-10 2014-10-23 Softkinetic Sensors Nv Processing of time-of-flight signals
US20160353989A1 (en) * 2005-12-14 2016-12-08 Digital Signal Corporation System and Method for Tracking Motion
US9915528B1 (en) * 2014-08-13 2018-03-13 Amazon Technologies, Inc. Object concealment by inverse time of flight

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7663502B2 (en) * 1992-05-05 2010-02-16 Intelligent Technologies International, Inc. Asset system control arrangement and method
US6198528B1 (en) * 1998-05-22 2001-03-06 Trimble Navigation Ltd Laser-based three dimensional tracking system
US20010005830A1 (en) * 1999-12-27 2001-06-28 Tadashi Kuroyanagi Information medium with respect to food and drink, health control terminal and health control support system
DE102006050303A1 * 2005-12-05 2007-06-14 Cedes Ag Sensor arrangement and sensor device for a sensor arrangement
US8659658B2 (en) * 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US9365303B2 (en) * 2012-05-03 2016-06-14 Raytheon Company Position and elevation acquisition for orbit determination
EP2703836B1 2012-08-30 2015-06-24 Softkinetic Sensors N.V. TOF illumination system and TOF camera and method of operation, with control means for electronic devices located in the scene
US9791921B2 (en) 2013-02-19 2017-10-17 Microsoft Technology Licensing, Llc Context-aware augmented reality object commands
US9118435B2 (en) * 2013-03-08 2015-08-25 National Chiao Tung University Integrated passive optical network (PON) system
US9459696B2 (en) * 2013-07-08 2016-10-04 Google Technology Holdings LLC Gesture-sensitive display
DE102014115310A1 * 2014-10-21 2016-04-21 Infineon Technologies Ag Image generating devices and a time-of-flight image generating method
US9823352B2 (en) * 2014-10-31 2017-11-21 Rockwell Automation Safety Ag Absolute distance measurement for time-of-flight sensors
US20160183057A1 (en) * 2014-12-18 2016-06-23 Intel Corporation Method and system for hybrid location detection
US10277317B2 (en) * 2015-02-10 2019-04-30 Brightcodes Technologies Ltd. System and method for providing optically coded information
US10397546B2 (en) * 2015-09-30 2019-08-27 Microsoft Technology Licensing, Llc Range imaging
US20170351336A1 (en) * 2016-06-07 2017-12-07 Stmicroelectronics, Inc. Time of flight based gesture control devices, systems and methods
CA3028156A1 (fr) * 2016-06-30 2018-01-04 Bossa Nova Robotics Ip, Inc. Systeme a cameras multiples pour suivi d'inventaire
DE102017126378A1 2017-11-10 2019-05-16 Infineon Technologies Ag Method for processing a raw image of a time-of-flight camera, image processing device and computer program
DE102017128369A1 2017-11-30 2019-06-06 Infineon Technologies Ag Device and method for localizing a first component, localization device and method for localization


Also Published As

Publication number Publication date
US10630384B2 (en) 2020-04-21
CN110662162B (zh) 2024-07-05
US10944474B2 (en) 2021-03-09
US20190386744A1 (en) 2019-12-19
CN110662162A (zh) 2020-01-07
US20200244358A1 (en) 2020-07-30

Similar Documents

Publication Publication Date Title
US10944474B2 (en) Dual-mode optical devices for time-of-flight sensing and information transfer
Afzalan et al. Indoor positioning based on visible light communication: A performance-based survey of real-world prototypes
US10718849B2 (en) Wireless beacon-enabled luminaire identification system and method for determining the position of a portable device
US11789150B2 (en) Localization apparatus and method
EP3049821B1 (fr) Cartographie photo hybride extérieure
US10062178B2 (en) Locating a portable device based on coded light
CN107683420A (zh) 位置跟踪系统和方法
JP2018535410A (ja) 3次元空間検出システム、測位方法及びシステム
US10694328B2 (en) Method of locating a mobile device in a group of mobile devices
US20160005174A1 (en) System and method for synchronizing fiducial markers
Köhler et al. Tracksense: Infrastructure free precise indoor positioning using projected patterns
US11269061B2 (en) System and method of scanning and aquiring images of an environment
Jiang et al. VisBLE: Vision-enhanced BLE device tracking
Wen et al. Enhanced pedestrian navigation on smartphones with vlp-assisted pdr integration
JP2017055446A (ja) カメラを持つリモコンからの制御フィーチャの推定
US20180143313A1 (en) Tracking using encoded beacons
US10830875B2 (en) Auxiliary apparatus for a lighthouse positioning system
US11762096B2 (en) Methods and apparatuses for determining rotation parameters for conversion between coordinate systems
Bera et al. A Truly 3D Visible Light Positioning System using Low Resolution High Speed Camera, LIDAR, and IMU Sensors
Billah et al. Fusing Computer Vision and Wireless Signal for Accurate Sensor Localization in AR View
Hong et al. Line-of-sight signaling and positioning for mobile devices
TW202426962A Positioning method, operation method and positioning device
KR20230055387A User terminal and computer program supporting deep learning-based indoor sketch-map generation and augmented-reality indoor navigation

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200618

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20211122