EP3639122B1 - Object detection and motion identification using electromagnetic radiation - Google Patents


Info

Publication number
EP3639122B1
Authority
EP
European Patent Office
Prior art keywords
emitters
electromagnetic radiation
receivers
activated
power level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP18821570.1A
Other languages
German (de)
English (en)
Other versions
EP3639122A4 (fr)
EP3639122A1 (fr)
Inventor
Juan Pablo Forero CORTES
Santiago Ortega Avila
Sajid Sadi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of EP3639122A4
Publication of EP3639122A1
Application granted
Publication of EP3639122B1
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0325 Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/046 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • This disclosure generally relates to electronic detection of an object.
  • Smart devices and electronic systems may monitor health, daily routines, activities, habits, preferences, etc. of a user.
  • The monitoring may be achieved through interactions with smart devices and electronic systems.
  • Such interactions may be touchless for a variety of considerations such as hygiene, security, convenience, increased functionality, etc.
  • In-the-air hand interactions, without touching any controllers, may be desired to enable effective interaction between humans and smart devices and electronic systems.
  • In-the-air hand interactions may have a variety of advantages including providing larger interaction space between a user and a device or system, more freedom for a user during the interaction, eliminating hygiene issues as no touch is needed, convenience, privacy protection, intuitiveness, etc.
  • In-the-air interactions may require effective detection and localization of objects, motion tracking of objects, and motion identification of objects.
  • Existing solutions for the aforementioned tasks mainly use different combinations of sensors, including ultrasound, laser, magnetic field, cameras, non-focused light receivers, and RADAR. However, these solutions are relatively expensive, demand substantial computational and processing resources, and have relatively poor energy efficiency.
  • US 2015/258432 A1 is concerned with a method for enhancing detection of a user's hand relative to a head-mounted display (HMD).
  • US 8515128 B1 is concerned with a method for detecting objects hovering above a display as a mechanism for user input.
  • US 2014/125813 A1 is concerned with an imaging system and methods that optimize illumination of objects for purposes of detection, recognition, and/or tracking by tailoring the illumination to the position of the object within the detection space.
  • EP 2738647 A1 is concerned with a method for identifying contactless gestures on an electronic device by monitoring an amplitude of received electromagnetic radiation, detecting a proximity event, continuing to monitor the amplitude of the received electromagnetic radiation, and performing an analysis.
  • The touchless sensing system described herein may allow a user to interact with it using in-the-air hand interactions.
  • The touchless sensing system may use electromagnetic radiation (e.g., near-infrared light) to sense in-the-air hand interactions by tracking hand movement and detecting gestures performed in a three-dimensional (3D) interaction space, at low cost and with a low computational load.
  • The touchless sensing system may use one or more infrared LEDs (light-emitting diodes) and one or more photodiodes to sense in-the-air hand interactions.
  • The acquired data corresponding to the interactions may be compact and easy to process, which makes the system more economical.
  • The frames of the data may be only a few bytes in length, which may allow a combination of efficient computational techniques to be used to reconstruct and process the data.
  • the touchless sensing system may use a set of algorithms executed by a microcontroller with a single core and a set of sensors to analyze the data and obtain accurate detection of in-the-air interactions.
  • Particular embodiments of the touchless sensing system described herein may have a relatively high sensing resolution, particularly when compared to the relatively minimal processing and energy requirements.
  • Particular embodiments of the touchless sensing system may take a small and adaptable form so it can be embedded wherever necessary according to the requirements of applications.
  • Particular embodiments of the touchless sensing system may include efficient algorithms to process data acquired by sensors in real time with minimum computational cost, which may enable in-the-air interactions to be applied to a broader range of devices and systems.
  • Although this disclosure describes and illustrates particular touchless sensing systems for detecting, tracking, and identifying in-the-air objects and motions, this disclosure contemplates any suitable touchless sensing systems for detecting, tracking, and identifying in-the-air objects and motions in any suitable manner.
  • FIG. 1 illustrates an example touchless sensing system in a variety of use cases.
  • a touchless sensing system may detect that an object (e.g., hand) is in its field of view, such as a field of view of one or more of the LEDs or photodiodes of the touchless sensing system.
  • the touchless sensing system may further use stochastic (probabilistic) methods for estimating whether a detected object is present and/or moving.
  • the touchless sensing system may additionally use one or more algorithms for identifying the motion of the detected object (e.g., hand gesture).
  • a touchless sensing system may include a plurality of sensing modules based on a plurality of components that have simple structures, low cost, and low power consumption. For example, these components may include LEDs and photodiodes.
  • Using simple, cheap and power-efficient components and stochastic methods may enable a touchless sensing system to have one or more advantages.
  • One advantage may include saving power with respect to both battery and computation.
  • Another advantage may include having no requirement to calculate the time of flight of a projectile or object.
  • a touchless sensing system may use relatively simple mathematical calculations. Particular embodiments of the touchless sensing system described herein may also be small, adaptable, cheap and computationally efficient.
  • a touchless sensing system may be integrated with a plurality of devices and systems for a plurality of use cases. For example, as illustrated in FIG. 1 , a touchless sensing system may be integrated with a smart watch, car system, smart phone, public display, laptop, medical device, home appliance, and/or smart TV.
  • Although this disclosure illustrates a particular scenario involving in-the-air interaction with a touchless sensing system and a plurality of particular use cases, this disclosure contemplates any suitable scenario involving in-the-air interaction(s) with any suitable touchless sensing systems and any suitable use cases in any suitable manner.
  • FIG. 2 illustrates an example prototype of a touchless sensing system 200.
  • the touchless sensing system 200 may include a plurality of emitters 202 of electromagnetic radiation.
  • the electromagnetic radiation may include near infrared (NIR) light.
  • Each emitter 202 may include one or more light-emitting diodes (LEDs) and may correspond to a different field of view. For instance, each emitter 202 can be directed or pointed toward a different field of view.
  • the touchless sensing system 200 may also include a plurality of receivers 203 of electromagnetic radiation. Each receiver 203 may include one or more photodiodes and may correspond to a different field of view.
  • the touchless sensing system 200 may also include one or more non-transitory storage media embodying instructions.
  • the touchless sensing system 200 may additionally include one or more processors.
  • the one or more processors may be operable to execute the instructions to activate at least some of the emitters 202 according to an illumination pattern.
  • the one or more processors may be operable to execute the instructions to also detect a presence of an object or a motion of the object in a corresponding field of view of at least one of the receivers 203, based at least on the illumination pattern and on electromagnetic radiation received or detected by one or more receivers 203.
  • the touchless sensing system 200 may include a plurality of sensing modules 201.
  • Each of the plurality of sensing modules 201 may include one or more emitters 202 of the plurality of emitters 202 of electromagnetic radiation.
  • Each emitter 202 in the same sensing module 201 may have the same field of view or a different field of view.
  • Each of the plurality of sensing modules 201 may also include one or more receivers 203 of the plurality of receivers 203 of electromagnetic radiation.
  • Each of the plurality of sensing modules 201 may additionally include one or more microcontrollers.
  • the one or more microcontrollers of each of the plurality of sensing modules 201 may be configured to communicate with the control board 205.
  • the one or more microcontrollers of each of the plurality of sensing modules 201 may be configured to also modulate the electromagnetic radiation emitted by the one or more emitters 202 of the corresponding sensing module 201.
  • the one or more microcontrollers of each of the plurality of sensing modules 201 may be configured to additionally regulate emission power of the one or more emitters 202 of the corresponding sensing module 201.
  • the one or more microcontrollers of each of the plurality of sensing modules 201 may be configured to further process the electromagnetic radiation received by the one or more receivers 203 of the corresponding sensing module 201.
  • While this disclosure describes various aspects of sensors, receivers, and microcontrollers within a sensing module, this disclosure contemplates that those components may be implemented in a touchless sensing system separately from a sensing module, that a sensing module may include only some of those components, and/or that a sensing module may include additional components of the touchless sensing system.
  • the touchless sensing system 200 may also include one or more control modules 204.
  • The touchless sensing system 200 may have a modular design, which may allow for different physical distributions of sensing modules 201 and different numbers of sensing modules 201 in the touchless sensing system 200.
  • the modular design may also allow for better scalability of the touchless sensing system 200.
  • the touchless sensing system 200 may include eight sensing modules 201, one control module 204, and one control board 205 across the surface of the touchless sensing system 200, as illustrated in FIG. 2 .
  • the control module 204 may be located at the center of the control board 205.
  • FIG. 3 illustrates another example prototype of the touchless sensing system 200.
  • the touchless sensing system 200 may include five sensing modules 201, one control module 204 and one control board 205.
  • Four sensing modules 201 may be located at the corners of the control board 205 and the remaining one may be located at the center of the control board 205.
  • the touchless sensing system 200 may prioritize the plurality of sensing modules 201 differently based on different tasks.
  • The sensing modules at the edges of a touchless sensing system, such as the sensing modules 201 at the corners of the control board 205, may be prioritized over other sensing modules 201 if the touchless sensing system 200 is identifying a motion of an object (e.g., a hand gesture).
  • each of the plurality of sensing modules 201 may have the same priority if the touchless sensing system 200 is determining a position of an object.
  • the touchless sensing system 200 may prioritize the plurality of sensing modules 201 using both the aforementioned strategies if the touchless sensing system 200 is tracking a motion of an object.
  • the touchless sensing system 200 may activate the emitters in the plurality of sensing modules 201 with different patterns (e.g., illumination patterns).
  • the touchless sensing system 200 may activate the emitters in the plurality of sensing modules 201 of the example prototype in FIG. 2 sequentially.
  • The touchless sensing system 200 may activate the emitters in one sensing module 201 at a time (with all emitters of that module active at the same time) and activate all receivers of all sensing modules after activating the emitters in the one sensing module.
  • The emitters of the other sensing modules 201 remain inactive.
  • the touchless sensing system 200 may proceed to activate the emitters in other sensing modules 201 in this way until all the sensing modules 201 have been activated.
  • the touchless sensing system 200 may activate emitters in the plurality of sensing modules 201 of the example prototype in FIG. 3 in the following way.
  • the touchless sensing system 200 may first activate the emitters in the sensing modules 201 at the corners of the control board 205, and then activate the emitters in the sensing module 201 at the center of the control board 205. The touchless sensing system 200 may also then activate the emitters of all the sensing modules 201 sequentially as illustrated in the previous example. As this example illustrates, a touchless sensing system may implement a plurality of different illumination patterns in sequence. Although this disclosure illustrates particular ways to activate the emitters of various sensing modules, this disclosure contemplates any suitable way to activate the sensing modules in any suitable manner.
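The one-module-at-a-time activation described above can be sketched in Python. The `SensingModule` class and its methods are hypothetical stand-ins for the hardware interface, not part of the patent:

```python
class SensingModule:
    """Minimal stand-in for one sensing module (emitters + receivers)."""
    def __init__(self, name):
        self.name = name
        self.emitting = False

    def activate_emitters(self):
        self.emitting = True

    def deactivate_emitters(self):
        self.emitting = False

    def read_receivers(self):
        # In hardware this would return the received NIR power;
        # a constant is used here as a placeholder.
        return 0


def sequential_scan(modules):
    """Activate the emitters of one module at a time, then sample the
    receivers of every module before moving to the next module."""
    frames = []
    for active in modules:
        active.activate_emitters()
        # Only this module's emitters are active; all receivers are read.
        frames.append([m.read_receivers() for m in modules])
        active.deactivate_emitters()
    return frames


modules = [SensingModule(f"M{i}") for i in range(8)]  # FIG. 2 prototype
frames = sequential_scan(modules)  # one frame per activated module
```

A full scan therefore yields one frame per module, with every module's emitters back off at the end.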
  • the touchless sensing system 200 may be configured with particular parameters with respect to different components.
  • the distance between two of the plurality of sensing modules 201 may be a particular value.
  • the distance may be 2 cm or less.
  • the height of the isolation walls of the plurality of sensing modules 201 may be a particular value.
  • the height may be 4 mm.
  • the material of the isolation walls of the plurality of sensing modules 201 may be a particular type of material.
  • the material may be photopolymer resin RS-F2-GPBK-04.
  • the power used by the emitters of the plurality of sensing modules 201 may be at a particular level.
  • Each sensing module 201 may emit electromagnetic radiation at 2.6 mW in bursts of 1200 µs.
  • Each sensing module 201 may be active for approximately 100 ms within one second.
  • The example prototype of the touchless sensing system 200 as illustrated in FIG. 2 may have eight sensing modules 201. As a result, all eight sensing modules 201 may consume approximately 2 mW on average.
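The ~2 mW figure follows from simple arithmetic: each module emits at 2.6 mW but is active only about 100 ms out of every second, a 10% duty cycle (both values taken from the text above):

```python
# Average-power estimate for the eight-module prototype of FIG. 2.
emission_power_mw = 2.6    # power while a module is emitting
active_fraction = 0.100    # ~100 ms active per second (10% duty cycle)
num_modules = 8

avg_per_module_mw = emission_power_mw * active_fraction  # ~0.26 mW
total_avg_mw = avg_per_module_mw * num_modules           # ~2.08 mW
```

The result, roughly 2.08 mW, matches the approximately 2 mW quoted above.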
  • the field of view of the receivers of the plurality of sensing modules 201 may be of particular extent.
  • The field of view may be 150 degrees (i.e., −75° to +75° relative to the direction normal to the sensor). If an emitter is placed at an angle less than −75°, the receiver may receive half or less of that emitter's power.
  • Although this disclosure illustrates particular parameters of the touchless sensing system, this disclosure contemplates any suitable parameters of the touchless sensing system in any suitable manner.
  • The touchless sensing system 200 may modulate the emitted NIR light at 57.6 kHz and send it in short bursts to reduce environmental noise such as sunlight or fluorescent indoor light.
  • the photodiode of a receiver 203 may have a wavelength sensitivity peak at 940 nm, a band pass filter and an automatic gain control module.
  • The receivers 203 may detect the envelope of the NIR light and dynamically adjust the gain to compensate for different ambient light.
  • the touchless sensing system 200 may fully control the emission power of each of the emitters 202 of each sensing module 201 through the corresponding microcontroller using digital potentiometers.
  • the touchless sensing system 200 may precisely manage the total energy radiated by each sensing module 201.
  • The control board 205 may include a microcontroller, a power management section, and a Bluetooth transceiver. The control board 205 may synchronize all the sensing modules 201.
  • the control board 205 may provide a stable power source for the touchless sensing system 200, coordinate all the sensing modules 201, cluster all the raw data samples collected from the receivers 203, and effectively handle communication, such as Bluetooth communication, with an entity that processes and models raw data samples from the receivers 203.
  • the physical distribution of sensing modules 201 may have a direct impact on the spatial resolution in a 2D space measured by X and Y axes, and on the field of view of the touchless sensing system 200.
  • the physical distribution of sensing modules 201 illustrated in FIG. 2 may support a spatial resolution of two centimeters along the X and Y axes and a resolution of fifty centimeters depth range along the Z axis.
  • the touchless sensing system 200 may adjust the emission power of the emitters 202, which may enable the touchless sensing system to determine a depth of an object.
  • the receivers 203 may perceive less reflected NIR light from objects located at a greater distance.
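One hedged way to turn the multi-power-level scan into a coarse depth estimate is to note which emission phase first produces a detectable reflection: a nearby object reflects enough light even in the Low Energy phase, while a distant object only registers in the High Energy phase. The threshold and the three-bin mapping below are illustrative assumptions, not values from the patent:

```python
DETECTION_THRESHOLD = 30  # assumed minimum byte value (0-255) for "detected"


def coarse_depth(readings_by_level):
    """Map per-power-level receiver readings to a coarse depth bin.

    readings_by_level: dict of power level -> received power (0-255).
    If the lowest emission power already yields a reflection, the object
    is near; if only the highest power does, it is far.
    """
    for level, depth in (("low", "near"), ("medium", "mid"), ("high", "far")):
        if readings_by_level.get(level, 0) >= DETECTION_THRESHOLD:
            return depth
    return "none"  # no reflection at any power level


near_reading = coarse_depth({"low": 80, "medium": 120, "high": 200})
far_reading = coarse_depth({"low": 0, "medium": 5, "high": 60})
```

This is only a sketch of the idea that emission-power adjustment supports depth sensing; the patent does not specify the exact mapping.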
  • FIG. 4 illustrates an example structure of the touchless sensing system 200 detecting an object and/or identifying a motion.
  • the touchless sensing system 200 may include both hardware and software.
  • the hardware may include one or more control modules 204 and a plurality of sensing modules 201.
  • the hardware may sense NIR light emitted by one or more emitters.
  • the software may include a Low Level Logic 600 and a High Level Logic 700.
  • the Low Level Logic 600 may control the hardware including both the control modules 204 and the sensing modules 201.
  • the Low Level Logic 600 may generate a plurality of illumination patterns, modulate and control the emission power of the emitters 202, acquire raw data samples from the plurality of receivers 203 and construct a raw data frame based on the acquired raw data samples.
  • the High Level Logic 700 may reconstruct the raw data frame, interpret it and process it for a plurality of tasks.
  • the tasks may include determining the presence of an object, determining the position of an object with respect to the X and Y axes (e.g., a plane parallel to a plane in which the emitters are distributed), determining the angle of a motion of the object, and identifying the motion of the object on the X-Y plane (e.g., a hand gesture) and along the Z axis (e.g., a push action).
  • Although this disclosure illustrates a particular structure of the touchless sensing system, this disclosure contemplates any suitable structure of the touchless sensing system in any suitable manner.
  • FIG. 5 illustrates an example back-end flow diagram of the touchless sensing system 200.
  • the touchless sensing system 200 may start with step 510.
  • the touchless sensing system 200 may perform a pre-detection of a presence of an object in the field of view.
  • one sensing module 201 of the touchless sensing system 200 may emit NIR light at a predetermined interval, such as every second, to detect the presence of an object and start low-level processing.
  • the touchless sensing system 200 may determine whether to implement one or more subsequent steps.
  • the one or more subsequent steps may include implementing a Low Level Logic 600 and implementing a High Level Logic 700.
  • the touchless sensing system 200 may perform an initialization to identify the type of the object detected at step 510.
  • the type of the object may include a hand, glove, or metal.
  • the touchless sensing system 200 may further adjust the power level of the emitters 202 with respect to the NIR light based on the type of the object.
  • the touchless sensing system 200 may implement the Low Level Logic 600.
  • the touchless sensing system 200 may implement the High Level Logic 700.
  • FIG. 6 illustrates an example functional diagram of the operation of the Low Level Logic 600 corresponding to the example prototype of the touchless sensing system 200 illustrated in FIG. 2 .
  • the Low Level Logic 600 may control the hardware of the touchless sensing system 200.
  • the hardware may include both the control modules 204 and the sensing modules 201.
  • The Low Level Logic 600 in FIG. 6 may include a Master logic 601 and a Slave logic 602 running on different microprocessors.
  • the Low Level Logic 600 may be embedded in a single microcontroller unit (MCU).
  • the Low Level Logic 600 may include a customized serial communication protocol.
  • the communication time between all the modules of the Low Level Logic 600 may be less than a threshold amount of time.
  • the Low Level Logic 600 may additionally include an extra channel for byte detection and synchronization of a raw data vector, and a fixed set of broadcast commands.
  • the Low Level Logic 600 may first initialize different modules at step 603.
  • the Low Level Logic 600 may generate a plurality of illumination patterns. Based on one or more of the plurality of illumination patterns, the Master logic 601 may determine, for the plurality of emitters 202 of electromagnetic radiation, a power level selected from a plurality of pre-defined power levels at step 604. The Master logic 601 may send a corresponding command at step 608 to the Slave logic 602. The Slave logic 602 may receive such a command at step 609 and run the command. The Master logic 601 may further instruct the plurality of emitters 202 of electromagnetic radiation to emit electromagnetic radiation at the selected power level at step 605.
  • the Master logic 601 may send a corresponding command at step 608 to the Slave logic 602.
  • the Slave logic 602 may receive the command at step 609 and run the command.
  • the pre-defined power levels may include Low Energy, Medium Energy, and High Energy.
  • Each power level may determine a distinct phase.
  • the Low Level Logic 600 may control the plurality of emitters 202 and the plurality of receivers 203 in the following way.
  • the Low Level Logic 600 may sequentially activate the one or more emitters 202 of each of the plurality of sensing modules 201 to emit electromagnetic radiation at the determined power level, such that the emitters 202 of only one sensing module 201 are active at a time.
  • the touchless sensing system 200 may proceed to step 606 of sensing raw data samples.
  • the Master logic 601 may also send a corresponding command at step 608 to the Slave logic 602.
  • the Slave logic 602 may receive the command at step 609 and run the command.
  • the touchless sensing system 200 may receive, at one or more of the plurality of receivers 203, reflected electromagnetic radiation corresponding to the determined power level each time after the one or more emitters 202 are activated for each of the plurality of sensing modules 201.
  • the one or more of the plurality of receivers 203 may be selected based on whether the reflected electromagnetic radiation reaches the one or more of the plurality of receivers 203.
  • the touchless sensing system 200 may sequentially activate the one or more emitters 202 of the plurality of sensing modules 201, sense the raw data samples, and repeat the process until all the sensing modules 201 have been activated. For example, the process may be repeated eight times if the touchless sensing system 200 includes eight sensing modules 201, as illustrated in FIG. 2 .
  • the Low Level Logic 600 may determine another power level and start another phase accordingly. The Low Level Logic 600 may continue the cycle of determining a power level, activating emitters 202 and receiving reflected electromagnetic radiation at receivers 203 until all the pre-determined power levels have been selected. For example, the cycle may be repeated three times corresponding to the three power levels, as illustrated in FIG. 6 .
  • the cycle may be repeated continuously at a 30 Hz rate.
  • the Master logic 601 may collect the raw data samples received from the receivers 203.
  • the Master logic 601 may then compile the raw data samples at step 607.
  • the Master logic 601 may further provide the compiled raw data samples as a raw data vector 610 to the High Level Logic 700.
  • The Low Level Logic 600 may generate the raw data vector including a plurality of bytes. Each of the plurality of bytes is based on received electromagnetic radiation corresponding to each of the plurality of pre-defined power levels with respect to each of the plurality of sensing modules 201.
  • For example, as illustrated in FIG. 6, the Low Level Logic may output raw data of 24 bytes, with each byte representing the power of the electromagnetic radiation received by a sensing module at one of the power levels.
  • If a sensing module includes a plurality of receivers, a byte may represent the average power of the electromagnetic radiation received by the receivers of that sensing module.
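Assembling the 24-byte raw data vector (three power levels times eight sensing modules) can be sketched as follows; `read_module_power` is a hypothetical placeholder for the hardware read, not an API from the patent:

```python
POWER_LEVELS = ("low", "medium", "high")  # the three pre-defined levels
NUM_MODULES = 8                           # FIG. 2 prototype


def read_module_power(module_index, level):
    # Placeholder for sampling the receivers of one module after its
    # emitters fired at `level`; returns a 0-255 power reading.
    return 0


def build_raw_vector():
    """One byte per (power level, sensing module) pair: 3 x 8 = 24 bytes."""
    raw = bytearray()
    for level in POWER_LEVELS:             # each power level is one phase
        for module in range(NUM_MODULES):  # modules activated sequentially
            raw.append(read_module_power(module, level) & 0xFF)
    return bytes(raw)


raw_vector = build_raw_vector()
```

The resulting 24-byte vector corresponds to the raw data vector 610 handed to the High Level Logic 700.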
  • the Low Level Logic 600 may activate the plurality of emitters 202 using a different illumination pattern than that described above.
  • the Low Level Logic 600 may activate the plurality of emitters 202 of the example prototype of the touchless sensing system 200 illustrated in FIG. 3 as follows.
  • the Low Level Logic 600 may first activate the one or more emitters 202 of each of the plurality of sensing modules 201 at the corners of the control board 205 to emit electromagnetic radiation at the determined power level, such that the emitters 202 of all the sensing modules 201 at the corners are active at the same time.
  • the Low Level Logic 600 may then activate the one or more emitters 202 of the sensing module 201 at the center of the control board 205 to emit electromagnetic radiation at the determined power level.
  • the Low Level Logic 600 may further sequentially activate the one or more emitters 202 of each of the plurality of sensing modules 201 to emit electromagnetic radiation at the determined power level, such that the emitters 202 of only one sensing module 201 are active at a time.
  • FIG. 7 illustrates an example functional diagram of the operation of the High Level Logic 700.
  • the High Level Logic 700 may be embedded in a single MCU.
  • the High Level Logic 700 may achieve a processing rate of 1.5 milliseconds per data frame with a 180 MHz 512 KB flash and 128 KB RAM microprocessor.
  • the High Level Logic 700 may receive raw data, such as the raw data vector 610, from the Low Level Logic 600.
  • the raw data vector may include a plurality of bytes, each corresponding to the receivers of one sensing module 201 at one power level. For example, the raw data vector 610 corresponding to the example prototype of FIG. 3 may include twenty-four bytes.
  • the High Level Logic 700 may include a plurality of modules.
  • the High Level Logic 700 illustrated in FIG. 7 may include Map Matrices 701 and Sensors Raw 702.
  • the High Level Logic 700 may generate a plurality of data matrices corresponding to the plurality of pre-defined power levels respectively, based on the raw data vector 610.
  • the plurality of data matrices may map the total energy received by the sensing modules 201 into a virtual physical space.
  • each matrix 703, 704, and 705 may represent a region of X-Y space in the field of view of the touchless sensing system.
  • the region of X-Y space is divided into 28 cells (or zones). Each zone corresponds to a set of X-Y coordinates in physical space. While the virtual physical space represented by matrices 703, 704, and 705 is divided into 28 rectangular zones arranged in a particular geometric configuration, this disclosure contemplates dividing an X-Y region of space in the field of view of a touchless sensing system into any suitable number of zones arranged in any suitable geometric shape.
  • the High Level Logic 700 may generate three data matrices including Low Energy Matrix 703, Medium Energy Matrix 704, and High Energy Matrix 705, corresponding to the power levels of Low Energy, Medium Energy, and High Energy, respectively.
  • Each data matrix may include twenty-eight bytes corresponding to twenty-eight different zones.
  • the number of zones, the location of zones in the virtual physical space, or both may depend on the number of sensing modules used and the physical configuration of those sensors.
  • Each of these twenty-eight zones is a representation of a location in the virtual physical space.
  • Each sensing module 201 may have a fixed physical location in the touchless sensing system 200. As a result, each sensing module 201 may be associated with a particular zone or number of zones.
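One way to picture how per-module energy is mapped into zones of the virtual physical space; the zone associations below are invented for illustration, since the real mapping depends on the physical configuration of the modules:

```python
NUM_ZONES = 28   # the example X-Y region is divided into 28 zones

# Hypothetical fixed association between eight sensing modules and zone indices.
MODULE_ZONES = {0: (0, 1), 1: (3, 4), 2: (6, 7), 3: (9, 10),
                4: (13, 14), 5: (17, 18), 6: (21, 22), 7: (25, 26)}

def map_matrix(module_bytes):
    """Spread each module's received energy into its associated zones, producing
    one 28-cell data matrix for a single power level."""
    zones = [0] * NUM_ZONES
    for module, energy in enumerate(module_bytes):
        for z in MODULE_ZONES[module]:
            zones[z] += energy
    return zones
```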
  • the High Level Logic 700 may determine the number of activated emitters 202 corresponding to each of the plurality of pre-defined power levels respectively at step 715.
  • FIG. 7 illustrates a particular functional diagram of the operation of the High Level Logic, this disclosure contemplates any suitable functional diagram of the operation of the High Level Logic in any suitable manner.
  • the High Level Logic 700 may determine the position of an object based on a plurality of steps.
  • the object may be hovering within the field of view of the touchless sensing system 200.
  • the plurality of steps may include Compensate Matrix 712, Calculate Position 713 and Filter Raw Positions 714 as illustrated in FIG. 7 .
  • the High Level Logic 700 may implement Compensate Matrix 712 in the following way.
  • the High Level Logic 700 may first filter the High Energy Matrix 705 to eliminate noisy information from the raw data samples received from the receivers 203.
  • the raw data samples may contain information associated with both a hand and an arm of a person.
  • the High Level Logic 700 may subtract energy from the lower zones of the High Energy Matrix 705 if energy is present in the upper zones of the High Energy Matrix 705.
  • the upper zones of the High Energy Matrix 705 may be normalized and averaged out and the lower zones of the High Energy Matrix 705 may be reduced accordingly.
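A minimal sketch of Compensate Matrix 712 under these assumptions: the upper-zone energy (likely the arm) is averaged, and that average is subtracted from the lower zones:

```python
def compensate_matrix(matrix, upper_zones, lower_zones):
    """If energy is present in the upper zones, average it and reduce the lower
    zones accordingly, so arm reflections do not masquerade as the hand."""
    upper = [matrix[z] for z in upper_zones]
    if not any(upper):
        return list(matrix)              # nothing to compensate
    avg_upper = sum(upper) / len(upper)
    out = list(matrix)
    for z in lower_zones:
        out[z] = max(0, out[z] - avg_upper)
    return out
```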
  • the High Level Logic 700 may perform Calculate Position 713 after the High Energy Matrix 705 is compensated.
  • Each of the plurality of energy zones may have an energy and an associated location in the virtual physical space. Therefore, the High Level Logic 700 may weight each point in a set of X-Y coordinates based on the received energy at that point. The weighted points may be noisy.
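The energy-weighted position estimate described above can be sketched as a centroid computation (zone coordinates are supplied by the caller):

```python
def calculate_position(zone_energy, zone_xy):
    """Weight each zone's X-Y coordinates by the energy received there and
    return the energy-weighted centroid as the raw position of the object."""
    total = sum(zone_energy)
    if total == 0:
        return None                      # no object in the field of view
    x = sum(e * xy[0] for e, xy in zip(zone_energy, zone_xy)) / total
    y = sum(e * xy[1] for e, xy in zip(zone_energy, zone_xy)) / total
    return (x, y)
```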
  • the High Level Logic 700 may perform Filter Raw Positions 714 by implementing a plurality of filters.
  • the plurality of filters may include two filters, one for the coordinate on the X axis and the other for the coordinate on the Y axis.
  • the plurality of filters may be One Euro filters.
  • The One Euro filter is an adaptive low-pass filter. A benefit of these filters may include effectively cleaning the raw position of the object, which may further compensate for both slow and fast movement of the object.
  • the High Level Logic 700 may take the filtered coordinates as the determined position of the object.
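A compact implementation of the One Euro filter (Casiez et al.), the adaptive low-pass filter mentioned above; one instance would be kept per axis, and the parameter defaults are illustrative:

```python
import math

class OneEuroFilter:
    """One Euro filter: an exponential smoother whose cutoff frequency rises
    with the signal's speed, so slow movement is smoothed strongly while fast
    movement stays responsive."""
    def __init__(self, freq, min_cutoff=1.0, beta=0.0, d_cutoff=1.0):
        self.freq, self.min_cutoff, self.beta, self.d_cutoff = freq, min_cutoff, beta, d_cutoff
        self.x_prev = self.dx_prev = None

    def _alpha(self, cutoff):
        tau = 1.0 / (2 * math.pi * cutoff)
        return 1.0 / (1.0 + tau * self.freq)

    def __call__(self, x):
        if self.x_prev is None:                  # first sample passes through
            self.x_prev, self.dx_prev = x, 0.0
            return x
        dx = (x - self.x_prev) * self.freq       # estimate the signal's speed
        a_d = self._alpha(self.d_cutoff)
        dx_hat = a_d * dx + (1 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)   # adapt the cutoff
        a = self._alpha(cutoff)
        x_hat = a * x + (1 - a) * self.x_prev    # exponential smoothing step
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```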
  • the High Level Logic 700 may calculate the energy distribution to estimate the amount of energy received by each zone based on the amount of power of the NIR light perceived by each neighboring sensing module 201 at step 706. The calculation may result in a plurality of energy distributions corresponding to a plurality of pre-defined power levels. As an example and not by way of limitation, three energy distributions including Low Level Energy 707, Medium Level Energy 708, and High Level Energy 709, may be obtained corresponding to the power levels of Low Energy, Medium Energy, and High Energy, respectively.
  • the High Level Logic 700 may further calculate the steadiness of an object at step 710 for determining a motion along the Z axis at step 711 based on the energy distributions.
  • the High Level Logic 700 may calculate hand steadiness at step 710 for determining a push action at step 711.
  • object steadiness may be determined by analyzing the temporal change of energy detected in one or more zones of one or more energy matrices.
  • the High Level Logic 700 may determine an action on the X-Y plane at step 716 based on the determined number of activated emitters 202 and the determined position of the object.
  • determining an action at step 716 may include recognizing a gesture, determining an angle of the gesture, and tracking a position of the object.
  • FIG. 8 illustrates an example flow diagram of the High Level Logic 700 determining an action.
  • determining an action as illustrated in FIG. 8 may include hand tracking 802, gesture recognition 803 and categorizing as undefined action 804.
  • the High Level Logic 700 may first evaluate whether the number of activated emitters 202 is larger than a threshold number at step 801. After this criterion is met, the High Level Logic 700 may proceed to subsequent steps.
  • the High Level Logic 700 may track the motion of the object (e.g., hand tracking) based on the generated data matrices (e.g., Low Energy Matrix 703, Medium Energy Matrix 704, and High Energy Matrix 705) and the determined number of activated emitters 202, if more than a threshold number of emitters 202 are activated for more than a threshold amount of time.
  • the threshold amount of time may be one second.
  • the High Level Logic 700 may identify the motion of the object (e.g., gesture recognition) based on the generated data matrices (e.g., Low Energy Matrix 703, Medium Energy Matrix 704, and High Energy Matrix 705) and the determined number of activated emitters 202 if more than a threshold number of emitters 202 are activated for a duration within a pre-defined range of time.
  • the pre-defined range of time may be more than one hundred milliseconds and less than one second.
  • the High Level Logic 700 may categorize the motion of the object as an undefined action if more than a threshold number of emitters 202 are activated for less than a threshold amount of time.
  • the threshold time may be one hundred milliseconds.
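The three-way routing just described can be summarized in a small dispatcher; the time thresholds are the example values from the text (more than one second for tracking, 100 ms to 1 s for gestures), while the emitter threshold is an assumption:

```python
def classify_action(num_active, duration_s, threshold_emitters=2,
                    t_min=0.1, t_max=1.0):
    """Route to hand tracking, gesture recognition, or undefined action based
    on how long more than a threshold number of emitters stayed activated."""
    if num_active <= threshold_emitters:
        return None                       # criterion at step 801 not met
    if duration_s >= t_max:
        return "hand_tracking"            # step 802: sustained activation
    if duration_s > t_min:
        return "gesture_recognition"      # step 803: within the gesture window
    return "undefined"                    # step 804: too brief to classify
```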
  • the High Level Logic 700 may perform motion identification (e.g. gesture recognition) in the following way.
  • the High Level Logic 700 may first normalize all the data samples with respect to the X and Y axes at step 805.
  • the High Level Logic 700 may then calculate the trajectory along the X axis and the trajectory along the Y axis at step 806.
  • the High Level Logic 700 may then calculate an angle that indicates the relationship between the two coordinates using arctangent at step 807.
  • if the calculated angle indicates a predominantly horizontal trajectory, the High Level Logic 700 may identify the motion as a lateral swipe 808; otherwise, the High Level Logic 700 may identify the motion as a vertical swipe 810. If the motion is a lateral swipe 808, the High Level Logic 700 may implement a linear regression at step 809 to find the equation of the line that best fits all the data points. As a result, the outliers may be corrected for and a more accurate gesture recognition 803 may be achieved. If the motion is a vertical swipe 810, the High Level Logic 700 may sum up all the angles between the first point where the first sample was collected and all the following points at step 811.
  • the High Level Logic 700 may further calculate the average value of the summed angle as the final angle of the gesture. Thus, as described herein, the High Level Logic 700 may determine a particular gesture by analyzing the temporal change in energy of one or more energy zones of the energy matrices described with respect to Fig. 7 . Although this disclosure illustrates a particular flow diagram of the High Level Logic determining an action, this disclosure contemplates any suitable flow diagram of the High Level Logic determining any suitable action in any suitable manner.
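Steps 806-807 can be sketched as follows; the 45-degree split between lateral and vertical swipes is an assumed decision rule, and the regression and angle-averaging refinements of steps 809 and 811 are omitted:

```python
import math

def classify_swipe(xs, ys):
    """Compute the trajectory along each axis and the arctangent angle between
    them, then label the swipe by whether the motion is mostly horizontal."""
    dx = xs[-1] - xs[0]                      # trajectory along the X axis
    dy = ys[-1] - ys[0]                      # trajectory along the Y axis
    angle = math.degrees(math.atan2(dy, dx))
    if abs(angle) <= 45 or abs(angle) >= 135:
        return "lateral_swipe", angle        # mostly horizontal motion (808)
    return "vertical_swipe", angle           # mostly vertical motion (810)
```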
  • the plurality of pre-defined power levels may correspond to a plurality of pre-defined depths with respect to the touchless sensing system 200.
  • the High Level Logic 700 may identify a motion of an object along the Z axis (e.g., a push action of a user) based on the variability of the energy received by the receivers 203.
  • the emitted NIR light at the three power levels (i.e., Low Energy, Medium Energy, and High Energy) may correspond to three different sensing distances.
  • the distances may change based on the surface of the object that reflects the NIR light.
  • the distance associated with a particular energy level may depend on the outcome of the initialization process described above.
  • the correlation between the power of emitted NIR light and the sensing distance may be useful for determining an action along the Z axis. In some embodiments, for the following reasons, such correlation may require additional processing or steps to determine the action.
  • a hand crossing different sensing distances corresponding to different energy levels may not necessarily indicate that a user is performing an action along the Z axis. Instead, the user may be just moving his/her hand across the touchless sensing system 200.
  • the user may perform a push action without crossing different sensing distances corresponding to different energy levels.
  • the pattern of a push action along the Z axis crossing different sensing distances corresponding to different energy levels may be similar to that of a motion along the X and Y axes.
  • the High Level Logic 700 may determine a push action based on a plurality of steps including Calculate Energy Distribution 706, Calculate Steadiness 710, and Determine Push 711.
  • FIG. 9 illustrates an example flow diagram of the High Level Logic 700 calculating the energy distribution.
  • the High Level Logic 700 may first mask the Medium Energy Matrix 704 and High Energy Matrix 705 using Mask Probabilities 901, respectively.
  • the Mask Probabilities 901 may contain a plurality of probabilities corresponding to each of the plurality of zones, respectively. As an example and not by way of limitation, the probability may indicate how likely the energy detected at a corresponding zone is reflected from a hand instead of an arm.
  • the High Level Logic 700 may then process the Masked Medium Energy Matrix 902 and the Masked High Energy Matrix 903 separately.
  • the High Level Logic 700 may select the four areas with the highest energy at step 904 and draw an area around these four areas. The High Level Logic 700 may then generate an influence-area defined high energy distribution matrix 908 for the Masked High Energy Matrix 903 after step 906. As a result, all the high energy may be concentrated in a specific area. In particular embodiments, the High Level Logic 700 may additionally use the Masked Medium Energy Matrix 902 to correct and smooth the influence-area defined high energy distribution matrix 908. With respect to the Masked Medium Energy Matrix 902, the High Level Logic 700 may first select the two areas with the highest probabilities at step 904.
  • the High Level Logic 700 may then mask the Masked Medium Energy Matrix 902 with two highest probabilities using Gaussian distribution at step 905.
  • a medium energy distribution matrix 907 for the Masked Medium Energy Matrix 902 may be generated accordingly and the two highest probabilities may indicate the peaks of two Gaussian distributions.
  • the High Level Logic 700 may then create a cluster of energy with a distinct energy peak and an area of influence appropriately distributed based on matrix 907 and matrix 908.
  • the created cluster of energy may also have low noise, low latency, and low drift.
  • the created cluster of energy may additionally soften the energy peaks caused by hand movements in all axes.
  • the High Level Logic 700 may further sum up the total energy of the matrix 908 and the total energy of the matrix 907 at step 909, which may result in the High Level Energy 709. In particular embodiments, the High Level Logic 700 may also sum up the total energy of the matrix 907 at step 909, which may result in the Medium Level Energy 708.
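A deliberately simplified sketch of the masking-and-summation idea (the influence-area and Gaussian steps 904-906 are omitted); the mask probabilities express how likely each zone's energy comes from a hand rather than an arm:

```python
def energy_distribution(high_matrix, medium_matrix, mask_probs):
    """Weight each zone by its hand probability, then total the masked
    matrices to obtain the Medium and High Level Energy values (step 909)."""
    masked_high = [e * p for e, p in zip(high_matrix, mask_probs)]
    masked_medium = [e * p for e, p in zip(medium_matrix, mask_probs)]
    high_level_energy = sum(masked_high) + sum(masked_medium)
    medium_level_energy = sum(masked_medium)
    return medium_level_energy, high_level_energy
```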
  • this disclosure illustrates a particular flow diagram of the High Level Logic calculating the energy distribution, this disclosure contemplates any suitable flow diagram of the High Level Logic calculating any suitable energy distribution in any suitable manner.
  • FIG. 10 illustrates an example flow diagram of the High Level Logic 700 calculating the steadiness of an object.
  • the High Level Logic 700 may detect whether a user is holding his/her hand steady within a very short period of time.
  • the High Level Logic 700 may filter the Medium Level Energy 708 using, e.g., a One Euro Filter 1001.
  • the High Level Logic 700 may further vectorize the filtered Medium Level Energy, which may result in a Vector MLE (Medium Level Energy) 1002.
  • the High Level Logic 700 may filter the High Level Energy 709 using a One Euro Filter 1001.
  • the High Level Logic 700 may further vectorize the filtered High Level Energy, which may result in a Vector HLE (High Level Energy) 1003.
  • the High Level Logic 700 may store Vector MLE 1002 and Vector HLE 1003 in two different lists.
  • the High Level Logic 700 may also vectorize the High Energy Matrix 705, which may result in Vector Matrices 1004.
  • the High Level Logic 700 may store Vector Matrices 1004 in another list.
  • the High Level Logic 700 may calculate the correlation of all the vectorized matrices including Vector MLE 1002, Vector HLE 1003, and Vector Matrices 1004 at step 1005. The calculated correlation value may indicate the level of similarity among these vectorized matrices.
  • the High Level Logic 700 may use the correlation value as the input of a Dynamic Threshold Truncation algorithm 1007 if the correlation value is non-zero.
  • the High Level Logic 700 may use a random number generated by a Random Number Generator as the input of the Dynamic Threshold Truncation algorithm 1007 if the correlation value is zero.
  • the Dynamic Threshold Truncation algorithm 1007 may generate an index of variability indicating how much the information with respect to the motion of the object is changing based on standard deviation of the data corresponding to the motion of the object.
  • the Dynamic Threshold Truncation algorithm 1007 may further determine that an object is steady if the index of variability is lower than a threshold value for a threshold period of time.
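The steadiness check can be approximated as follows; deriving the index of variability as a coefficient of variation over a sliding window is an assumption, since the Dynamic Threshold Truncation algorithm itself is not spelled out here:

```python
import statistics

def is_steady(history, threshold=0.05, window=5):
    """Derive an index of variability from the standard deviation of recent
    samples and report the object as steady when the index stays below a
    threshold across the whole window."""
    if len(history) < window:
        return False                         # not enough data yet
    recent = history[-window:]
    mean = statistics.fmean(recent)
    variability = statistics.pstdev(recent) / mean if mean else 0.0
    return variability < threshold
```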
  • this disclosure illustrates a particular flow diagram of the High Level Logic calculating the steadiness of an object, this disclosure contemplates any suitable flow diagram of the High Level Logic calculating any suitable steadiness of any suitable object in any suitable manner.
  • the motion identification along the Z axis may be triggered if an object is determined to be steady by the Dynamic Threshold Truncation algorithm 1007.
  • Determine Push 711 may be triggered if a user's hand is determined to be steady.
  • the High Level Logic 700 may then analyze the energy levels and calculate the derivative across the data samples of each energy level to find the tendency of the data samples.
  • the High Level Logic 700 may further weight the tendencies of the Medium Level Energy 708 and the High Level Energy 709 accordingly to obtain a single combined value. If the combined value is negative, the High Level Logic 700 may determine that the action is a push.
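Determine Push 711 might be sketched like this; the per-level weights are assumptions, and following the text, a negative combined tendency is classified as a push:

```python
def detect_push(medium_samples, high_samples, w_medium=0.4, w_high=0.6):
    """Take the derivative (sample-to-sample differences) of each energy
    level's data samples, average it into a tendency, and weight the two
    tendencies into a single value; negative means a push."""
    def tendency(samples):
        diffs = [b - a for a, b in zip(samples, samples[1:])]
        return sum(diffs) / len(diffs)
    value = w_medium * tendency(medium_samples) + w_high * tendency(high_samples)
    return value < 0
```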
  • this disclosure illustrates a particular way of determining a push action, this disclosure contemplates any suitable way of determining a push action in any suitable manner.
  • the touchless sensing system 200 may create a 3D interaction space of approximately fifty centimeters in radius in front of the touchless sensing system 200.
  • the touchless sensing system 200 may determine the position of a hand that enters the sensing space and identify hand gestures performed in the field of view.
  • the hand gestures may include a lateral swipe, vertical swipe, and push.
  • the touchless sensing system 200 may additionally identify other hand gestures by modeling the 3D interaction space in different ways, changing the physical distribution of sensing modules 201, changing the number of sensing modules 201, and/or changing the illumination patterns.
  • An advantage of the touchless sensing system 200 may include requiring little sensory information, low computational load and low cost.
  • the touchless sensing system 200 may be integrated with a plurality of devices and systems for a plurality of applications.
  • the touchless sensing system 200 may be integrated with wearable devices including a smart watch, virtual reality headset, fitness wristband, headphone, smart glove, etc.
  • the touchless sensing system 200 may be integrated with portable devices including an e-reader, tablet, smartphone, digital camera, sport camera, laptop, music system, portable gaming device, etc.
  • the touchless sensing system 200 may be integrated with home appliances and devices including a smart TV, computer, remote controller for TV, controller for gaming system, DJ mixer, keyboard, etc.
  • the touchless sensing system 200 may be integrated with medical devices and/or in sterile environments, such as in association with a monitor, bed, stand for medicine supply, dialysis machine, lab equipment, etc.
  • the touchless sensing system 200 may be integrated into cars for controlling a car entertaining system, air conditioner, etc.
  • the touchless sensing system 200 may be integrated with public space systems including a public display, elevator, ATM machine, shop window, ticket machine, etc.
  • the touchless sensing system 200 may be integrated with devices that can potentially compromise users' security and privacy, including an ATM machine, combination lock, safe, public keyboard, etc.
  • this disclosure illustrates particular applications of the touchless sensing system, this disclosure contemplates any suitable applications of the touchless sensing system in any suitable manner.
  • FIG. 11 illustrates an example method 1100 for sensing an object based on electromagnetic radiation.
  • the emitters 202 may be activated according to an illumination pattern.
  • an illumination pattern may be generated based on the power level of Low Energy.
  • the method may activate at least some of the emitters 202 based on this particular illumination pattern by instructing these emitters 202 to emit electromagnetic radiation at the power level of Low Energy.
  • the method may detect, based at least on the illumination pattern and on electromagnetic radiation detected by one or more receivers 203, a presence of an object or a motion of the object in the field of view of at least one of the receivers 203, wherein at least part of the electromagnetic radiation detected by the one or more receivers 203 is reflected from the object to the one or more receivers 203.
  • the electromagnetic radiation may be reflected back by an object if the object is in the field of view.
  • the reflected electromagnetic radiation may reach some receivers 203 but not reach other receivers 203, which means some receivers 203 may get data samples whereas other receivers 203 may not.
  • the method may then detect the presence of the object or the motion of the object based on the data samples received at some of the receivers 203.
  • Particular embodiments may repeat one or more steps of the method of FIG. 11 , where appropriate.
  • this disclosure describes and illustrates particular steps of the method of FIG. 11 as occurring in a particular order, this disclosure contemplates any suitable steps of the method of FIG. 11 occurring in any suitable order.
  • this disclosure describes and illustrates an example method for sensing an object based on electromagnetic radiation, including the particular steps of the method of FIG. 11 , this disclosure contemplates any suitable method for sensing an object based on electromagnetic radiation, which may include all, some, or none of the steps of the method of FIG. 11 , where appropriate.
  • this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of the method of FIG. 11
  • this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of the method of FIG. 11 .
  • FIG. 12 illustrates an example computer system 1200.
  • one or more computer systems 1200 perform one or more steps of one or more methods described or illustrated herein.
  • one or more computer systems 1200 provide functionality described or illustrated herein.
  • software running on one or more computer systems 1200 performs one or more steps of one or more methods described or illustrated herein or provides functionality described or illustrated herein.
  • Particular embodiments include one or more portions of one or more computer systems 1200.
  • reference to a computer system may encompass a computing device, and vice versa, where appropriate.
  • reference to a computer system may encompass one or more computer systems, where appropriate.
  • computer system 1200 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, a tablet computer system, an augmented/virtual reality device, or a combination of two or more of these.
  • computer system 1200 may include one or more computer systems 1200; be unitary or distributed; span multiple locations; span multiple machines; span multiple data centers; or reside in a cloud, which may include one or more cloud components in one or more networks.
  • one or more computer systems 1200 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein.
  • one or more computer systems 1200 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein.
  • One or more computer systems 1200 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
  • computer system 1200 includes a processor 1202, memory 1204, storage 1206, an input/output (I/O) interface 1208, a communication interface 1210, and a bus 1212.
  • this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
  • processor 1202 includes hardware for executing instructions, such as those making up a computer program.
  • processor 1202 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 1204, or storage 1206; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 1204, or storage 1206.
  • processor 1202 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 1202 including any suitable number of any suitable internal caches, where appropriate.
  • processor 1202 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs).
  • Instructions in the instruction caches may be copies of instructions in memory 1204 or storage 1206, and the instruction caches may speed up retrieval of those instructions by processor 1202.
  • Data in the data caches may be copies of data in memory 1204 or storage 1206 for instructions executing at processor 1202 to operate on; the results of previous instructions executed at processor 1202 for access by subsequent instructions executing at processor 1202 or for writing to memory 1204 or storage 1206; or other suitable data.
  • the data caches may speed up read or write operations by processor 1202.
  • the TLBs may speed up virtual-address translation for processor 1202.
  • processor 1202 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 1202 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 1202 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 1202. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
  • memory 1204 includes main memory for storing instructions for processor 1202 to execute or data for processor 1202 to operate on.
  • computer system 1200 may load instructions from storage 1206 or another source (such as, for example, another computer system 1200) to memory 1204.
  • Processor 1202 may then load the instructions from memory 1204 to an internal register or internal cache.
  • processor 1202 may retrieve the instructions from the internal register or internal cache and decode them.
  • processor 1202 may write one or more results (which may be intermediate or final results) to the internal register or internal cache.
  • Processor 1202 may then write one or more of those results to memory 1204.
  • processor 1202 executes only instructions in one or more internal registers or internal caches or in memory 1204 (as opposed to storage 1206 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 1204 (as opposed to storage 1206 or elsewhere).
  • One or more memory buses (which may each include an address bus and a data bus) may couple processor 1202 to memory 1204.
  • Bus 1212 may include one or more memory buses, as described below.
  • one or more memory management units reside between processor 1202 and memory 1204 and facilitate accesses to memory 1204 requested by processor 1202.
  • memory 1204 includes random access memory (RAM).
  • This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM.
  • Memory 1204 may include one or more memories 1204, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
  • storage 1206 includes mass storage for data or instructions.
  • storage 1206 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these.
  • Storage 1206 may include removable or non-removable (or fixed) media, where appropriate.
  • Storage 1206 may be internal or external to computer system 1200, where appropriate.
  • storage 1206 is non-volatile, solid-state memory.
  • storage 1206 includes read-only memory (ROM).
  • this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these.
  • This disclosure contemplates mass storage 1206 taking any suitable physical form.
  • Storage 1206 may include one or more storage control units facilitating communication between processor 1202 and storage 1206, where appropriate. Where appropriate, storage 1206 may include one or more storages 1206. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
  • I/O interface 1208 includes hardware, software, or both, providing one or more interfaces for communication between computer system 1200 and one or more I/O devices.
  • Computer system 1200 may include one or more of these I/O devices, where appropriate.
  • One or more of these I/O devices may enable communication between a person and computer system 1200.
  • an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these.
  • An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 1208 for them.
  • I/O interface 1208 may include one or more device or software drivers enabling processor 1202 to drive one or more of these I/O devices.
  • I/O interface 1208 may include one or more I/O interfaces 1208, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
  • Communication interface 1210 includes hardware, software, or both, providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 1200 and one or more other computer systems 1200 or one or more networks.
  • Communication interface 1210 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network, or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network.
  • Computer system 1200 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these.
  • Computer system 1200 may communicate with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these.
  • Computer system 1200 may include any suitable communication interface 1210 for any of these networks, where appropriate.
  • Communication interface 1210 may include one or more communication interfaces 1210, where appropriate.
  • Bus 1212 includes hardware, software, or both coupling components of computer system 1200 to each other.
  • Bus 1212 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these.
  • Bus 1212 may include one or more buses 1212, where appropriate.
  • A computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such as, for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • User Interface Of Digital Computer (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Position Input By Displaying (AREA)

Claims (11)

  1. An apparatus comprising:
    a plurality of emitters (202) of electromagnetic radiation, wherein each of the plurality of emitters corresponds to a different field of view;
    a plurality of receivers (203) of electromagnetic radiation, wherein each of the plurality of receivers corresponds to a different field of view; one or more processors (1202); and
    a plurality of sensing modules (201), wherein each of the plurality of sensing modules (201) comprises:
    one or more emitters of the plurality of emitters (202) of electromagnetic radiation, and
    one or more receivers of the plurality of receivers (203) of electromagnetic radiation;
    the one or more processors (1202) being configured to:
    determine a power level included in a plurality of predefined power levels;
    sequentially activate at least one of the plurality of emitters to emit electromagnetic radiation at the determined power level;
    detect electromagnetic radiation by at least one of the plurality of receivers, wherein at least a portion of the electromagnetic radiation detected by the at least one of the plurality of receivers is the electromagnetic radiation emitted at the determined power level by the at least one activated emitter of the plurality of emitters and then reflected from an object toward the at least one of the plurality of receivers,
    wherein determining a power level, activating at least one of the plurality of emitters at the determined power level, and detecting electromagnetic radiation by at least one of the plurality of receivers are repeated sequentially for each of the plurality of predefined power levels;
    characterized in that the one or more processors (1202) are further configured to:
    generate a data vector comprising a plurality of bytes, each byte corresponding to the power level detected by a receiver of the plurality of receivers (203) of the plurality of sensing modules;
    generate a plurality of data matrices respectively corresponding to the plurality of predefined power levels, based on the generated data vector, wherein the plurality of data matrices map the electromagnetic radiation received by each of the one or more receivers of the plurality of receivers (203) of the plurality of sensing modules (201) into a virtual physical space;
    determine a number of activated emitters corresponding to each of the plurality of predefined power levels, respectively, based on an association of each of the plurality of sensing modules (201) with one or more zones in the virtual physical space;
    determine a position of the object based on the generated plurality of data matrices and the determined number of activated emitters, by considering a plurality of points in a coordinate set based on the electromagnetic radiation received at the plurality of points by the one or more receivers of the plurality of receivers (203) of the plurality of sensing modules (201); or
    track movement of the object based on the generated plurality of data matrices and the determined number of activated emitters, if more than a threshold number of emitters are activated for more than a threshold amount of time, by determining (710) a stability of the object; or
    identify movement of the object based on the generated plurality of data matrices and the determined number of activated emitters, by analyzing a temporal change in the electromagnetic radiation of the plurality of data matrices if more than a threshold number of emitters are activated for a predefined time interval.
  2. The apparatus of claim 1, wherein the electromagnetic radiation comprises near-infrared light.
  3. The apparatus of claim 1, wherein each of the plurality of emitters comprises one or more light-emitting diodes, LEDs.
  4. The apparatus of claim 1, wherein each of the plurality of receivers comprises one or more photodiodes.
  5. The apparatus of claim 1, wherein the one or more processors are further configured to instruct (605) the plurality of emitters of electromagnetic radiation to emit the electromagnetic radiation at the determined power level.
  6. The apparatus of claim 5, wherein the plurality of predefined power levels correspond to a plurality of predefined depths relative to the apparatus.
  7. The apparatus of claim 1, wherein each of the plurality of sensing modules comprises one or more microcontrollers, and
    wherein the one or more microcontrollers are configured to:
    communicate with the one or more processors;
    modulate the electromagnetic radiation emitted by the one or more emitters of the corresponding sensing module;
    regulate the emission power of the one or more emitters of the corresponding sensing module; and
    process the electromagnetic radiation received by the one or more receivers of the corresponding sensing module.
  8. The apparatus of claim 1, wherein the one or more processors are further configured to:
    sequentially activate the one or more emitters of each of the plurality of sensing modules to emit electromagnetic radiation at the determined power level such that the emitters of only one sensing module are active at a time; and
    after any of the one or more emitters of any sensing module is activated:
    receive, from one or more of the plurality of receivers, reflected electromagnetic radiation corresponding to the determined power level each time after the one or more emitters are activated for each of the plurality of sensing modules, wherein the one or more of the plurality of receivers are determined when the reflected electromagnetic radiation reaches the one or more of the plurality of receivers.
  9. One or more computer-readable non-transitory storage media embodying instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
    determining a power level included in a plurality of predefined power levels;
    sequentially activating at least one of a plurality of emitters to emit electromagnetic radiation at the determined power level,
    wherein each of the plurality of emitters corresponds to a different field of view;
    detecting electromagnetic radiation by at least one of a plurality of receivers, wherein at least a portion of the electromagnetic radiation detected by the at least one of the plurality of receivers is the electromagnetic radiation emitted at the determined power level by the at least one activated emitter of the plurality of emitters and then reflected from an object toward the at least one of the plurality of receivers,
    wherein each of the plurality of receivers corresponds to a different field of view; wherein determining a power level, activating at least one of the plurality of emitters at the determined power level, and detecting electromagnetic radiation by at least one of the plurality of receivers are repeated sequentially for each of the plurality of predefined power levels;
    characterized in that the operations further comprise:
    generating a data vector comprising a plurality of bytes, each byte corresponding to the power level detected by a receiver of the plurality of receivers (203) of a plurality of sensing modules, wherein each of the plurality of sensing modules comprises one or more emitters of the plurality of emitters (202) of electromagnetic radiation and one or more receivers of the plurality of receivers (203) of electromagnetic radiation;
    generating a plurality of data matrices respectively corresponding to the plurality of predefined power levels, based on the generated data vector, wherein the plurality of data matrices map the electromagnetic radiation received by each of the one or more receivers of the plurality of receivers (203) of the plurality of sensing modules (201) into a virtual physical space;
    determining a number of activated emitters corresponding to each of the plurality of predefined power levels, respectively, based on an association of each of the plurality of sensing modules (201) with one or more zones in the virtual physical space;
    determining a position of the object based on the generated plurality of data matrices and the determined number of activated emitters, by considering a plurality of points in a coordinate set based on the electromagnetic radiation received at the plurality of points by one or more of the plurality of receivers (203) of the plurality of sensing modules (201); or
    tracking movement of the object based on the generated plurality of data matrices and the determined number of activated emitters, if more than a threshold number of emitters are activated for more than a threshold amount of time, by determining (710) a stability of the object; or
    identifying movement of the object based on the generated plurality of data matrices and the determined number of activated emitters, by analyzing a temporal change in the electromagnetic radiation of the plurality of data matrices if more than a threshold number of emitters are activated for a duration within a predefined time interval.
  10. The one or more computer-readable non-transitory storage media of claim 9,
    wherein the operations further comprise activating the plurality of emitters according to illumination patterns.
  11. A method performed by the apparatus of claim 1, the method comprising:
    determining a power level included in a plurality of predefined power levels;
    sequentially activating at least one of the plurality of emitters to emit electromagnetic radiation at the determined power level,
    wherein each of the plurality of emitters corresponds to a different field of view;
    detecting electromagnetic radiation by at least one of the plurality of receivers, wherein at least a portion of the electromagnetic radiation detected by the at least one of the plurality of receivers is the electromagnetic radiation emitted at the determined power level by the at least one activated emitter of the plurality of emitters and then reflected from an object toward the at least one of the plurality of receivers,
    wherein each of the plurality of receivers corresponds to a different field of view; wherein determining a power level, activating at least one of the plurality of emitters at the determined power level, and detecting electromagnetic radiation by at least one of the plurality of receivers are repeated sequentially for each of the plurality of predefined power levels;
    characterized in that the method further comprises:
    generating a data vector comprising a plurality of bytes, each byte corresponding to the power level detected by a receiver of the plurality of receivers (203) of the plurality of sensing modules, wherein each of the plurality of sensing modules comprises one or more emitters of the plurality of emitters (202) of electromagnetic radiation and one or more receivers of the plurality of receivers (203) of electromagnetic radiation;
    generating a plurality of data matrices respectively corresponding to the plurality of predefined power levels, based on the generated data vector, wherein the plurality of data matrices map the electromagnetic radiation received by each of the one or more receivers of the plurality of receivers (203) of the plurality of sensing modules (201) into a virtual physical space;
    determining a number of activated emitters corresponding to each of the plurality of predefined power levels, respectively, based on an association of each of the plurality of sensing modules (201) with one or more zones in the virtual physical space;
    determining a position of the object based on the generated plurality of data matrices and the determined number of activated emitters, by considering a plurality of points in a coordinate set based on the electromagnetic radiation received at the plurality of points by one or more of the plurality of receivers (203) of the plurality of sensing modules (201); or
    tracking movement of the object based on the generated plurality of data matrices and the determined number of activated emitters, if more than a threshold number of emitters are activated for more than a threshold amount of time, by determining (710) a stability of the object; or
    identifying movement of the object based on the generated plurality of data matrices and the determined number of activated emitters, by analyzing a temporal change in the electromagnetic radiation of the plurality of data matrices if more than a threshold number of emitters are activated for a duration within a predefined time interval.
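The scan-and-matrix pipeline recited in claims 1, 9, and 11 can be illustrated with a minimal sketch. Everything here is an illustrative assumption rather than the patented implementation: the function names, the 4×4 module grid, the three power levels, and the threshold-weighted centroid used for position estimation are all hypothetical stand-ins for the claimed steps (sequential activation per power level, byte-per-receiver data vector, per-level data matrices mapped into a virtual space).

```python
# Hypothetical sketch of the claimed sensing pipeline (not the patented code):
# for each predefined power level, emitters are activated sequentially and the
# receiver readings are gathered into a data vector, which is reshaped into one
# data matrix per power level; the matrices are then used to estimate the
# position of a reflecting object in the virtual space.

import numpy as np

POWER_LEVELS = [10, 20, 30]   # predefined emission power levels (arbitrary units)
GRID = (4, 4)                 # sensing modules arranged in an assumed 4x4 virtual grid

def scan(read_receiver):
    """Run one full scan and return one GRID-shaped data matrix per power level.

    `read_receiver(level, index)` is a stand-in for driving one emitter at
    `level` and sampling its co-located receiver (one byte, 0-255).
    """
    matrices = []
    for level in POWER_LEVELS:
        # Data vector: one byte per receiver, gathered sequentially.
        vector = np.array([read_receiver(level, i)
                           for i in range(GRID[0] * GRID[1])], dtype=np.uint8)
        # Map the vector into the virtual physical space as a matrix.
        matrices.append(vector.reshape(GRID))
    return matrices

def estimate_position(matrices, threshold=32):
    """Weighted centroid of above-threshold readings summed over power levels."""
    total = np.sum(matrices, axis=0, dtype=np.float64)  # avoid uint8 overflow
    active = total * (total >= threshold)               # keep strong reflections only
    if active.sum() == 0:
        return None                                     # no object detected
    ys, xs = np.indices(GRID)
    return (float((ys * active).sum() / active.sum()),
            float((xs * active).sum() / active.sum()))
```

Because each power level reaches a different depth (compare claim 6), a fuller implementation would weight or segment the per-level matrices by depth rather than summing them as this sketch does.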
EP18821570.1A 2017-06-21 2018-05-21 Object detection and motion identification using electromagnetic radiation Active EP3639122B1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201762523153P 2017-06-21 2017-06-21
US15/885,497 US10481736B2 (en) 2017-06-21 2018-01-31 Object detection and motion identification using electromagnetic radiation
PCT/KR2018/005768 WO2018236056A1 (fr) 2018-12-27 Object detection and motion identification using electromagnetic radiation

Publications (3)

Publication Number Publication Date
EP3639122A4 EP3639122A4 (fr) 2020-04-22
EP3639122A1 EP3639122A1 (fr) 2020-04-22
EP3639122B1 true EP3639122B1 (fr) 2023-08-23

Family

ID=64693180

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18821570.1A Active EP3639122B1 (fr) Object detection and motion identification using electromagnetic radiation

Country Status (5)

Country Link
US (1) US10481736B2 (fr)
EP (1) EP3639122B1 (fr)
JP (1) JP7226888B2 (fr)
CN (1) CN110785728B (fr)
WO (1) WO2018236056A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107788992B (zh) * 2016-08-29 2022-03-08 Panasonic Intellectual Property Management Co., Ltd. Device for identifying a living body and method for identifying a living body
US11461907B2 (en) * 2019-02-15 2022-10-04 EchoPixel, Inc. Glasses-free determination of absolute motion
JP7457625B2 (ja) * 2020-10-07 2024-03-28 Paramount Bed Co., Ltd. Bed system

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7164117B2 (en) * 1992-05-05 2007-01-16 Automotive Technologies International, Inc. Vehicular restraint system control system and method using multiple optical imagers
US4672364A (en) * 1984-06-18 1987-06-09 Carroll Touch Inc Touch input device having power profiling
DE69122782T2 (de) * 1990-01-16 1997-02-20 Carroll Touch Inc Infrared touch-sensitive input device and light-emitter activation circuit
US7876424B2 (en) * 2008-08-20 2011-01-25 Microsoft Corporation Distance estimation based on image contrast
CN101609647A (zh) * 2009-07-30 2009-12-23 AU Optronics Corp. Touch-control organic light-emitting diode display device and image unit
US8547327B2 (en) * 2009-10-07 2013-10-01 Qualcomm Incorporated Proximity object tracker
EP2819410A1 (fr) * 2011-01-04 2014-12-31 Samsung Electronics Co., Ltd Appareil d'affichage, lunettes 3D et procédé de commande correspondant
US20140035812A1 (en) 2011-05-05 2014-02-06 Maxim Integrated Products, Inc. Gesture sensing device
US20120293404A1 (en) 2011-05-19 2012-11-22 Panasonic Corporation Low Cost Embedded Touchless Gesture Sensor
US9679215B2 (en) * 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US9674436B2 (en) 2012-06-18 2017-06-06 Microsoft Technology Licensing, Llc Selective imaging zones of an imaging sensor
EP2872966A1 (fr) 2012-07-12 2015-05-20 Dual Aperture International Co. Ltd. Interface utilisateur basée sur des gestes
GB2504291A (en) * 2012-07-24 2014-01-29 St Microelectronics Ltd A proximity and gesture detection module
TW201415291A (zh) 2012-10-08 2014-04-16 Pixart Imaging Inc Gesture recognition method and system based on object tracking
DE102012110460A1 (de) 2012-10-31 2014-04-30 Audi Ag Method for inputting a control command for a component of a motor vehicle
US9285893B2 (en) * 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
EP2738647B1 (fr) 2012-11-30 2018-08-29 BlackBerry Limited Procédé et dispositif pour identifier des gestes sans contact
CN108334204B (zh) * 2012-12-10 2021-07-30 InVisage Technologies, Inc. Imaging device
US9304594B2 (en) 2013-04-12 2016-04-05 Microsoft Technology Licensing, Llc Near-plane segmentation using pulsed light source
US20140376773A1 (en) 2013-06-21 2014-12-25 Leap Motion, Inc. Tunable operational parameters in motion-capture and touchless interface operation
JP2015032101A (ja) * 2013-08-01 2015-02-16 Toshiba Corp. Information terminal device
KR102138510B1 (ko) * 2013-08-27 2020-07-28 LG Electronics Inc. Electronic device having a proximity touch function and control method thereof
US9213102B2 (en) 2013-09-11 2015-12-15 Google Technology Holdings LLC Electronic device with gesture detection system and methods for using the gesture detection system
US20150139483A1 (en) * 2013-11-15 2015-05-21 David Shen Interactive Controls For Operating Devices and Systems
JP6307627B2 (ja) * 2014-03-14 2018-04-04 Sony Interactive Entertainment Inc. Game console with spatial sensing
EP2950596B1 (fr) * 2014-05-30 2019-06-26 Apple Inc. Procédé et système pour déterminer si un dispositif se trouve sur le corps d'un utilisateur
KR102294945B1 (ko) 2014-06-11 2021-08-30 Samsung Electronics Co., Ltd. Function control method and electronic device therefor
KR102278880B1 (ko) 2014-11-14 2021-07-20 Samsung Display Co., Ltd. Backlight unit, display device including the same, and image display system
US10175768B2 (en) * 2014-12-01 2019-01-08 Singapore University Of Technology And Design Gesture recognition devices, gesture recognition methods, and computer readable media
US9746921B2 (en) 2014-12-31 2017-08-29 Sony Interactive Entertainment Inc. Signal generation and detector systems and methods for determining positions of fingers of a user
US10168193B2 (en) * 2015-01-07 2019-01-01 Infineon Technologies Ag Sensor with switching matrix switch
US9652047B2 (en) 2015-02-25 2017-05-16 Daqri, Llc Visual gestures for a head mounted device
US9910275B2 (en) 2015-05-18 2018-03-06 Samsung Electronics Co., Ltd. Image processing for head mounted display devices

Also Published As

Publication number Publication date
WO2018236056A1 (fr) 2018-12-27
EP3639122A4 (fr) 2020-04-22
US20180373391A1 (en) 2018-12-27
US10481736B2 (en) 2019-11-19
JP7226888B2 (ja) 2023-02-21
CN110785728B (zh) 2022-04-29
JP2020524856A (ja) 2020-08-20
CN110785728A (zh) 2020-02-11
EP3639122A1 (fr) 2020-04-22

Similar Documents

Publication Publication Date Title
US11237625B2 (en) Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
US11875012B2 (en) Throwable interface for augmented reality and virtual reality environments
EP3639122B1 (fr) Détection d'objet et identification de mouvement à l'aide d'un rayonnement électromagnétique
KR102406327B1 (ko) Method and device for controlling an output device
KR102510402B1 (ko) Radar-based sensing system for touch and in-the-air interactions
Zhang et al. Extending mobile interaction through near-field visible light sensing
US10073578B2 (en) Electromagnetic interference signal detection
AU2013290489B2 (en) Adjusting mobile device state based on user intentions and/or identity
JP6877642B2 (ja) レーダーベースのアプリケーションのためのレーダー画像シェイパー
KR101976605B1 (ko) Electronic device and operating method thereof
CN104969148A (zh) Depth-based user interface gesture control
CN103135753A (zh) Gesture input method and system
EP3335099B1 (fr) Détection de signaux d'interférence électromagnétique
Kaholokula Reusing ambient light to recognize hand gestures
US11959997B2 (en) System and method for tracking a wearable device
CN115104134A (zh) Combined infrared and visible-light visual-inertial object tracking
Li et al. Control your home with a smartwatch
EP3350681B1 (fr) Détection de signal de brouillage électromagnétique
CN110764612B (zh) Ultrasonic processing method and apparatus, electronic device, and computer-readable medium
KR20220074317A (ko) Electronic device and method of controlling an electronic device using a lidar sensor
US20190196019A1 (en) Method and device for determining position of a target
Czuszyński et al. Analysis of properties of an active linear gesture sensor
KR102063903B1 (ko) Processing of electromagnetic interference signals using machine learning
CN116783573A (zh) System and method for managing motion detection of an electronic device, and associated electronic device
CN117558029A (zh) Apparatus and method for infrared gesture recognition

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200114

A4 Supplementary search report drawn up and despatched

Effective date: 20200313

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210113

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20230503

RIN1 Information on inventor provided before grant (corrected)

Inventor name: SADI, SAJID

Inventor name: AVILA, SANTIAGO ORTEGA

Inventor name: CORTES, JUAN PABLO FORERO

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602018056056

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20230823

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1603359

Country of ref document: AT

Kind code of ref document: T

Effective date: 20230823

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231124

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231223

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231226

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231123

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230823

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT