US20230305636A1 - Gesture recognition using a mobile device - Google Patents
Gesture recognition using a mobile device
- Publication number: US20230305636A1 (Application No. US 18/324,881)
- Authority: US (United States)
- Prior art keywords: depth information, mobile device, orientation, change, user gesture
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/285—Selection of pattern recognition techniques, e.g. of classifiers in a multi-classifier system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/87—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using selection of the recognition techniques, e.g. of a classifier in a multiple classifier system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Abstract
A mobile device configured to perform gesture recognition for a vehicle information and/or entertainment system comprises a depth camera; an orientation sensor; and a processor configured to detect one or more gestures from images captured by the depth camera according to a gesture detection algorithm; in which the processor is configured to vary the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
Description
- The present application claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 16/958,024, filed on Jun. 25, 2020, which claims the benefit under 35 U.S.C. § 371 as a U.S. National Stage Entry of International Application No. PCT/EP2018/097032, filed in the European Patent Office as a Receiving Office on Dec. 27, 2018, which claims priority to European Patent Application Number 18150191.7, filed in the European Patent Office on Jan. 3, 2018, each of which applications is hereby incorporated by reference in its entirety.
- This disclosure relates to gesture recognition using a mobile device.
- The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
- It has been proposed to use gesture recognition to control one or more operational features of a vehicle information and/or entertainment system.
- It has also been proposed to use a mobile device such as a mobile telephone device, equipped with a depth camera, to detect the user gestures.
- The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
- An example embodiment provides a mobile device configured to perform gesture recognition for a vehicle information and/or entertainment system, the mobile device comprising:
-
- a depth camera;
- an orientation sensor; and
- a processor configured to detect one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the processor is configured to vary the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- Another example embodiment provides a gesture recognition system for a vehicle information and/or entertainment system, the system comprising:
-
- a mobile device having a depth camera and an orientation sensor; and
- a processor configured to detect one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the processor is configured to vary the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- Another example embodiment provides a method of operation of a mobile device to perform gesture recognition for a vehicle information and/or entertainment system, the method comprising:
-
- detecting images using a depth camera of the mobile device;
- detecting an orientation of the mobile device; and
- detecting one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the step of detecting one or more gestures comprises varying the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- The present disclosure also provides computer software which, when executed by a processor of a mobile device having a depth camera, causes the mobile device to perform the method defined above.
- Example embodiments provide a machine-readable, non-transitory storage medium which stores such computer software.
- Various other aspects and features of the present disclosure are defined in the appended claims and within the text of the accompanying description and include at least a head-mountable apparatus such as a display and a method of operating a head-mountable apparatus, as well as a computer program.
- A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, in which:
-
- FIG. 1 is a schematic diagram of a mobile device;
- FIG. 2 is a schematic diagram of a vehicle information and/or entertainment system;
- FIGS. 3 a and 3 b schematically represent example gestures;
- FIGS. 4 a and 4 b provide a schematic flowchart representing a gesture recognition algorithm;
- FIGS. 5 and 6 schematically represent different views of a hand gesture;
- FIG. 7 schematically represents a plurality of possible views of a hand gesture depending on the location and orientation of a mobile device;
- FIG. 8 is a schematic flowchart representing the varying of a gesture recognition algorithm;
- FIGS. 9 and 10 schematically represent the use of orientation sensors;
- FIG. 11 schematically represents a vehicle interior as seen by a depth camera of a mobile device;
- FIG. 12 is a schematic flowchart illustrating a method of orientation detection;
- FIG. 13 schematically illustrates a mobile device and a vehicle information and/or entertainment system; and
- FIG. 14 is a schematic flowchart illustrating a method.
- FIG. 1 is a schematic diagram of a mobile device such as, for example, a mobile telephone device, although some telephony features have been omitted from FIG. 1 for clarity of the present discussion.
- The mobile device 100 comprises a depth camera 110. The depth camera is operable to capture so-called depth images of a field of view, such that as well as capturing image data representing the appearance of the field of view as seen by the depth camera 110, information is also captured representing the depth, which is to say the distance from the camera 110, of each image feature. In this regard, the diagram of FIG. 1 includes a representation 112 of a coordinate system applicable to the depth camera 110, in which an x-axis and a y-axis are in the plane of the mobile device as drawn and a z-axis is perpendicular to that plane, extending in a direction of view of the depth camera 110. The depth information captured by the depth camera can include an indication of a depth or z value for pixels at particular (x, y) positions in a captured image.
- The depth camera may be, for example, a so-called time of flight image sensor, a stereoscopic camera, or a structured light camera in which an emitter provides a known pattern of optical illumination, for example a grid of infra-red optical illumination, such that distortions in the captured images of the structured light pattern can indicate the depth of image features. The skilled person will appreciate that other types of depth camera may be used instead.
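- By way of illustration only, the following minimal Python sketch shows how a depth or z value at a pixel (x, y) might be back-projected into the coordinate system of the representation 112, assuming a pinhole camera model; the intrinsic parameters fx, fy, cx and cy are assumptions made for the example and do not appear in the present description.

```python
import numpy as np

def depth_pixel_to_point(x, y, z, fx, fy, cx, cy):
    """Back-project pixel (x, y) with depth z into the camera
    coordinate system of representation 112: x and y in the device
    plane, z along the depth camera's direction of view."""
    return np.array([(x - cx) * z / fx, (y - cy) * z / fy, z])

def depth_image_to_points(depth, fx, fy, cx, cy):
    """Convert a whole depth image (h x w array of z values) into an
    (h, w, 3) array of camera-space points."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.stack([(xs - cx) * depth / fx,
                     (ys - cy) * depth / fy,
                     depth], axis=-1)
```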
- Referring back to FIG. 1, images captured by the depth camera are provided to a processor 120 operating under the control of program instructions representing computer software stored in storage 130, which may be a non-transitory machine-readable storage medium such as a non-volatile memory. Examples may include a flash memory, a read only memory (ROM) or a magnetic or optical disk storage.
- A user interface 140 is provided, for example in the form of a display element (not shown) and a touch panel (not shown). An interface 150 provides a wireless or wired connection to a vehicle information and/or entertainment system. An example of such an interface is a so-called Bluetooth® interface.
- Finally with regard to FIG. 1, one or more sensors 160 are provided. Examples of such sensors can include one or more orientation sensors to detect a current orientation of the mobile device. For example, these may include one or more sensors selected from the list consisting of: a gyroscopic sensor, an accelerometer, a gravity sensor, and a magnetic field sensor.
- As discussed above, the mobile device of FIG. 1 is connectable by a wired or wireless connection to a vehicle information and/or entertainment system to be discussed below, such that the mobile device 100 provides an example of a mobile device configured to perform gesture recognition for a vehicle information and/or entertainment system, the mobile device comprising a depth camera 110, an orientation sensor 160 and a processor 120 configured to detect one or more gestures from images captured by the depth camera 110 according to a gesture detection algorithm. Using techniques to be discussed below, the processor 120 is configured to vary the gesture detection algorithm (for example, by selecting a different algorithm or version of the algorithm and/or by varying parameters of the algorithm in use) in dependence upon an orientation of the mobile device 100 as detected by the orientation sensor 160.
- FIG. 2 is a schematic diagram of a vehicle information and/or entertainment system comprising an interface 200 to cooperate with the interface 150 of the mobile device 100 and, in at least some examples, to receive control signals generated by the processor 120 of the mobile device 100. These control signals 205 are passed to a control processor 210 which controls the operation of a so-called "infotainment" system 220 generating information and/or entertainment for presentation to a user by a user interface 230 such as a display screen and/or one or more loudspeakers.
- The interface 200 may also be a Bluetooth® wireless interface. It is noted that the interface 200 may also be used to receive audio signals 215 from the mobile device 100, for example music or telephony signals, which do not represent control signals themselves and can be passed directly to the infotainment system 220, for example for output to the user.
- The arrangement of FIG. 2 therefore comprises an example of a vehicle information and/or entertainment apparatus comprising an information and/or entertainment system 220 configured to receive (via the interface 200) control signals from the mobile device 100 and to vary (by the control processor 210) the operation of the information and/or entertainment system according to the received control signals.
- As discussed, therefore, the mobile device 100 of FIG. 1 and the vehicle information and/or entertainment system of FIG. 2 each comprise a respective interface such as a wireless interface 150, 200 to communicate the control signals between the mobile device 100 and the vehicle information and/or entertainment system.
- Examples of the way in which gestures can be used to control the infotainment system 220 include the adjustment of audio volume, changing radio channels, initiating a phone call, altering air conditioning settings or the like. A gesture-based system is considered to be well suited to allowing the driver of a vehicle to control the various vehicle systems without losing concentration by having to search for physical controls or touch screen menu options.
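- Purely as an illustrative sketch of such control, a simple table can map recognised gestures onto control signals for the infotainment system; every gesture name, subsystem and action below is a hypothetical placeholder, since the present description does not define a command vocabulary.

```python
# Hypothetical gesture-to-command mapping; names are illustrative only.
GESTURE_COMMANDS = {
    "swipe_left":  {"subsystem": "radio",   "action": "previous_channel"},
    "swipe_right": {"subsystem": "radio",   "action": "next_channel"},
    "circle_cw":   {"subsystem": "audio",   "action": "volume_up"},
    "circle_ccw":  {"subsystem": "audio",   "action": "volume_down"},
    "thumb_up":    {"subsystem": "phone",   "action": "initiate_call"},
    "open_palm":   {"subsystem": "climate", "action": "toggle_ac"},
}

def control_signal_for(gesture_name):
    """Translate a recognised gesture into a control signal suitable
    for transmission over the interface 150 to the interface 200."""
    return GESTURE_COMMANDS.get(gesture_name)
```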
- FIGS. 3 a and 3 b schematically illustrate example gestures which can be detected and recognised by the mobile device of FIG. 1. In particular, FIG. 3 a provides a graphical representation of a plurality of gestures and FIG. 3 b provides a list of terminology associated with the set of gestures. Several of the gestures involve detecting motion of the user's hand, such as the gestures 300, 305, 310, 320, 330, 335, 340, 345. Other gestures can involve detecting a static configuration of the user's hand, such as the gestures 350, 355.
- FIGS. 4 a and 4 b provide a schematic flow chart representing an example gesture recognition algorithm. At a step 400, depth images are acquired by the depth camera 110 of the mobile device 100. A step 410 is a gesture detection step which will be discussed in more detail below, and a step 420 involves generating control signals, for example for supply to the infotainment system, in dependence upon the detected gestures. A further example of such a gesture detection and/or recognition technique is disclosed in WO2015/104257 A1, the contents of which are incorporated by reference into the present description.
- In FIG. 4 b, the step 410 is shown in more detail, such that at a step 430, so-called feature points such as fingertips and the like are detected in the captured images. At a step 440, motion and/or configuration of the feature points is detected, and at a step 450 the detected motion and/or configuration are compared to a library in which examples of motion and/or configuration of feature points are indicative of particular gestures.
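- The comparison at the step 450 could, purely as an illustration, be implemented as a nearest-template match over resampled feature-point trajectories, as in the following Python sketch; the resampling length and distance threshold are assumptions, and the disclosure does not prescribe any particular matching method.

```python
import numpy as np

def resample(traj, n=32):
    """Resample a trajectory of feature-point positions (at least two
    distinct points) to n samples evenly spaced along its arc length."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    d = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, d[-1], n)
    return np.stack([np.interp(t, d, traj[:, k])
                     for k in range(traj.shape[1])], axis=1)

def classify_gesture(traj, library, threshold=0.5):
    """Compare a detected feature-point trajectory against a library of
    templates (step 450); return the best-matching name or None."""
    probe = resample(traj)
    best, best_dist = None, float("inf")
    for name, template in library.items():
        dist = float(np.mean(np.linalg.norm(probe - resample(template), axis=1)))
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist < threshold else None
```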
- Considering, for example, the gestures 300, 305: these will be discussed in more detail with reference to FIGS. 5 to 7.
- FIG. 5 represents a side view of a user's hand in a pointing configuration, making a generally circular movement of the index finger. FIG. 6 represents the same hand making the same movement but viewed along the axis of the index finger.
- In the context of a system operating in a vehicle, the user has a generally fixed location, being confined by the location of the driver's or passenger's seat, but the mobile device 100 can take various different locations within the vehicle, for example being positioned between the two front seats in a generally horizontal upward-facing orientation or being mounted to a window or air vent mount in a rearward facing orientation. However, it is desirable that the user of the system does not have to vary his or her gesture to achieve the same control result simply because the mobile device 100 is in a different location. This issue is illustrated in more detail in FIG. 7, which shows the user's hand 700 trying to execute a consistent gesture, but a plurality of possible locations for a mobile device 710 and its associated field of view 720, from a low-down upward-looking orientation to a higher rearward-looking orientation.
- In a vehicle environment, the position and orientation are somewhat linked together, given that there is generally a limited range of places in which the user can stow or mount a mobile device (for example, between the front seats facing upwards, on a vent or windscreen mount, or the like), such that at each location, the orientation of the device needs to be towards the user if the user is to be detected making hand gestures.
- To address the issue of allowing the user to make a consistent gesture but still being able to detect and recognise that gesture even if the mobile device 100 is in a different location and orientation, the processor 120 can be configured to vary the detection algorithm in dependence upon an orientation of the mobile device as detected by the orientation sensor 160. For example, it may be that two or more gesture detection algorithms are provided, one of which is more suited to a range 730 of locations and orientations of the mobile device 710 and one of which is more suited to a range 740 of locations and orientations of the mobile device 710. Within a respective range 730, 740, calibration data can be used to calibrate, for example, the expected views by the depth camera of the hand undertaking a gesture to be recognised.
- FIG. 8 is an example flow chart schematically representing this process.
- At a step 800, the processor 120 detects orientation data from the sensor 160.
- If there are a plurality of candidate detection algorithms 810 available for use by the processor 120, each suited (for example) to a sub-range of orientations of the mobile device, then at a step 820 the processor 120 selects a gesture detection algorithm or gesture detector from amongst the candidate detectors 810. At a step 830, the processor 120 calibrates the selected detection algorithm, which is to say varies its settings or parameters according to the detected orientation. At a step 840, the processor 120 applies the selected gesture detection algorithm.
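- Purely by way of illustration, the steps 800 to 840 might be expressed as the following Python sketch, in which the orientation-sensor reading and the detector object interfaces are assumptions rather than anything defined in the present description.

```python
def run_gesture_detection(orientation_sensor, candidates, depth_image):
    """Sketch of steps 800-840. `candidates` is a list of
    (low, high, detector) tuples covering orientation sub-ranges."""
    pitch = orientation_sensor.read_pitch_degrees()        # step 800
    detector = None
    for low, high, candidate in candidates:                # step 820:
        if low <= pitch < high:                            # pick the detector
            detector = candidate                           # for this sub-range
            break
    if detector is None:
        detector = candidates[-1][2]                       # fallback choice
    detector.calibrate(expected_view_angle=pitch)          # step 830
    return detector.detect(depth_image)                    # step 840
```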
- Therefore the step 820 provides an example of an arrangement in which the processor 120 is configured to select between two or more candidate gesture detection algorithms according to the orientation of the mobile device detected by the orientation sensor.
- The step 830, which can be used with or without the step 820, provides an example in which the processor 120 is configured to vary a calibration parameter of the gesture detection algorithm (or, in the case of the use of the step 820, the selected gesture detection algorithm) according to the orientation of the mobile device as detected by the orientation sensor.
- FIGS. 9 and 10 schematically represent the use of orientation sensors, with particular technical reference to an in-vehicle system. In FIG. 9, an orientation sensor 900 detects the orientation of the mobile device 100 relative to a gravity or downwards vector 910, so as to provide an indication of the orientation of the device relative to the vector 910. This can be performed by, for example, a gravity, acceleration or gyroscopic sensor.
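- As an illustrative sketch of such a detection, the device pitch and roll can be estimated from a gravity reading expressed in device coordinates using the standard formulas below; the axis convention is an assumption for the example.

```python
import math

def pitch_roll_from_gravity(gx, gy, gz):
    """Estimate pitch and roll (degrees) from a gravity reading
    (gx, gy, gz) in device coordinates, i.e. the downward vector 910
    of FIG. 9 as seen by the orientation sensor 900."""
    pitch = math.degrees(math.atan2(-gx, math.hypot(gy, gz)))
    roll = math.degrees(math.atan2(gy, gz))
    return pitch, roll
```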
- Consider then the situation in which the vehicle executes a sharp turn, or accelerates or decelerates very sharply. An example sharp turn is illustrated by an arrow 1000 in FIG. 10. In this situation, the detected gravity vector will tend to skew from actual vertical, for example to an angle represented by a vector 1010. This could in principle be erroneously detected by the process of FIG. 8 as a change in orientation of the mobile device 100, leading to a variation in the gesture detection algorithm as discussed above. Various measures can be provided to avoid this occurring.
- In one example, the processor 120 is configured to vary the gesture detection algorithm less frequently than an image capture rate of the depth camera, for example no more frequently than every n captured images, where n may be, for example, 100.
- In addition, or as an alternative, the processor 120 may be configured to apply a smoothing operation such as a low pass filtering process to variations of the gesture detection algorithm, so that for example the detected orientation is smoothed or low pass filtered (for example, with a time constant of some tens of seconds, for example 60 seconds, which is longer than a typical sharp turn takes to execute in a vehicle) and the smoothed detected orientation is applied to control the variation of the gesture detection algorithm.
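- A minimal sketch of these two measures, combining exponential smoothing (a simple form of low pass filter) with an update interval of n captured images, is given below; the parameter values are illustrative assumptions only.

```python
class SmoothedOrientation:
    """Exponentially smooth raw orientation readings and release an
    update at most once every n captured images, so that a transient
    skew of the gravity vector (FIG. 10) is largely filtered out."""

    def __init__(self, n=100, alpha=0.02):
        self.n = n            # minimum update interval, in captured images
        self.alpha = alpha    # small alpha = long effective time constant
        self.frames = 0
        self.value = None

    def update(self, raw):
        """Feed one reading per captured image; returns the smoothed
        orientation when an update is due, otherwise None."""
        if self.value is None:
            self.value = raw
        else:
            self.value += self.alpha * (raw - self.value)
        self.frames += 1
        if self.frames % self.n == 0:
            return self.value   # apply to the detection algorithm now
        return None             # no algorithm change this frame
```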
- To achieve these arrangements, the processor 120 can act under program instruction control to implement a filter and/or delay operation with respect to the detections of orientation and/or the variations to be applied to the detection algorithm.
- Another example arrangement will be described with reference to FIGS. 11 and 12, again with particular technical reference to an in-vehicle system where the device's environment will tend to be fixed (rather than a free space or room-based system where the environment could be subject to many more variations).
- FIG. 11 schematically represents an example vehicle interior as seen by a depth camera of a mobile device mounted in a generally rearward direction, in a mounting position towards the front of a vehicle. Here, the rear window 1100 and the driver and passenger seats 1110 can be seen (the driver and passenger being omitted for clarity), along with the vehicle doors 1120, 1130. Using these captured images, which relate to items that do not move relative to the camera as long as the camera remains at the same position and orientation (in other words, the background), a so-called optical flow technique can be used to confirm or reject orientation changes detected by the orientation sensor 160. In other words, if an orientation change is detected but it is inconsistent with image motion of the background image, the change can be rejected.
- It is not necessary to detect motion of the entire background; particular in-vehicle features could be used as reference points or markers (such as a rear wiper 1140), or one or more (preferably two or more, spaced apart) beacons such as continuous or pulse-coded infra-red emitters 1150, 1160 could be provided within the vehicle for use as optical flow reference points.
- It can in fact be considered that such an optical flow technique represents one of (or the only) orientation sensing arrangement(s), so that the orientation sensor can be considered to comprise a detector to detect changes in an image location of objects within the vehicle in the images captured by the depth camera.
- For example, the processor may be configured to detect whether a change in orientation detected by the orientation sensor is consistent with a change in image location of one or more objects within the vehicle within the images captured by the depth camera.
- FIG. 12 is a schematic flow chart illustrating an example of such a technique.
- At a step 1200, the processor 120 detects a background portion of the captured images, for example as a portion (which may in practice be the majority of the captured images) which does not substantially change from image to image over a period of, for example, 20 images. A step 1210 represents the detection of an orientation change by the sensor 160, corresponding to the step 800 of FIG. 8.
- At a step 1220, the processor 120 detects whether the detected orientation change by the sensor is consistent with any changes, or lack of changes, detected in the background portion of the captured images. If the outcome is yes, then the detected change by the sensor is accepted and implemented at a step 1230. If the answer is no, then the detected change is either rejected, or deferred for implementation when confirmed later, at a step 1240.
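- Purely as an illustration, the consistency test of the steps 1220 to 1240 might look like the following sketch; the pixels-per-degree calibration and the tolerance are assumed values, and the optical flow measurement of the reference points is supplied externally.

```python
import numpy as np

def orientation_change_is_consistent(sensor_delta_deg, point_flows,
                                     px_per_degree=10.0, tolerance_px=5.0):
    """Step 1220: compare the image motion of background reference
    points (e.g. the wiper 1140 or the emitters 1150, 1160) with the
    motion predicted from the sensor's reported orientation change."""
    observed_px = float(np.mean(np.linalg.norm(np.asarray(point_flows), axis=1)))
    predicted_px = abs(sensor_delta_deg) * px_per_degree
    return abs(observed_px - predicted_px) <= tolerance_px

def apply_orientation_change(sensor_delta_deg, point_flows, commit, defer):
    """Steps 1230/1240: accept and implement a consistent change,
    otherwise reject or defer it pending later confirmation."""
    if orientation_change_is_consistent(sensor_delta_deg, point_flows):
        commit(sensor_delta_deg)   # step 1230
    else:
        defer(sensor_delta_deg)    # step 1240
```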
- FIG. 13 schematically illustrates a mobile device 1300 and a vehicle information and/or entertainment system ("IVI" or "in-vehicle infotainment" [information and/or entertainment] system) 1310. As discussed above, a depth camera 1320 communicates with software 1330 running on the processor 120 of FIG. 1, which is also responsive to signals from sensors 1340 corresponding to the sensors 160 of FIG. 1. The images from the depth camera are subject to filtering 1332, calibration 1334, background removal 1336, segmentation 1338 and hand pose classification 1342 to provide an input to the gesture detection algorithm selection 1344. The gesture detection algorithm includes at least the hand pose classification 1342. Based on the detected gesture, control signals 1350 are communicated, for example by a wireless communication link, to the IVI 1310.
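- As an illustrative sketch (not the disclosed implementation), the FIG. 13 processing chain can be expressed as a simple composition of stages, each stage being an assumed application-supplied callable whose names merely mirror the figure:

```python
def process_frame(depth_image, stages, classify_pose, detect_gesture):
    """Chain the FIG. 13 stages: filtering 1332, calibration 1334,
    background removal 1336 and segmentation 1338, then hand pose
    classification 1342 feeding the detection algorithm selected
    at 1344."""
    for stage in stages:                     # applied in figure order
        depth_image = stage(depth_image)
    hand_pose = classify_pose(depth_image)   # classification 1342
    return detect_gesture(hand_pose)         # selected algorithm 1344
```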
- Note that in other examples, the mobile device could be used simply to capture the depth images for transmission (for example, by the interfaces 150/200) to the vehicle information and/or entertainment system, where the gesture recognition takes place. The mobile device would also need to capture its orientation so that variations of the gesture detection algorithm may be made as discussed above. In this regard, embodiments of the present disclosure encompass a gesture recognition system for a vehicle information and/or entertainment system, the system comprising: a mobile device having a depth camera and an orientation sensor; and a processor configured to detect one or more gestures from images captured by the depth camera according to a gesture detection algorithm; in which the processor is configured to vary the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor. Note that the processor can be physically part of the IVI or the mobile device, and/or the processing tasks can be shared between the two devices.
- FIG. 14 is a schematic flowchart illustrating a method of operation of a mobile device to perform gesture recognition for a vehicle information and/or entertainment system, the method comprising:
- detecting (at a step 1400) images using a depth camera of the mobile device;
- detecting (at a step 1410) an orientation of the mobile device; and
- detecting (at a step 1420) one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the step of detecting one or more gestures comprises varying (at a step 1430) the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- As discussed, the above method can be performed by the processor 120 of the mobile device of FIG. 1, according to software stored in the storage 130 of FIG. 1.
- It will be appreciated that example embodiments can be implemented by computer software operating on a general purpose computing system such as a games machine. In these examples, computer software which, when executed by a computer, causes the computer to carry out any of the methods discussed above is considered as an embodiment of the present disclosure. Similarly, embodiments of the disclosure are provided by a non-transitory, machine-readable storage medium which stores such computer software.
- It will also be apparent that numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practised otherwise than as specifically described herein.
- Respective aspects and features of embodiments of the present disclosure are defined by the following numbered clauses:
- 1. A mobile device configured to perform gesture recognition for a vehicle information and/or entertainment system, the mobile device comprising:
- a depth camera;
- an orientation sensor; and
- a processor configured to detect one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the processor is configured to vary the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- 2. A mobile device according to
clause 1, in which the processor is configured to select between two or more candidate gesture detection algorithms according to the orientation of the mobile device detected by the orientation sensor. - 3. A mobile device according to clause 2, in which the processor is configured to vary a calibration parameter of the selected gesture detection algorithm according to the orientation of the mobile device detected by the orientation sensor.
- 4. A mobile device according to
clause 1, in which the processor is configured to vary a calibration parameter of the gesture detection algorithm according to the orientation of the mobile device detected by the orientation sensor. - 5. A mobile device according to any one of the preceding clauses, in which the depth camera comprises a sensor selected from the list consisting of:
- a time of flight image sensor;
- a stereoscopic camera; and
- a structured light camera.
- 6. A mobile device according to any one of the preceding clauses, in which the orientation sensor comprises one or more sensors selected from the list consisting of:
- a gyroscopic sensor;
- an accelerometer sensor;
- a gravity sensor; and
- a magnetic field sensor.
- 7. A mobile device according to any one of the preceding clauses, in which the processor is configured to detect whether a change in orientation detected by the orientation sensor is consistent with a change in image location of one or more objects within the vehicle within the images captured by the depth camera.
- 8. A mobile device according to any one of
clauses 1 to 5, in which the orientation sensor comprises a detector to detect changes in an image location of objects within the vehicle in the images captured by the depth camera. - 9. A mobile device according to any one of the preceding clauses, in which the processor is configured to vary the gesture detection algorithm less frequently than an image capture rate of the depth camera.
- 10. A mobile device according to any one of the preceding clauses, in which the processor is configured to apply a smoothing to variations of the gesture detection algorithm.
- 11. A mobile device according to any one of the preceding clauses, in which the processor is configured to generate control signals for the vehicle information and/or entertainment system in dependence upon the detected gestures.
- 12. A mobile device according to clause 11, comprising:
- a wireless interface to communicate the control signals with the vehicle information and/or entertainment system.
- 13. Vehicle information and/or entertainment apparatus comprising:
- a mobile device according to any one of
clauses 1 to 11 and operable to generate control signals dependent upon one or more detected gestures; - an information and/or entertainment system configured to receive the control signals and to vary the operation of the information and/or entertainment system according to the received control signals.
- a mobile device according to any one of
- 14. Apparatus according to clause 13, in which the mobile device and the vehicle information and/or entertainment system each comprise a respective wireless interface to communicate the control signals between the mobile device and the vehicle information and/or entertainment system.
- 15. A gesture recognition system for a vehicle information and/or entertainment system, the system comprising:
- a mobile device having a depth camera and an orientation sensor; and
- a processor configured to detect one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the processor is configured to vary the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- 16. A method of operation of a mobile device to perform gesture recognition for a vehicle information and/or entertainment system, the method comprising:
- detecting images using a depth camera of the mobile device;
- detecting an orientation of the mobile device; and
- detecting one or more gestures from images captured by the depth camera according to a gesture detection algorithm;
- in which the step of detecting one or more gestures comprises varying the gesture detection algorithm in dependence upon an orientation of the mobile device detected by the orientation sensor.
- 17. Computer software which, when executed by a processor of a mobile device having a depth camera, causes the mobile device to perform the method of clause 16.
- 18. A non-transitory machine-readable storage medium which stores computer software according to clause 17.
Claims (19)
1-18. (canceled)
19. A device comprising processor circuitry configured to:
acquire a user gesture based on depth information captured with a source of depth information, the user gesture being recognizable as a same gesture from a number of different in-vehicle spatial locations of the source of depth information;
detect a gravitational change in the device by sensor circuitry of the device;
control user gesture detection based on the gravitational change and a set of gestures; and
convey control information based on the user gesture to an interface of circuitry of the vehicle.
20. The device as claimed in claim 19, wherein the user gesture is recognizable by the device from the number of different in-vehicle spatial locations of the source of depth information by varying detection parameters used by the processor circuitry.
21. The device as claimed in claim 20, wherein the detection parameters are controlled to prevent unintentional detection of a change in orientation of the source of the depth information.
22. The device as claimed in claim 20, wherein the detection parameters are controlled to prevent unintentional detection of a change in location of the source of the depth information.
23. The device as claimed in claim 19, wherein the gravitational change relates to a change in acceleration.
24. The device as claimed in claim 23, wherein the change in acceleration is a vehicle directional change.
25. The device as claimed in claim 19, wherein the device is a mobile device.
26. The device as claimed in claim 19, wherein the device is a head mountable apparatus comprising a display.
27. The device as claimed in claim 19, wherein the source of depth information of the device is selected from a list comprising a time of flight sensor; a stereoscopic camera; and a structured light camera emitting a structured light pattern.
28. The device as claimed in claim 19, wherein the user gesture is recognized by the device based on a library, stored in non-transitory storage, of feature points indicative of particular gestures among the set of gestures.
29. The device as claimed in claim 19, wherein the sensor circuitry provides device orientation information.
30. The device as claimed in claim 29, wherein the orientation information is smoothed over a time period to reduce a number of determined orientation changes within the time period, and the determined orientation changes are derived from a plurality of gravitational changes.
31. The device as claimed in claim 30, wherein the time period is less than a time to execute a sharp turn in the vehicle.
32. The device as claimed in claim 19, wherein detection parameters used by the processor circuitry are varied less frequently in time than a capture rate of the depth information from the source of depth information.
33. The device as claimed in claim 19, wherein the gravitational change is determined by a change from vertical of a gravity vector.
34. The device as claimed in claim 33, wherein the sensor circuitry comprises one or more of a gravity sensor, an acceleration sensor, and a gyroscopic sensor.
35. A method comprising:
acquiring a user gesture based on depth information captured from a source of depth information of a device, the user gesture being recognizable as a same gesture from a number of different in-vehicle spatial locations of the source of depth information;
detecting, using sensor circuitry, a gravitational change;
controlling user gesture detection based on the gravitational change and a set of gestures; and
conveying control information based on the user gesture to an interface of circuitry of the vehicle.
36. An in-vehicle information system comprising processor circuitry configured to:
receive information representative of a user gesture based on depth information captured from a source of depth information, the user gesture being recognizable as a same gesture from a number of different in-vehicle spatial locations of the source of depth information;
receive information representing a gravitational change from sensor circuitry;
control user gesture detection based on the gravitational change and a set of gestures; and
convey control information based on the user gesture to an interface of circuitry of the vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/324,881 US20230305636A1 (en) | 2018-01-03 | 2023-05-26 | Gesture recognition using a mobile device |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18150191 | 2018-01-03 | ||
EP18150191.7 | 2018-01-03 | ||
PCT/EP2018/097032 WO2019134888A1 (en) | 2018-01-03 | 2018-12-27 | Gesture recognition using a mobile device |
US202016958024A | 2020-06-25 | 2020-06-25 | |
US18/324,881 US20230305636A1 (en) | 2018-01-03 | 2023-05-26 | Gesture recognition using a mobile device |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2018/097032 Continuation WO2019134888A1 (en) | 2018-01-03 | 2018-12-27 | Gesture recognition using a mobile device |
US16/958,024 Continuation US11662827B2 (en) | 2018-01-03 | 2018-12-27 | Gesture recognition using a mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230305636A1 true US20230305636A1 (en) | 2023-09-28 |
Family
ID=60935708
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/958,024 Active US11662827B2 (en) | 2018-01-03 | 2018-12-27 | Gesture recognition using a mobile device |
US18/324,881 Pending US20230305636A1 (en) | 2018-01-03 | 2023-05-26 | Gesture recognition using a mobile device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/958,024 Active US11662827B2 (en) | 2018-01-03 | 2018-12-27 | Gesture recognition using a mobile device |
Country Status (5)
Country | Link |
---|---|
US (2) | US11662827B2 (en) |
EP (1) | EP3735652A1 (en) |
JP (1) | JP7027552B2 (en) |
CN (1) | CN111417957B (en) |
WO (1) | WO2019134888A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022157090A1 (en) | 2021-01-25 | 2022-07-28 | Sony Semiconductor Solutions Corporation | Electronic device, method and computer program |
DE102021111712A1 (en) | 2021-05-05 | 2022-11-10 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for calibrating a depth sensor in a vehicle interior |
US20230031200A1 (en) * | 2021-07-30 | 2023-02-02 | Jadelynn Kim Dao | Touchless, Gesture-Based Human Interface Device |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100575906B1 (en) * | 2002-10-25 | 2006-05-02 | 미츠비시 후소 트럭 앤드 버스 코포레이션 | Hand pattern switching apparatus |
JP2006143159A (en) | 2004-11-25 | 2006-06-08 | Alpine Electronics Inc | Vehicular motion recognition device |
US8972902B2 (en) * | 2008-08-22 | 2015-03-03 | Northrop Grumman Systems Corporation | Compound gesture recognition |
US8952832B2 (en) * | 2008-01-18 | 2015-02-10 | Invensense, Inc. | Interfacing application programs and motion sensors of a device |
JP2011525283A (en) * | 2008-06-18 | 2011-09-15 | オブロング・インダストリーズ・インコーポレーテッド | Gesture reference control system for vehicle interface |
WO2011066343A2 (en) * | 2009-11-24 | 2011-06-03 | Next Holdings Limited | Methods and apparatus for gesture recognition mode control |
US20110181510A1 (en) | 2010-01-26 | 2011-07-28 | Nokia Corporation | Gesture Control |
JP2012000165A (en) * | 2010-06-14 | 2012-01-05 | Sega Corp | Video game apparatus |
US8760432B2 (en) * | 2010-09-21 | 2014-06-24 | Visteon Global Technologies, Inc. | Finger pointing, gesture based human-machine interface for vehicles |
US9785335B2 (en) * | 2010-12-27 | 2017-10-10 | Sling Media Inc. | Systems and methods for adaptive gesture recognition |
WO2012141352A1 (en) * | 2011-04-13 | 2012-10-18 | Lg Electronics Inc. | Gesture recognition agnostic to device orientation |
CN103890695B (en) * | 2011-08-11 | 2017-10-13 | 视力移动技术有限公司 | Interface system and method based on gesture |
US20130155237A1 (en) * | 2011-12-16 | 2013-06-20 | Microsoft Corporation | Interacting with a mobile device within a vehicle using gestures |
US9223415B1 (en) * | 2012-01-17 | 2015-12-29 | Amazon Technologies, Inc. | Managing resource usage for task performance |
US8942881B2 (en) * | 2012-04-02 | 2015-01-27 | Google Inc. | Gesture-based automotive controls |
DE102012110460A1 (en) * | 2012-10-31 | 2014-04-30 | Audi Ag | A method for entering a control command for a component of a motor vehicle |
JP2014135668A (en) * | 2013-01-11 | 2014-07-24 | Sharp Corp | Portable terminal device |
US20140258942A1 (en) * | 2013-03-05 | 2014-09-11 | Intel Corporation | Interaction of multiple perceptual sensing inputs |
EP2972669B1 (en) * | 2013-03-14 | 2019-07-24 | Intel Corporation | Depth-based user interface gesture control |
US8886399B2 (en) * | 2013-03-15 | 2014-11-11 | Honda Motor Co., Ltd. | System and method for controlling a vehicle user interface based on gesture angle |
EP2891950B1 (en) | 2014-01-07 | 2018-08-15 | Sony Depthsensing Solutions | Human-to-computer natural three-dimensional hand gesture based navigation method |
US10613642B2 (en) * | 2014-03-12 | 2020-04-07 | Microsoft Technology Licensing, Llc | Gesture parameter tuning |
US10199008B2 (en) * | 2014-03-27 | 2019-02-05 | North Inc. | Systems, devices, and methods for wearable electronic devices as state machines |
JP2016081286A (en) * | 2014-10-16 | 2016-05-16 | 株式会社東芝 | Terminal operation support apparatus and terminal operation support method |
US10481696B2 (en) * | 2015-03-03 | 2019-11-19 | Nvidia Corporation | Radar based user interface |
US10747327B2 (en) * | 2016-06-28 | 2020-08-18 | Intel Corporation | Technologies for adaptive downsampling for gesture recognition |
JP6274264B2 (en) | 2016-06-29 | 2018-02-07 | カシオ計算機株式会社 | Portable terminal device and program |
WO2019091491A1 (en) * | 2017-11-13 | 2019-05-16 | Zyetric Gaming Limited | Gesture recognition based on depth information and computer vision |
JP2019219904A (en) * | 2018-06-20 | 2019-12-26 | ソニー株式会社 | Program, recognition apparatus, and recognition method |
- 2018
- 2018-12-27 US US16/958,024 patent/US11662827B2/en active Active
- 2018-12-27 EP EP18830853.0A patent/EP3735652A1/en active Pending
- 2018-12-27 CN CN201880077267.7A patent/CN111417957B/en active Active
- 2018-12-27 WO PCT/EP2018/097032 patent/WO2019134888A1/en unknown
- 2018-12-27 JP JP2020536953A patent/JP7027552B2/en active Active
- 2023
- 2023-05-26 US US18/324,881 patent/US20230305636A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2019134888A1 (en) | 2019-07-11 |
CN111417957A (en) | 2020-07-14 |
JP2021509988A (en) | 2021-04-08 |
EP3735652A1 (en) | 2020-11-11 |
US11662827B2 (en) | 2023-05-30 |
JP7027552B2 (en) | 2022-03-01 |
CN111417957B (en) | 2023-10-27 |
US20210064147A1 (en) | 2021-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230305636A1 (en) | Gesture recognition using a mobile device | |
JP5261554B2 (en) | Human-machine interface for vehicles based on fingertip pointing and gestures | |
US9738158B2 (en) | Motor vehicle control interface with gesture recognition | |
US20190302895A1 (en) | Hand gesture recognition system for vehicular interactive control | |
US9020194B2 (en) | Systems and methods for performing a device action based on a detected gesture | |
KR101459441B1 (en) | System and method for providing a user interface using finger start points shape recognition in a vehicle | |
US10649587B2 (en) | Terminal, for gesture recognition and operation command determination, vehicle having the same and method for controlling the same | |
EP3457270B1 (en) | User gesture recognition | |
CN107533366B (en) | Information display device and information display method | |
KR101490908B1 (en) | System and method for providing a user interface using hand shape trace recognition in a vehicle | |
KR101535032B1 (en) | Method for extending interface in vehicle | |
KR101438615B1 (en) | System and method for providing a user interface using 2 dimension camera in a vehicle | |
KR102084032B1 (en) | User interface, means of transport and method for distinguishing a user | |
US10296101B2 (en) | Information processing system, information processing apparatus, control method, and program | |
US20160140760A1 (en) | Adapting a display on a transparent electronic display | |
US20190004667A1 (en) | System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3d space | |
KR20150000076A (en) | Blind control system for vehicle | |
CN104508598A (en) | A projected virtual input system for a vehicle | |
CN105759955B (en) | Input device | |
JP5800361B2 (en) | Display control device and display device using the same | |
CN113226827B (en) | Operating system with portable interface unit and motor vehicle with operating system | |
KR101976498B1 (en) | System and method for gesture recognition of vehicle | |
US11262849B2 (en) | User interface, a means of transportation and a method for classifying a user gesture performed freely in space | |
JP2016157457A (en) | Operation input device, operation input method and operation input program | |
US20230123623A1 (en) | Gesture detecting apparatus and gesture detecting method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: SONY SEMICONDUCTOR SOLUTIONS CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ARORA, VARUN;PRABHAT, AVASARE;REEL/FRAME:064604/0445
Effective date: 20200408