US20210096651A1 - Vehicle systems and methods for interaction detection - Google Patents

Vehicle systems and methods for interaction detection

Info

Publication number
US20210096651A1
Authority
US
United States
Prior art keywords
processor
data
gesture
image sensor
proximity sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/007,186
Inventor
Itay Katz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyesight Mobile Technologies Ltd
Original Assignee
Eyesight Mobile Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eyesight Mobile Technologies Ltd filed Critical Eyesight Mobile Technologies Ltd
Priority to US17/007,186
Publication of US20210096651A1
Legal status: Abandoned (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3231Monitoring the presence, absence or movement of users
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06K9/00201
    • G06K9/00355
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N5/23222
    • H04N5/23229
    • G06K2209/40
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/12Acquisition of 3D measurements of objects
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • The present disclosure relates to the field of touch-free gesture detection and, more particularly, to devices and computer-readable media for using a proximity sensor and an image sensor to detect touch-free gestures.
  • Permitting a user to interact with a device or an application running on a device is useful in many different settings.
  • Keyboards, mice, and joysticks, for example, are often included with electronic systems to enable a user to input data, manipulate data, and cause a processor of the system to execute a variety of other actions. Increasingly, however, touch-based input devices such as keyboards, mice, and joysticks are being supplemented or replaced by systems that permit touch-free user interaction.
  • a system may include an image sensor to capture images of a user, including, for example, a user's hands and/or fingers.
  • a processor may be configured to receive such images and initiate actions based on touch-free gestures performed by the user.
  • Image sensors often consume relatively large amounts of device resources. For example, an image sensor may require power (often supplied by a battery having a limited charge) and data from the image sensor may need to be analyzed by a processor. Therefore, it may be desirable to limit the resources required by an image sensor while maintaining touch-free gesture detection functionality. Improvements in techniques for detecting and acting upon touch-free gestures are desirable.
  • a dual sensor control device includes at least one processor for receiving information from a proximity sensor and an image sensor.
  • the at least one processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, determine, using the first data, a presence of an object in proximity to the proximity sensor, output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state, receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
  • In another disclosed embodiment, a three-dimensional control device includes at least one processor for receiving information from a proximity sensor and an image sensor.
  • the at least one processor may be configured to receive first data, associated with a detected object, from the proximity sensor while the proximity sensor operates at a first level of power consumption, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor, receive second data, associated with the detected object, from the image sensor while the image sensor operates at a second level of power consumption greater than the first level, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor, and coordinate the first data and the second data to obtain three-dimensional information associated with the detected object.
  • FIG. 1 illustrates an example dual sensor control device that may be used for implementing the disclosed embodiments.
  • FIG. 2 illustrates an example graphical representation of the dual sensor control device, in accordance with some of the disclosed embodiments.
  • FIG. 3 illustrates another example graphical representation of the dual sensor control device, in accordance with some of the disclosed embodiments.
  • FIG. 4 illustrates an example process in accordance with some of the disclosed embodiments.
  • FIG. 5 illustrates another example process in accordance with some of the disclosed embodiments.
  • FIGS. 6A-6F illustrate several graphical representations of gestures.
  • FIG. 7 illustrates a graphical representation of a gesture speed determination.
  • Inventive embodiments may include a dual sensor control device, which may include any device that employs at least two sensors for detecting one or more of proximity, image data, or environmental conditions.
  • FIG. 1 is a diagram illustrating one example of a dual sensor control device 100 that may be used for implementing the disclosed embodiments.
  • the dual sensor control device may be incorporated in, among other things, a mobile phone, smart-glasses, a personal computer (PC), an entertainment device, a set top box, a television, a mobile game machine, a tablet computer, an e-reader, a portable game console, a portable computer such as a laptop or ultrabook, a home appliance such as a kitchen appliance, a communication device, an air conditioning thermostat, a docking station, a game machine such as a mobile video gaming device, a digital camera, a watch, an entertainment device, speakers, a Smart Home device, a media player or media system, a location-based device, a pico projector or an embedded projector, a medical device such as a medical display device, a vehicle, an in-car/in-air infotainment system, a navigation system, a wearable device, an augmented reality-enabled device, wearable goggles, a robot, interactive digital signage, a digital kiosk, a vending machine, an automated teller machine
  • the dual sensor control device may be handheld (e.g., held by a user's hand) or non-handheld.
  • device 100 may include a processor 110 , an image sensor 120 , a proximity sensor 130 , a memory 140 , and a display 150 .
  • Embodiments of the invention may include at least one processor.
  • a processor may include, among other things, any electric circuit that may be configured to perform a logic operation on at least one input variable, including, for example one or more integrated circuits, microchips, microcontrollers, and microprocessors, which may be all or part of a central processing unit (CPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a graphical processing unit (GPU), or any other circuit known to those skilled in the art that may be suitable for executing instructions or performing logic operations.
  • Exemplary processor 110 in FIG. 1 may receive information from a proximity sensor and an image sensor. For example, as depicted in FIG. 1, processor 110 may be connected to at least one image sensor 120 and at least one proximity sensor 130 such that processor 110 may receive information from the at least one image sensor 120 and the at least one proximity sensor 130.
  • device 100 may include a plurality of processors, which may provide different processing capabilities (e.g., dedicated graphics processing) and/or that may provide parallel processing functionality. Multiple functions may be accomplished using a single processor or multiple related and/or unrelated functions may be divided among multiple processors.
  • Embodiments of the invention may involve an image sensor.
  • An image sensor may include, among other things, one or more of a CCD image sensor, a CMOS image sensor, a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a shortwave infrared (SWIR) image sensor, a reflectivity sensor, or any other device that is capable of sensing visual characteristics of an environment.
  • an image sensor may include, for example, a single photosensor or 1-D line sensor capable of scanning an area, a 2-D sensor, or a stereoscopic sensor that includes, for example, a plurality of 2-D image sensors.
  • Image sensor may be associated with a lens for focusing a particular area of light onto the image sensor.
  • Images captured by image sensor may be digitized by image sensor and input to a processor, or may be input to a processor in analog form and digitized by processor.
  • FIG. 1 depicts image sensor 120 as part of device 100 .
  • image sensor 120 may be external to device 100 .
  • Embodiments may also involve a proximity sensor.
  • exemplary proximity sensors may include, among other things, one or more of a capacitive sensor, a capacitive displacement sensor, a laser rangefinder, a sensor that uses time-of-flight (TOF) technology, an IR sensor, a sensor that detects magnetic distortion, or any other sensor that is capable of generating information indicative of the presence of an object in proximity to the proximity sensor.
  • the information generated by proximity sensor may include a distance of the object to the proximity sensor.
  • a proximity sensor may be a single sensor or may be a set of sensors.
  • FIG. 1 depicts proximity sensor 130 as part of device 100 . However, in alternative embodiments, proximity sensor 130 may be external to device 100 .
  • Embodiments may also include memory.
  • Memory may include, for example, one or more of persistent memory, ROM, EEPROM, EAROM, flash memory devices, magnetic disks, magneto optical disks, CD-ROM, DVD-ROM, Blu-ray, and the like, and may contain instructions (i.e., software or firmware) or other data.
  • FIG. 1 depicts memory 140 connected to processor 110 .
  • processor 110 may receive instructions and data stored by memory 140 .
  • processor 110 executes the software or firmware to perform functions by operating on input data and generating output.
  • processor 110 may also be, for example, dedicated hardware or an application-specific integrated circuit (ASIC) that performs processes by operating on input data and generating output.
  • Processor 110 may be any combination of dedicated hardware, one or more ASICs, one or more general purpose processors, one or more DSPs, one or more GPUs, or one or more other processors capable of processing digital information.
  • FIG. 1 depicts memory 140 as part of device 100 . However, in alternative embodiments, memory 140 may be external to device 100 .
  • Embodiments may also include a display.
  • A display may include, for example, one or more of a television set, a computer monitor, a head-mounted display, a broadcast reference monitor, a liquid crystal display (LCD) screen, a light-emitting diode (LED) based display, an LED-backlit LCD display, a cathode ray tube (CRT) display, an electroluminescent (ELD) display, an electronic paper/ink display, a plasma display panel, an organic light-emitting diode (OLED) display, a thin-film transistor (TFT) display, a high-performance addressing (HPA) display, a surface-conduction electron-emitter display, a quantum dot display, an interferometric modulator display, a swept-volume display, a carbon nanotube display, a varifocal mirror display, an emissive volume display, a laser display, a holographic display, a light field display, a projector and surface upon which images are projected, or any other electronic device for outputting visual information.
  • FIG. 2 provides one example of a graphical representation of device 100 .
  • image sensor 120 may be positioned to capture images of an area that may be viewed on display 150 .
  • proximity sensor 130 may be positioned to capture proximity data associated with at least some display-viewable locations.
  • As depicted, image sensor 120 is positioned in a central horizontal location and proximity sensor 130 is positioned to the side of image sensor 120.
  • both image sensor 120 and proximity sensor 130 may be positioned in a central horizontal location or both image sensor 120 and proximity sensor 130 may be positioned in non-central horizontal locations.
  • proximity sensor 130 is non-directional (that is, it may be configured to provide proximity data associated with objects in all directions); in such embodiments, proximity sensor 130 may be positioned anywhere on or in device 100 , or outside of device 100 .
  • FIG. 3 provides another graphical representation of device 100 .
  • Broken lines 310 may be representative of a field of view of image sensor 120 .
  • Broken lines 320 may be representative of a region within which proximity sensor 130 may detect the presence of an object and/or detect the distance to an object.
  • the field of view of image sensor 120 may be narrower than the region within which proximity sensor 130 may detect the presence of an object and/or detect the distance to an object.
  • the field of view of proximity sensor 130 may be narrower than, or the same as, or pointing in a different direction than the field of view of image sensor 120 .
  • FIG. 4 illustrates an exemplary process 400 that processor 110 may be configured to perform.
  • processor 110 of device 100 may be configured to perform these operations by executing software or firmware stored in memory 140 , or may be configured to perform these operations using dedicated hardware or one or more ASICs.
  • At least one processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state.
  • processor 110 may be configured to receive first data from proximity sensor 130 while image sensor 120 is in a first state.
  • The first data may include any information indicative of a presence of an object to be detected.
  • the first data may include one-dimensional data reflective of a presence of an object within the region defined by broken lines 320 within a certain distance of proximity sensor 130 .
  • the one-dimensional data may indicate whether or not an object is within a predefined distance (e.g., 20 centimeters) from proximity sensor 130 .
  • the one-dimensional data may indicate a distance from proximity sensor 130 to a nearest object.
  • the one-dimensional data may indicate whether or not any object is within a field of view of proximity sensor 130 (e.g., within broken lines 320 ).
  • proximity sensor 130 may include a processor that determines whether or not any object is within a field of view of proximity sensor 130 and outputs the determination to processor 110 .
  • proximity sensor 130 may be configured to output to processor 110 an analog signal representative of the presence and/or distance to the object.
  • processor 110 may be configured to convert the analog signal to digital data indicative of the presence and/or distance to the object.
  • processor 110 may apply one or more thresholds to the analog signal to determine the presence and/or distance to the object.
  • processor 110 may receive digital data indicative of the presence and/or distance to the object directly from proximity sensor 130 .
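
For illustration only (not part of the disclosure), the following minimal Python sketch shows how a processor might derive such one-dimensional first data from a digitized proximity reading by applying simple thresholds; the threshold values, the 20-centimeter figure, and the function name are assumptions.

```python
# Hypothetical sketch: turning a digitized proximity reading into the kind of
# one-dimensional first data described above. Thresholds are illustrative.

PRESENCE_THRESHOLD_VOLTS = 0.1   # assumed noise floor of the analog output
PREDEFINED_DISTANCE_CM = 20.0    # the example predefined distance (20 cm)


def first_data_from_reading(analog_volts, estimated_distance_cm):
    """Return a small record of presence and whether the object is within 20 cm."""
    present = analog_volts > PRESENCE_THRESHOLD_VOLTS
    within_predefined = present and estimated_distance_cm <= PREDEFINED_DISTANCE_CM
    return {
        "object_present": present,
        "within_predefined_distance": within_predefined,
        "distance_cm": estimated_distance_cm if present else None,
    }
```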
  • the first data is associated with a first resolution.
  • the first resolution may be a single indication of a presence and/or distance of an object detected by a single sensor (e.g., one-dimensional data).
  • the first resolution may be a plurality of indications of a presence and/or distance of an object.
  • proximity sensor 130 may include a set of proximity sensors (e.g., a set of 16 proximity sensors) that each determine a presence and/or distance to an object within a region defined by broken lines 320 .
  • the first state of the image sensor may include any mode of operation of the image sensor.
  • image sensor 120 may be configured to operate in a first state and a second state, wherein the second state is different than the first state.
  • the first state of image sensor 120 may include at least one of an off state and a first power state.
  • the first power state may be, for example, a state in which image sensor 120 does not consume significant processor and/or battery resources as a byproduct of one or more settings associated with image sensor 120 .
  • the first power state may include a stand-by mode.
  • the first power state may include at least one of a first sampling rate, a first resolution, and a first exposure length.
  • When operating at the first sampling rate, image sensor 120 may be configured to capture only a limited number of images. For example, image sensor 120 may only capture images after an express user request (e.g., after a user selects a picture capture button).
  • a first sampling rate may also include, for example, pictures captured at a low frequency (e.g., image sensor 120 may automatically capture an image once a minute).
  • a first resolution may include a scenario in which some sensors associated with image sensor 120 are deactivated.
  • image sensor 120 may be composed of a grid of 1024 sensors. When operating at the first resolution, image sensor 120 may be configured to only use 16 of the sensors.
  • a first exposure length may include, for example, a long exposure length (e.g., if processor resources are being conserved) or a short exposure length (e.g., if battery resources are being conserved).
  • the first state of image sensor 120 is not necessarily limited to power conservation states.
  • image sensor 120 may be used for a non-gesture detection related purpose (e.g., for photography).
  • the first state of image sensor 120 may be a setting in which image sensor 120 is optimized for the non-gesture detection related purpose.
  • the resolution, frame rate, and exposure length of image sensor 120 may be optimized for the non-gesture detection related purpose.
  • a higher resolution may be used than may be needed for gesture detection.
  • processor 110 is further configured to determine whether the proximity sensor and the image sensor are facing a user based on information received from at least one of an accelerometer and a gyroscope.
  • device 100 may include at least one of an accelerometer and a gyroscope (not shown in FIG. 1 ).
  • Processor 110 may be configured to receive data from the accelerometer and/or gyroscope.
  • Processor 110 may be configured to use the data received from the accelerometer and/or gyroscope to determine a position and movement of device 100 .
  • Memory 140 may store data regarding various possible positions and/or motions and associated possible actions.
  • processor 110 may analyze the determined position and movement of device 100 to determine whether device 100 is moving back and forth (e.g., in a user's pocket) or moved upwards and then stationary (e.g., picked up by a user and then facing the user). Processor 110 may be configured to ignore the first data if a determination is made that the proximity sensor and the image sensor are not facing the user.
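
As one hedged illustration of the accelerometer-based check described above, the sketch below classifies recent motion as either oscillating (e.g., in a pocket) or lifted-and-now-stationary (likely facing the user); the window sizes and thresholds are assumptions and are not taken from the disclosure.

```python
import statistics

# Hypothetical heuristic: decide whether the device was raised and is now held
# still (likely facing the user) or is moving back and forth (e.g., in a
# pocket). All thresholds below are illustrative assumptions.

def likely_facing_user(vertical_accel):
    """vertical_accel: recent vertical acceleration samples in m/s^2 (gravity removed)."""
    if len(vertical_accel) < 10:
        return False                                   # not enough history to decide
    earlier, recent = vertical_accel[:-5], vertical_accel[-5:]
    was_lifted = max(earlier) > 2.0                    # an upward surge was seen earlier
    now_still = statistics.pstdev(recent) < 0.2        # device is currently stationary
    return was_lifted and now_still


def filter_first_data(first_data, vertical_accel):
    """Ignore proximity data when the sensors are unlikely to face the user."""
    return first_data if likely_facing_user(vertical_accel) else None
```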
  • At least one processor may be configured to determine, using the first data, a presence of an object in proximity to the proximity sensor.
  • a processor may be configured to determine object presence by analyzing, in any way that determines object presence, data from a proximity sensor. For example, in operation 420 in FIG. 4 , processor 110 may be configured to determine, using the first data, a presence of an object in proximity to proximity sensor 130 .
  • the object may include, for example, one or more hands, one or more fingers, one or more fingertips, one or more other parts of a hand, or one or more hand-held objects associated with a user.
  • the presence of the object may be determined based on the first data received from proximity sensor 130 .
  • processor 110 may determine the presence of the object anytime the first data indicates an object is within the region defined by broken lines 320 .
  • processor 110 may determine the presence of the object if the first data indicates that an object is within the region defined by broken lines 320 within a predetermined distance of proximity sensor 130 .
  • proximity sensor 130 may generate an analog signal that has a voltage level indicative of a distance to an object. If no object is in proximity to proximity sensor 130, the voltage level may be at, or close to, zero. If an object is located adjacent to the proximity sensor 130, the voltage level may be a maximum value such as, for example, 1 volt. If an object is within a predetermined distance to proximity sensor 130, the voltage level may reflect how close the object is to the proximity sensor 130 (e.g., if the object is at a midway point between the predetermined distance and proximity sensor 130, the voltage level may be at a middle value, such as 0.5 volts).
  • Proximity sensor 130 may include a processor that converts the analog value to distance data.
  • proximity sensor 130 may access a database that maps analog values to distance values.
  • the analog value may be transmitted to processor 110 that may convert the analog value to a distance value.
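
A minimal sketch of the voltage-to-distance conversion described above, assuming a linear relationship between the 0 V (no object) and 1 V (adjacent object) endpoints; the linearity and the 20 cm range are assumptions.

```python
from typing import Optional

# Hypothetical conversion of the analog proximity voltage described above into
# a distance estimate. A linear mapping between 0 V (nothing detected) and
# 1 V (object adjacent) is assumed for illustration.

MAX_VOLTS = 1.0
PREDETERMINED_DISTANCE_CM = 20.0


def voltage_to_distance_cm(volts: float) -> Optional[float]:
    """Return an estimated distance in cm, or None when no object is detected."""
    if volts <= 0.0:
        return None                                  # no object in proximity
    volts = min(volts, MAX_VOLTS)
    # 1 V -> 0 cm (adjacent); 0.5 V -> roughly midway; values scale linearly.
    return (1.0 - volts / MAX_VOLTS) * PREDETERMINED_DISTANCE_CM
```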
  • Processor 110 may be configured to enable gesture control based on the determined presence of the object in proximity to the proximity sensor. For example, as described in more detail below, processor 110 may be configured to change a state of image sensor 120 to permit gesture detection. Moreover, processor 110 may execute, for example, a gesture detection program stored in memory 140 . In some embodiments, processor 110 is also configured to turn on display 150 based on the determined presence of the object in proximity to proximity sensor 130 .
  • At least one processor may be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state.
  • processor 110 may be configured to output, based on the determined presence of the object in proximity to proximity sensor 130 , a signal to image sensor 120 to cause image sensor 120 to enter a second state, different from the first state.
  • processor 110 may output the signal to image sensor 120 if processor 110 determines the presence of the object anywhere within a field of view of a proximity sensor. For example, processor 110 may output the signal to image sensor 120 if processor 110 determines the presence of the object within the region defined by broken lines 320 for a predetermined amount of time. As another example, processor 110 may output the signal to image sensor 120 if processor 110 determines the presence of the object within the region defined by broken lines 320 for any amount of time.
  • the second state of image sensor 120 may include, for example, at least one of an on state and a second power state higher than the first power state. For example, if the first state was an off state or stand-by mode, the second state may be an on state. Similarly, if the first state was a first power state, the second state may be a second power state higher than the first power state.
  • the second state is at least one of a second sampling rate different than the first sampling rate, a second resolution different than the first resolution, and a second exposure length different than the first exposure length.
  • When operating with the second sampling rate, image sensor 120 may be configured to capture images more frequently than at the first sampling rate.
  • Similarly, when operating with the second resolution, image sensor 120 may be configured to use more of the sensors associated with image sensor 120 (e.g., previously deactivated sensors of image sensor 120 may be reactivated).
  • the second exposure length may include, for example, a short exposure length (e.g., if processor resources are being conserved in the first state) or a long exposure length (e.g., if battery resources are being conserved in the first state).
  • the first state of image sensor 120 may be a setting in which image sensor 120 is optimized for a non-gesture detection related purpose that does not pertain to power conservation.
  • the resolution, frame rate, and exposure length of image sensor 120 may be optimized for gesture detection. For example, a resolution sufficient to detect the object, a frame rate sufficient to detect a gesture, and an exposure length sufficient to detect a gesture may be experimentally determined.
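
To make the relationship between the two states concrete, the sketch below pairs illustrative parameter sets for the first (low-power) and second (gesture-detection) states; the numeric values and the apply_state call are assumptions, not values from the disclosure.

```python
from dataclasses import dataclass

# Illustrative parameter sets for the two image-sensor states discussed above.
# The numbers are assumptions chosen only to show the relationship: the second
# state samples faster, uses more of the sensor grid, and exposes for gesture
# detection rather than power conservation.

@dataclass
class SensorState:
    sampling_rate_hz: float   # how often frames are captured
    active_pixels: int        # how many photosites of the sensor grid are used
    exposure_ms: float        # exposure length per frame


FIRST_STATE = SensorState(sampling_rate_hz=1 / 60, active_pixels=16, exposure_ms=30.0)
SECOND_STATE = SensorState(sampling_rate_hz=30.0, active_pixels=1024, exposure_ms=10.0)


def on_presence_detected(image_sensor):
    """Signal the image sensor to enter the second state (hypothetical driver call)."""
    image_sensor.apply_state(SECOND_STATE)
```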
  • processor 110 may be further configured to output the signal to the image sensor to enter the second state before the object reaches a field of view of the image sensor. For example, in some embodiments, processor 110 may output the signal to image sensor 120 immediately, before performing any other operations, after processor 110 determines the presence of the object in proximity to proximity sensor 130 . As another example, memory 140 may store data regarding the amount of time a typical hand gesture would take to reach the field of view of image sensor 120 after being determined to be in proximity to proximity sensor 130 . Processor 110 may be configured to calculate, using the data in memory 140 , when to send the signal so that the signal arrives at image sensor 120 before the hand gesture enters the field of view of image sensor 120 .
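
The timing calculation mentioned above reduces to simple arithmetic; the sketch below assumes illustrative values for the stored transit time, signal latency, and sensor wake-up time.

```python
# Hypothetical timing calculation: send the wake-up signal early enough that
# the image sensor is already in the second state when the hand reaches its
# field of view. All durations below are illustrative assumptions.

TYPICAL_TRANSIT_TIME_S = 0.40   # stored estimate of hand travel time to the image sensor's field of view
SIGNAL_LATENCY_S = 0.01         # time for the signal to reach the image sensor
WAKE_UP_TIME_S = 0.15           # time for the image sensor to enter the second state


def latest_send_time(detection_time_s: float) -> float:
    """Latest moment at which the signal may be sent after presence is detected."""
    return detection_time_s + TYPICAL_TRANSIT_TIME_S - SIGNAL_LATENCY_S - WAKE_UP_TIME_S
```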
  • At least one processor may be configured to receive second data from the image sensor in the second state.
  • processor 110 may be configured to receive second data from image sensor 120 in the second state.
  • the second data may include any information received by a processor from an image sensor.
  • the second data may include two-dimensional data.
  • the two-dimensional data may include, for example, image information captured by image sensor 120 .
  • the image information may include, for example, one or more of an analog image, a digital image, a subset of the analog image, a subset of the digital image, digital information processed by an image signal processor, a mathematical representation or transformation of information associated with data sensed by image sensor 120 , visual information such as one or more frequencies in an image, conceptual information such as presence of one or more objects in the field of view of the image sensor.
  • the image information may also include, for example, information indicative of the state of image sensor 120 , parameters of image sensor 120 such as exposure, frame rate, resolution of an image, color bit resolution, depth resolution, and field of view, information from other sensors during image capture, such as proximity sensor information, accelerometer information, information describing further processing that took place after an image is captured, illumination condition during image capture, features extracted from a digital image by image sensor 120 , or any other information associated with data sensed by image sensor 120 .
  • the image information may include, for example, a single static image or a set of images (e.g., video).
  • the second data is associated with a second resolution greater than the first resolution.
  • When operating with the second resolution, image sensor 120 may be configured to use more of the sensors associated with image sensor 120 (e.g., previously deactivated sensors of image sensor 120 may be reactivated).
  • the second data may include one or more images of the object.
  • the second data may include image information associated with the field of view defined by broken lines 310 . If a user places a hand, finger, or other object within the field of view defined by broken lines 310 , image sensor 120 may capture an image of the hand, finger or other object.
  • Processor 110 may be configured to detect an object in the second data based on, for example, the contour and/or location of an object in the second data. For example, processor 110 may access a filter mask associated with the object and apply the filter mask to the second data to determine if the object is present in the second data. That is, for example, the location in the second data most correlated to the filter mask may be determined as the location of the object associated with the filter mask. Processor 110 may also be configured to access a plurality of different filter masks associated with a plurality of different hand poses. Thus, for example, a filter mask from the plurality of different filter masks that has a best correlation to the image information may cause a determination that the hand pose associated with the filter mask is the hand pose of the object.
  • processor 110 may be configured to detect a central location of a hand and a central location of the tip of each finger of the hand that is visible in the second data.
  • U.S. Pat. Nos. 7,584,113 and 8,199,115 disclose techniques for performing object detection, both of which are incorporated herein by reference in their entirety.
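
As an illustrative, deliberately simple reading of the filter-mask matching described above, the sketch below slides each stored hand-pose mask over an image, scores every position with a normalized correlation, and keeps the best-scoring mask and location; it is not the algorithm of the cited patents, and the brute-force search is used only for clarity.

```python
import numpy as np

# Illustrative filter-mask matching: the location with the highest normalized
# correlation is taken as the object location, and the best-matching mask is
# taken as the hand pose.

def best_match(image: np.ndarray, masks: dict):
    """Return (pose_name, (row, col), score) for the best-correlated mask."""
    best = (None, None, float("-inf"))
    for name, mask in masks.items():
        mh, mw = mask.shape
        norm_mask = (mask - mask.mean()) / (mask.std() + 1e-9)
        for r in range(image.shape[0] - mh + 1):
            for c in range(image.shape[1] - mw + 1):
                window = image[r:r + mh, c:c + mw]
                norm_win = (window - window.mean()) / (window.std() + 1e-9)
                score = float((norm_win * norm_mask).mean())
                if score > best[2]:
                    best = (name, (r, c), score)
    return best
```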
  • device 100 is configured to enable gesture control.
  • processor 110 is further configured to enable gesture control based on the determined presence of the object in proximity to the proximity sensor.
  • the gesture control may be based on, among other things, the second data.
  • the second data is reflective of a gesture.
  • the object may be performing a gesture at one or more times at which image sensor 120 captures an image of the object.
  • the second data may be reflective of the object performing the gesture.
  • a gesture is performed by at least one of a hand and a finger.
  • a gesture may include, for example, a swiping gesture, a pinching gesture of two fingers, a pointing gesture towards an object presented on a display, a left-to-right gesture, a right-to-left gesture, an upwards gesture, a downwards gesture, a pushing gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, a holding gesture associated with an object presented on a display for a predetermined amount of time, a clicking gesture associated with an object presented on a display, a double clicking gesture, a right clicking gesture, a left clicking gesture, a bottom clicking gesture, a top clicking gesture, a grasping gesture, a gesture towards an object presented on a display from a right side, a gesture towards an object presented on a display from a left side, a gesture passing through an object presented on a display, a blast gesture, a tipping gesture,
  • processor 110 is further configured to distinguish between a plurality of predefined gestures. For example, processor 110 may be configured to determine one or more locations associated with the object in the second data to determine the gesture performed by the object. For example, processor 110 may be configured to analyze the second data to determine that a hand of the user moves from a left side of the field of view of image sensor 120 to a right side of the field of view of image sensor 120 and, based on the determination, detect a left-to-right gesture. Similarly, processor 110 may be configured to analyze the second data to determine that one or more finger locations are moving back and forth in a circular motion and, based on the determination, detect a waving gesture. In some embodiments, processor 110 is further configured to determine the gesture based, at least in part, on an analysis of the first data and the second data. For example, processor 110 may be configured to determine the gesture in accordance with the operations described in more detail below for process 500 .
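
A hedged sketch of how a few of the gestures above might be distinguished from a sequence of detected hand locations; the normalized coordinates, thresholds, and gesture labels are assumptions.

```python
# Illustrative classification of a hand trajectory into a few of the gestures
# mentioned above. Positions are assumed normalized so that 0.0 is the left
# edge and 1.0 the right edge of the image sensor's field of view.

def classify_trajectory(x_positions):
    """x_positions: horizontal hand locations across successive frames."""
    if len(x_positions) < 3:
        return "none"
    net_motion = x_positions[-1] - x_positions[0]
    # Count direction reversals to spot back-and-forth (waving-like) motion.
    reversals = sum(
        1
        for a, b, c in zip(x_positions, x_positions[1:], x_positions[2:])
        if (b - a) * (c - b) < 0
    )
    if reversals >= 3:
        return "waving"
    if net_motion > 0.5:
        return "left_to_right"
    if net_motion < -0.5:
        return "right_to_left"
    return "none"
```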
  • At least one processor may be configured to output at least one of a message and a command associated with the second data.
  • processor 110 may be configured to output at least one of a message and a command associated with the second data.
  • processor 110 may be configured to address the message or command to any type of destination including, but not limited to, an operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
  • a message may include, for example, a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device.
  • the message may include a message to an application running on a device, a service running on the device, an operating system running on the device, a process running on the device, one or more applications running on a processor of the device, a software program running in the background of the device, or to one or more services running on the device.
  • the message may also include, for example, responsive to a selection of a graphical element, a message requesting data relating to a graphical element identified in an image from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device.
  • the message may also include, for example, responsive to a selection of a graphical element, a message requesting a data relating to a graphical element identified in an image from an application running on a device, a service running on the device, an operating system running on the device, a process running on the device, one or more applications running on a processor of the device, a software program running in the background of the device, or to one or more services running on the device.
  • a command may include, for example, a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a graphical element identified in an image.
  • a message to a device may be a command. The command may be selected, for example, from a command to run an application on the device, a command to stop an application running on the device or website, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send data relating to a graphical element identified in an image.
  • the second data is reflective of at least a two-dimensional position of the object relative to the image sensor.
  • processor 110 may be configured to determine a gesture based on the second data that may be reflective of at least a two-dimensional position.
  • the at least one of a message and a command is further associated with the two-dimensional position.
  • processor 110 may be configured to determine a particular message or a particular command based on a gesture associated with the two-dimensional position.
  • memory 140 may store a database associating a plurality of different gestures to a plurality of different messages and/or a plurality of different commands.
  • Processor 110 may be configured to select a particular message or command based on the database.
  • a left-to-right gesture may cause processor 110 to select a command to cause a picture shown on display 150 to move to the right.
  • a waving gesture may cause processor 110 to select a command to cause an application running on device 100 to close.
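
The database association described above can be sketched as a simple lookup table; the gesture labels and command identifiers below are hypothetical.

```python
# Hypothetical gesture-to-command lookup in the spirit of the database
# described above. The labels and command names are illustrative only.

GESTURE_COMMANDS = {
    "left_to_right": "move_displayed_picture_right",
    "waving": "close_running_application",
}


def select_command(gesture):
    """Return the command associated with a detected gesture, or None."""
    return GESTURE_COMMANDS.get(gesture)
```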
  • the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor.
  • processor 110 may be configured to determine a gesture based on the first data. For example, processor 110 may be configured to detect a particular gesture if a user places an object in proximity to proximity sensor 130 in accordance with a predefined pattern (e.g., in proximity for a first predetermined time, then out of proximity, and then in proximity for a second predetermined time).
  • the at least one of a message and a command is further associated with the one-dimensional position.
  • processor 110 may be configured to select at least one of a message and a command based on the detected particular gesture.
  • processor 110 may be configured to select at least one of a message and a command by choosing at least one of a message and a command associated with the detected particular gesture in the database.
  • processor 110 may be configured to determine a gesture based on the first data, which may include at least a one-dimensional position of the object, and the second data, which may include at least a two-dimensional position of the object. As discussed in more detail below with respect to process 500 , processor 110 may be configured to determine a gesture based on the first data and the second data. Moreover, as discussed in more detail below with respect to process 500 , in some embodiments the at least one of a message and a command is further associated with the one-dimensional position and the two-dimensional position. For example, processor 110 may be configured to select at least one of a message and a command based on a determined gesture associated with the one-dimensional position and the two-dimensional position.
  • the at least one of a message and a command is further associated with the first data obtained before the object reaches the field of view of the image sensor and the second data obtained after the object reaches the field of view of the image sensor.
  • the first data from proximity sensor 130 obtained before the object (e.g., a hand or finger) reaches the field of view of image sensor 120 may be combined with the second data from image sensor 120 to determine three-dimensional information about the object.
  • Processor 110 may be configured to perform the combination in the manner described below with respect to process 500 .
  • the three-dimensional information may be analyzed to determine a gesture in the manner described below with respect to process 500 .
  • processor 110 is further configured to output a second signal to the image sensor to cause the image sensor to enter a third state after a predefined condition is met.
  • processor 110 may be configured to determine, using data from the proximity sensor, if the object exits a field of view of the proximity sensor, and output, based on the determined exit, a second signal to the image sensor to cause the image sensor to enter the third state.
  • the third state may be, for example, any state that is different than the second state, any of the states described above with respect to the first state, or any state that requires less power than the second state, but more power than the first state.
  • the predefined condition may include, for example a determination that the object exits a field of view of proximity sensor 130 .
  • the predefined condition may include, for example, a button selection that turns off device 100, a determination that the object exits a field of view of image sensor 120, a determination that no object has been detected by proximity sensor 130 for a predetermined amount of time, a button selection that turns off display 150, a predetermined gesture, a predetermined state of the device, a position of the device, an application running in the foreground, or an application running in the background.
  • processor 110 is further configured to output a second signal to the proximity sensor to cause the proximity sensor to change to a different state.
  • the different state of proximity sensor 130 may include at least one of an off state, a state having a changed sampling rate, and a state having a changed power mode.
  • Processor 110 may be configured to output the second signal to proximity sensor 130 to cause proximity sensor 130 to change to a different state based on, for example, a button selection that turns off device 100 or a button selection that turns off display 150 .
  • FIG. 5 illustrates an exemplary process 500 that processor 110 may be configured to perform.
  • processor 110 of device 100 may be configured to perform these operations by executing software or firmware stored in memory 140 , or may be configured to perform these operations using dedicated hardware or one or more ASICs.
  • At least one processor may be configured to receive first data, associated with a detected object, from the proximity sensor while the proximity sensor operates at a first level of power consumption, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor.
  • processor 110 may be configured to receive first data, associated with a detected object, from proximity sensor 130 while proximity sensor 130 operates at a first level of power consumption.
  • the first data may be reflective of at least a one-dimensional position of the object relative to proximity sensor 130 .
  • the detected object may include, for example, any of the objects described above.
  • the detected object may include one or more hands, one or more fingers, one or more fingertips, one or more other parts of a hand, or one or more hand-held objects associated with a user.
  • the first data may include, for example, a one-dimensional position reflective of a presence of an object within the region defined by broken lines 320 within a certain distance of proximity sensor 130 .
  • a one-dimensional position may indicate whether or not an object is within a predefined distance (e.g., 20 centimeters) from proximity sensor 130 .
  • the one-dimensional position may indicate a distance from proximity sensor 130 to a nearest object.
  • the one-dimensional position may reflect the strength of a signal detected by proximity sensor 130 .
  • proximity sensor 130 may be configured to output to processor 110 an analog signal representative of the presence and/or distance to the object.
  • processor 110 may be configured to convert the analog signal to digital data indicative of the presence and/or distance to the object.
  • processor 110 may apply one or more thresholds to the analog signal, or map the analog signal to distances using a database, to determine the presence and/or distance to the object.
  • processor 110 may receive digital data indicative of the presence and/or distance to the object directly from proximity sensor 130 .
  • a one-dimensional position may include, for example, any information reflective of a distance to an object, such as an actual distance from proximity sensor 130 , a relative distance between the object and another location, a range of possible distance, an indication of whether the object is moving away from proximity sensor 130 , an indication of whether the object is moving toward proximity sensor 130 , and the like.
  • the first data is associated with a first resolution.
  • the first resolution may be a single indication of a presence and/or distance of an object detected by a single sensor (e.g., one-dimensional data).
  • the first resolution may be a plurality of indications of a presence and/or distance of an object.
  • proximity sensor 130 may include a set of 16 proximity sensors that each determine a presence and/or distance to an object within a region defined by broken lines 320 .
  • the first level of power consumption associated with proximity sensor 130 may include, for example, a relatively low power consumption.
  • the first level of power consumption may be lower than a power consumption associated with image sensor 120 .
  • each of the sensors associated with proximity sensor 130 may require a particular amount of power to operate.
  • proximity sensor 130 may be configured to operate at various different power consumptions based on a rate of sensing. For example, proximity sensor 130 may be configured to sense an object in proximity to it once every 10 milliseconds, once every second, once every 10 seconds, once every minute, or at any other frequency. In some embodiments, proximity sensor 130 may operate at a first frequency until the object is detected, and then proximity sensor 130 may operate at a second frequency greater than the first frequency.
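
One hedged way to read the two sampling frequencies described above is a polling loop that samples slowly until an object is detected and speeds up afterwards; read_proximity, handle_sample, and the interval values are hypothetical.

```python
import time

# Illustrative polling loop for the two proximity-sensor frequencies described
# above: sample slowly until an object is detected, then sample faster.
# read_proximity() and handle_sample() are hypothetical callables.

IDLE_INTERVAL_S = 1.0       # first, lower sampling frequency
ACTIVE_INTERVAL_S = 0.01    # second, higher frequency once an object is seen


def poll_proximity(read_proximity, handle_sample):
    while True:
        object_detected, distance_cm = read_proximity()
        handle_sample(object_detected, distance_cm)
        interval = ACTIVE_INTERVAL_S if object_detected else IDLE_INTERVAL_S
        time.sleep(interval)
```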
  • At least one processor may be configured to receive second data, associated with the detected object, from the image sensor while the image sensor operates at a second level of power consumption greater than the first level, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor.
  • processor 110 may be configured to receive second data, associated with the detected object, from image sensor 120 while image sensor 120 operates at a second level of power consumption greater than the first level.
  • the second data may be reflective of at least a two-dimensional position of the object relative to image sensor 120 .
  • the second data may include, for example, two-dimensional data.
  • the two-dimensional data may include, for example, image information captured by image sensor 120 .
  • the image information may include, for example, one or more of an analog image, a digital image, a subset of the analog image, a subset of the digital image, digital information processed by an image signal processor, a mathematical representation or transformation of information associated with data sensed by image sensor 120 , visual information such as one or more frequencies in an image, conceptual information such as presence of one or more objects in the field of view of the image sensor.
  • the image information may also include, for example, information indicative of the state of image sensor 120 , parameters of image sensor 120 such as exposure, frame rate, resolution of an image, color bit resolution, depth resolution, and field of view, information from other sensors during image capture, such as proximity sensor information, accelerometer information, information describing further processing that took place after an image is captured, illumination condition during image capture, features extracted from a digital image by image sensor 120 , or any other information associated with data sensed by image sensor 120 .
  • the image information may include, for example, a single static image or a set of images (e.g., video).
  • the second data is associated with a second resolution greater than the first resolution.
  • When operating with the second resolution, image sensor 120 may be configured to use more of the sensors associated with image sensor 120 (e.g., previously deactivated sensors of image sensor 120 may be reactivated).
  • the second data may include one or more images of the object.
  • the second data may include image information associated with the field of view defined by broken lines 310 . If a user places a hand, finger, or other object within the field of view defined by broken lines 310 , image sensor 120 may capture an image of the hand, finger or other object.
  • The first data may be reflective of a distance to the object, while the second data may be reflective of the horizontal and vertical position of the object.
  • the two-dimensional position may include a horizontal position of the object and a vertical position of the object.
  • the horizontal and vertical positions may be, for example, offsets from the position of image sensor 120 , may be relative positions within an image captured by image sensor 120 , or may be some other values reflective of a horizontal position and vertical position of the object.
  • Processor 110 may be configured to detect an object in the second data based on, for example, the contour and/or location of an object in the second data. For example, processor 110 may access a filter mask associated with the object and apply the filter mask to the second data to determine if the object is present in the second data. That is, for example, the location in the second data most correlated to the filter mask may be determined as the location of the object associated with the filter mask. Processor 110 may also be configured to access a plurality of different filter masks associated with a plurality of different hand poses. Thus, for example, a filter mask from the plurality of different filter masks that has a best correlation to the image information may cause a determination that the hand pose associated with the filter mask is the hand pose of the object.
  • processor 110 may be configured to detect a central location of a hand and a central location of the tip of each finger of the hand that is visible in the second data.
  • U.S. Pat. Nos. 7,584,113 and 8,199,115 disclose techniques for performing object detection, both of which are incorporated herein by reference in their entirety.
  • the second level of power consumption associated with image sensor 120 may be higher than the first level of power consumption associated with proximity sensor 130 .
  • image sensor 120 may include a large number of sensors that each require a relatively large amount of power as compared to the one or more sensors of proximity sensor 130, and/or image sensor 120 may operate at a higher frequency (requiring more overall power) than proximity sensor 130.
  • processor 110 is further configured to determine whether the proximity sensor and the image sensor are facing a user based on information received from at least one of an accelerometer and a gyroscope.
  • device 100 may include at least one of an accelerometer and a gyroscope (not shown in FIG. 1 ) and may be configured to receive data from the accelerometer and/or gyroscope.
  • Processor 110 may be configured to use the data received from the accelerometer and/or gyroscope to determine a position and movement of device 100 .
  • Memory 140 may store data regarding various possible positions and/or motions and associated possible actions.
  • processor 110 may analyze the determined position and movement of device 100 to determine whether device 100 is moving back and forth (e.g., in a user's pocket) or moved upwards and then stationary (e.g., picked up by a user and then facing the user). Processor 110 may be configured to ignore the first data and the second data if a determination is made that the proximity sensor and the image sensor are not facing the user.
  • At least one processor may be configured to coordinate the first data and the second data to obtain three-dimensional information associated with the detected object.
  • processor 110 may be configured to coordinate the first data and the second data to obtain three-dimensional information associated with the detected object.
  • the coordination of the first data and the second data may include, for example, generating, for one or more times, three-dimensional information.
  • the first data may include one or more times at which the first data was captured.
  • the second data may include one or more times at which the second data was captured.
  • Processor 110 may be configured, for example, to match information in the first data and information in the second data associated with the same time or substantially the same time (e.g., captured within a time interval in which a hand or finger will exhibit little or no movement). However, processor 110 may also be configured to match information in the first data and information in the second data regardless of time.
  • processor 110 may be configured to determine information regarding the distance to the object from the first data and information regarding the horizontal and vertical position of the object from the second data.
  • Processor 110 may be configured to combine the distance information, the horizontal position information, and the vertical position information to obtain the three-dimensional information.
  • the distance information, horizontal information, and vertical information may each be associated with the location of the object.
  • processor 110 may be configured to determine a plurality of objects in the second data (e.g., a plurality of finger tips and a whole hand).
  • the distance information associated with the first data may be applied to the horizontal and vertical locations of each of the plurality of objects detected in the second data.
  • if proximity sensor 130 includes a resolution sufficient to distinguish between a plurality of objects, different portions of the first data, representative of the distances to the plurality of objects, may be associated with the corresponding horizontal and vertical positions of each of the plurality of objects.
  • processor 110 may be configured to determine a series of three-dimensional positions of the object, determined from a series of information regarding distance, horizontal position, and vertical position captured at a plurality of different times. Each three-dimensional position of the object may be determined, for example, in the manner described above. Moreover, in some embodiments, processor 110 may be configured to determine a series of three-dimensional positions of each of the plurality of objects (e.g., a series of three-dimensional positions associated with a user's finger tips and whole hand).
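  • A minimal sketch of this coordination step is shown below. It assumes the first data is a list of (timestamp, distance) samples and the second data is a list of (timestamp, x, y) samples; an image sample is paired with the nearest-in-time proximity sample when the two were captured close enough together that a hand or finger will exhibit little or no movement. The data layout and pairing tolerance are hypothetical.

    def coordinate(first_data, second_data, tolerance=0.02):
        # first_data:  list of (t, z) proximity samples, z = distance to the object
        # second_data: list of (t, x, y) image samples, x/y = horizontal/vertical position
        # Returns a series of (t, x, y, z) three-dimensional positions.
        result = []
        for t_img, x, y in second_data:
            t_prox, z = min(first_data, key=lambda s: abs(s[0] - t_img))
            if abs(t_prox - t_img) <= tolerance:
                result.append((t_img, x, y, z))
        return result

    # Hypothetical samples: proximity sampled at ~100 Hz, images at ~30 Hz.
    proximity = [(0.00, 0.30), (0.01, 0.29), (0.02, 0.28), (0.03, 0.27), (0.04, 0.26)]
    images = [(0.000, 0.40, 0.55), (0.033, 0.42, 0.54)]
    print(coordinate(proximity, images))
    # [(0.0, 0.4, 0.55, 0.3), (0.033, 0.42, 0.54, 0.27)]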
  • processor 110 is further configured to determine a gesture using the three-dimensional information.
  • the gesture is performed by at least one of a hand and a finger.
  • the gesture may include, for example, a zoom-in gesture, a zoom-out gesture, a clockwise circular gesture, a counter-clockwise circular gesture, a gesture parallel to a display, a gesture non-parallel to a display, a directional gesture in which the object starts close to the device and then moves away, or a directional gesture in which the object starts away from the device and then moves close to the device.
  • FIG. 6A graphically depicts an example of a zoom-in gesture.
  • a zoom-in gesture may include, for example, a movement in which two or more fingers move away from each other as a hand moves toward display 150 .
  • FIG. 6B graphically depicts an example of a zoom-out gesture.
  • a zoom-out gesture may include, for example, a movement in which two or more fingers move towards each other as a hand moves away from display 150. If using only horizontal and vertical position data, it may be difficult to distinguish between a zoom-in gesture and a zoom-out gesture. For example, a user may initially have a plurality of finger tips separated before bringing them together to start a zoom-in gesture; the initial phase may be mistaken for a zoom-out gesture.
  • Processor 110 may be configured to distinguish between a zoom-in gesture and a zoom-out gesture using the three-dimensional information. For example, processor 110 may be configured to determine, using the first data, whether the detected object is moving toward or away from the device, to determine, using the second data, whether one or more fingers associated with the detected object are spread or closed, and to determine at least one of zoom-in gesture and a zoom-out gesture based on the determination of whether the detected object is moving toward or away from the device and the determination of whether the one or more fingers associated with the detected object are spread or closed.
  • For example, if processor 110 determines that the detected object is moving toward the device and the one or more fingers associated with the detected object are initially closed and then spread as the detected object moves toward the device, processor 110 may detect a zoom-in gesture. If processor 110 determines that the detected object is moving away from the device and the one or more fingers associated with the detected object are initially spread and then closed as the detected object moves away from the device, processor 110 may detect a zoom-out gesture.
  • processor 110 may be configured to detect a zoom-in gesture if processor 110 detects fingers that are spread while a hand moves toward the device, initially stay open while the hand moves away from the device, and then close while the hand is away from the device.
  • processor 110 may be configured to detect a zoom-out gesture if processor 110 detects fingers that are spread while a hand moves toward the device, are closed when the hand is near the device, and stay closed as the hand moves backwards away from the device.
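  • The zoom-in/zoom-out distinction described above can be summarized in a short, illustrative Python sketch (not the claimed logic): the approach/retreat trend is taken from the distance samples and the spread/close trend from the image data. The sample values are hypothetical.

    def classify_zoom(samples):
        # samples: chronological list of (distance_to_device, finger_spread) pairs.
        if len(samples) < 2:
            return None
        distance_trend = samples[-1][0] - samples[0][0]   # negative -> moving toward device
        spread_trend = samples[-1][1] - samples[0][1]     # positive -> fingers spreading
        if distance_trend < 0 and spread_trend > 0:
            return "zoom-in"
        if distance_trend > 0 and spread_trend < 0:
            return "zoom-out"
        return None

    # Hypothetical traces (distance in metres, finger spread in image units).
    print(classify_zoom([(0.30, 10), (0.25, 18), (0.20, 26)]))   # zoom-in
    print(classify_zoom([(0.20, 26), (0.25, 18), (0.30, 10)]))   # zoom-out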
  • processor 110 is further configured to determine a circular gesture.
  • a circular gesture may include, for example, a clockwise circular gesture or a counter-clockwise circular gesture.
  • FIG. 6C graphically depicts an example of a clockwise circular gesture.
  • a clockwise circular gesture may include, for example, a gesture in which the object (e.g., a hand or finger) makes a circular, approximately circular, elliptical, or approximately elliptical motion that approaches and moves away from display 150 in a first direction.
  • FIG. 6D graphically depicts an example of a counter-clockwise circular gesture.
  • a counter-clockwise circular gesture may include, for example, a gesture in which the object (e.g., a hand or finger) makes a circular, approximately circular, elliptical, or approximately elliptical motion that approaches and moves away from display 150 in a second direction opposite the first direction. If only two-dimensional horizontal and vertical position data from the second data were analyzed, it may be difficult to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture, since in both scenarios image sensor 120 may see the object moving left-to-right and right-to-left.
  • processor 110 is configured to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture.
  • processor 110 may be configured to analyze the distance of the object to proximity sensor 130 as the object moves to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture. As one example, if processor 110 determines that the object approaches device 100 moving left-to-right, continues to approach device 100 moving right-to-left, then begins to move away from device 100 still moving right-to-left, and then continues to move away from device 100 moving left-to-right, processor 110 may determine a clockwise circular gesture.
  • Conversely, if processor 110 determines that the object approaches device 100 moving right-to-left, continues to approach device 100 moving left-to-right, then begins to move away from device 100 still moving left-to-right, and then continues to move away from device 100 moving right-to-left, processor 110 may determine a counter-clockwise circular gesture.
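  • One illustrative way to express this distinction in code is to treat the horizontal position (from the second data) and the distance to the proximity sensor (from the first data) as a point tracing a loop, and to use the sign of the loop's enclosed (shoelace) area to pick the rotation sense. The sign convention and sample points below are hypothetical.

    def rotation_direction(path):
        # path: chronological (horizontal_position, distance_to_sensor) points tracing one loop.
        # The sign of the shoelace area distinguishes the two rotation senses; which sign is
        # labelled "clockwise" depends on the chosen axis conventions.
        area = 0.0
        for (x1, z1), (x2, z2) in zip(path, path[1:] + path[:1]):
            area += x1 * z2 - x2 * z1
        return "clockwise" if area < 0 else "counter-clockwise"

    # Hypothetical loop matching the clockwise pattern described above: approach while moving
    # left-to-right, approach right-to-left, retreat right-to-left, retreat left-to-right.
    loop = [(0.0, 0.35), (1.0, 0.30), (0.0, 0.25), (-1.0, 0.30)]
    print(rotation_direction(loop))   # clockwise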
  • processor 110 is further configured to determine, using the first data, whether the object performed a gesture parallel to a display. For example, as depicted in FIG. 6E , the object may move parallel to display 150 in a right-to-left direction. In contrast, as depicted in FIG. 6F , the object may move in a right-to-left direction while moving away from proximity sensor 130 . As another example of a non-parallel gesture, the object may move in a right-to-left direction while moving toward proximity sensor 130 . Parallel and non-parallel gestures that follow the same horizontal and vertical path, but a different distance path, may be difficult to distinguish using only the second data.
  • processor 110 may be configured to determine whether the object performed a gesture parallel to display 150 by analyzing the first data. For example, processor 110 may determine a gesture parallel to display 150 if the distance to proximity sensor 130 remains constant, decreases at a constant rate associated with the offset between proximity sensor 130 and display 150, or increases at a constant rate associated with the offset between proximity sensor 130 and display 150.
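  • A simple, hypothetical test of this parallelism check is sketched below: the swipe is treated as parallel to display 150 when the per-sample change in the proximity readings is approximately constant and equal to an expected rate (zero, or a fixed rate accounting for the sensor-to-display offset). The tolerance and sample values are placeholders.

    def is_parallel_to_display(distances, expected_rate=0.0, tolerance=0.01):
        # distances: proximity readings (metres) sampled at equal intervals during a swipe.
        steps = [b - a for a, b in zip(distances, distances[1:])]
        return all(abs(step - expected_rate) <= tolerance for step in steps)

    print(is_parallel_to_display([0.200, 0.201, 0.199, 0.200, 0.200]))   # True: parallel swipe
    print(is_parallel_to_display([0.200, 0.230, 0.260, 0.290, 0.320]))   # False: receding swipe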
  • processor 110 is further configured to determine, using the first data, whether the object performed a tapping gesture. For example, processor 110 may be configured to analyze the first data and the second data to determine whether a hand moves toward the device and then away from the device. If processor 110 determines a hand moves toward the device and then away from the device, processor 110 may detect a tapping gesture.
  • processor 110 is further configured to output at least one of a message and a command associated with the three-dimensional information.
  • a message may include the various messages described above in process 400 .
  • a command may include the various commands described above in process 400 .
  • processor 110 may be configured to output at least one of a message associated with the determined gesture and a command associated with the determined gesture to thereby enable gesture control. For example, if a zoom-in gesture is detected by processor 110 , processor 110 may be configured to output a command to an application displaying a picture on display 150 to enlarge a portion of the picture. The portion of the picture that is enlarged may be based on, for example, a location of the object in relation to display 150 . For example, the portion of the picture that is enlarged may correspond to the location on display 150 that is perpendicular to the object. Similarly, if a zoom-out gesture is detected by processor 110 , processor 110 may be configured to output a command to an application displaying a picture on display 150 to decrease the size of the picture.
  • processor 110 may be configured to output a command to an application displaying a window on display 150 that is larger than display 150 (e.g., a browser showing a website with several pages of content) to scroll in a first direction (e.g., up, down, left, or right).
  • processor 110 may be configured to output a command to an application displaying a window on display 150 that is larger than display 150 to scroll in a second direction different than the first direction (e.g., in a direction opposite to the first direction).
  • processor 110 may be configured to output a command to an application displaying a picture on display 150 to move the picture in the direction of the gesture.
  • For a non-parallel gesture, processor 110 may be configured to act differently. For example, if processor 110 detects a non-parallel gesture in an upwards direction relative to display 150, processor 110 may be configured to output a message to external speakers to increase audio volume.
  • processor 110 may also be configured to output a command to activate an application or service based on a detected gesture.
  • processor 110 may be configured to output a command to select a button associated with an application that is presented on display 150 .
  • processor 110 may be configured to output a command to activate an application associated with the tapping gesture.
  • processor 110 is further configured to determine, using the first data, a distance of the object from the proximity sensor; determine, using the second data, a movement of the object; and determine, using the distance and the movement, a speed of the object.
  • FIG. 7 depicts two exemplary speed determinations.
  • processor 110 may determine that the distance to the object equals z1 and that the object has moved horizontally across a first portion of the field of view of image sensor 120 by an amount x1 during a first time period.
  • Image sensor 120 may provide processor 110 with image sensor 120 's field of view, or processor 110 may access data regarding the field of view from memory 140 .
  • processor 110 may be configured to calculate an actual distance travelled by the object. For example, processor 110 may be configured to determine a total length covered by a field of view at a particular distance, and multiply the total length by the percentage of the field of view that the object covered when moving horizontally. Processor 110 may be configured to divide the actual distance travelled by the first time period to determine a speed of the object v1.
  • processor 110 may determine that the distance to the object equals z2 and that the object has moved horizontally across a first portion of the field of view of image sensor 120 by an amount x2 during a first time period.
  • z2 may be a longer distance than z1.
  • x2 may cover the same portion of the field of view of image sensor 120 as x1.
  • processor 110 may determine that the speed of the object associated with z2 and x2 is v2. Because z2 is greater than z1, in this example v2 is greater than v1.
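  • The scaling of image-plane motion by distance can be made explicit. For a horizontal field of view of angle θ, the width covered at distance z is 2·z·tan(θ/2); the real-world distance travelled is that width multiplied by the fraction of the field of view the object crossed, and the speed is that distance divided by the elapsed time. The Python sketch below uses hypothetical numbers for the field of view, distances, and timing.

    import math

    def object_speed(distance_m, fov_deg, fraction_of_fov, time_s):
        # distance_m: distance to the object, taken from the first (proximity) data.
        # fov_deg: horizontal field of view of the image sensor, in degrees.
        # fraction_of_fov: fraction of the field of view crossed, from the second (image) data.
        width_at_distance = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
        return width_at_distance * fraction_of_fov / time_s

    fov = 60.0           # hypothetical field of view of image sensor 120
    z1, z2 = 0.25, 0.50  # the same image-plane motion observed at two different distances
    v1 = object_speed(z1, fov, fraction_of_fov=0.4, time_s=0.5)
    v2 = object_speed(z2, fov, fraction_of_fov=0.4, time_s=0.5)
    print(round(v1, 3), round(v2, 3))   # v2 is twice v1 because z2 is twice z1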
  • processor 110 may be configured to determine a speed associated with a gesture.
  • processor 110 may be configured to determine at least one of a left-to-right gesture and right-to-left gesture using the three-dimensional information.
  • in a left-to-right gesture and a right-to-left gesture, the object is a hand.
  • Processor 110 may be configured to determine the speed of the object such that the speed is reflective of a hand speed.
  • processor 110 may be configured to determine a circular gesture using the three-dimensional information.
  • in a circular gesture, the object is a finger.
  • Processor 110 may be configured to determine the speed of the object such that the speed is reflective of a finger speed.
  • the message or command may reflect the speed of the object.
  • a relatively fast clockwise circular gesture (e.g., one rotation in less than half a second) may cause processor 110 to output a command to cause a window on display 150 to exhibit a large scroll (e.g., the window may scroll down an entire page for each rotation).
  • a relatively slow clockwise circular gesture may cause processor 110 to output a command to cause a window on display 150 to exhibit a small scroll (e.g., the window may scroll one line for each rotation).
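  • As a toy illustration of such a speed-dependent command (the rotation-time thresholds below are hypothetical), a mapping from rotation time to scroll amount could look like this:

    def scroll_amount_for_rotation(rotation_time_s):
        # Map the time taken for one rotation of a circular gesture to a scroll amount.
        if rotation_time_s < 0.5:            # relatively fast: under half a second per rotation
            return "one page per rotation"
        return "one line per rotation"       # relatively slow rotation

    print(scroll_amount_for_rotation(0.3))   # one page per rotation
    print(scroll_amount_for_rotation(2.0))   # one line per rotation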
  • various improvements on known techniques for performing touch-free gesture detection can be achieved.
  • battery and processor resources of device 100 may be conserved by implementing processes 400 and 500 .
  • improved gesture detection associated with three-dimensional information may be achieved by implementing process 500 .

Abstract

Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, and determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.

Description

    RELATED APPLICATIONS
  • This application claims priority to U.S. provisional application No. 61/782,678, filed Mar. 14, 2013, which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The present disclosure relates to the field of touch-free gesture detection and, more particularly, devices and computer-readable media for using a proximity sensor and an image sensor to detect touch-free gestures.
  • BACKGROUND
  • Permitting a user to interact with a device or an application running on a device is useful in many different settings. For example, keyboards, mice, and joysticks are often included with electronic systems to enable a user to input data, manipulate data, and cause a processor of the system to execute a variety of other actions. Increasingly, however, touch-based input devices, such as keyboards, mice, and joysticks, are being replaced by, or supplemented with, devices that permit touch-free user interaction. For example, a system may include an image sensor to capture images of a user, including, for example, a user's hands and/or fingers. A processor may be configured to receive such images and initiate actions based on touch-free gestures performed by the user.
  • Image sensors often consume relatively large amounts of device resources. For example, an image sensor may require power (often supplied by a battery having a limited charge) and data from the image sensor may need to be analyzed by a processor. Therefore, it may be desirable to limit the resources required by an image sensor while maintaining touch-free gesture detection functionality. Improvements in techniques for detecting and acting upon touch-free gestures are desirable.
  • SUMMARY
  • In one disclosed embodiment, a dual sensor control device is disclosed. The dual sensor control device includes at least one processor for receiving information from a proximity sensor and an image sensor. The at least one processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, determine, using the first data, a presence of an object in proximity to the proximity sensor, output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state, receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
  • In another disclosed embodiment, a three-dimensional control device is disclosed. The three-dimensional control device includes at least one processor for receiving information from a proximity sensor and an image sensor. The at least one processor may be configured to receive first data, associated with a detected object, from the proximity sensor while the proximity sensor operates at a first level of power consumption, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor, receive second data, associated with the detected object, from the image sensor while the image sensor operates at a second level of power consumption greater than the first level, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor, and coordinate the first data and the second data to obtain three-dimensional information associated with the detected object.
  • Additional aspects related to the embodiments will be set forth in part in the description which follows, and in part will be understood from the description, or may be learned by practice of the disclosed embodiments.
  • It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an example dual sensor control device that may be used for implementing the disclosed embodiments.
  • FIG. 2 illustrates an example graphical representation of the dual sensor control device, in accordance with some of the disclosed embodiments.
  • FIG. 3 illustrates another example graphical representation of the dual sensor control device, in accordance with some of the disclosed embodiments.
  • FIG. 4 illustrates an example process in accordance with some of the disclosed embodiments.
  • FIG. 5 illustrates another example process in accordance with some of the disclosed embodiments.
  • FIGS. 6A-6F illustrate several graphical representations of gestures.
  • FIG. 7 illustrates a graphical representation of a gesture speed determination.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to the example embodiments, which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
  • Inventive embodiments may include a dual sensor control device, which may include any device that employs at least two sensors for detecting one or more of proximity, image data, or environmental conditions. FIG. 1 is a diagram illustrating one example of a dual sensor control device 100 that may be used for implementing the disclosed embodiments. The dual sensor control device may be incorporated in, among other things, a mobile phone, smart-glasses, a personal computer (PC), an entertainment device, a set top box, a television, a mobile game machine, a tablet computer, an e-reader, a portable game console, a portable computer such as a laptop or ultrabook, a home appliance such as a kitchen appliance, a communication device, an air conditioning thermostat, a docking station, a game machine such as a mobile video gaming device, a digital camera, a watch, an entertainment device, speakers, a Smart Home device, a media player or media system, a location-based device, a pico projector or an embedded projector, a medical device such as a medical display device, a vehicle, an in-car/in-air infotainment system, a navigation system, a wearable device, an augmented reality-enabled device, wearable goggles, a robot, interactive digital signage, a digital kiosk, a vending machine, an automated teller machine (ATM), or any other apparatus that may receive data from a user or output data to a user. Moreover, the dual sensor control device may be handheld (e.g., held by a user's hand) or non-handheld. As shown in FIG. 1, device 100 may include a processor 110, an image sensor 120, a proximity sensor 130, a memory 140, and a display 150.
  • Embodiments of the invention may include at least one processor. As used herein, a processor may include, among other things, any electric circuit that may be configured to perform a logic operation on at least one input variable, including, for example one or more integrated circuits, microchips, microcontrollers, and microprocessors, which may be all or part of a central processing unit (CPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a graphical processing unit (GPU), or any other circuit known to those skilled in the art that may be suitable for executing instructions or performing logic operations. Exemplary processor 110 in FIG. 1, may receive information from a proximity sensor and an image sensor. For example, as depicted in FIG. 1, processor 110 may be connected to at least one image sensor 120 and at least one proximity sensor 130 such that processor 110 may receive information from the at least one image sensor 120 and the at least one proximity sensor 130. Although one processor is shown in FIG. 1, device 100 may include a plurality of processors, which may provide different processing capabilities (e.g., dedicated graphics processing) and/or that may provide parallel processing functionality. Multiple functions may be accomplished using a single processor or multiple related and/or unrelated functions may be divided among multiple processors.
  • Embodiments of the invention may involve an image sensor. An image sensor may include, among other things, one or more of a CCD image sensor, a CMOS image sensor, a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a shortwave infrared (SWIR) image sensor, a reflectivity sensor, or any other device that is capable of sensing visual characteristics of an environment. Moreover, an image sensor may include, for example, a single photosensor or 1-D line sensor capable of scanning an area, a 2-D sensor, or a stereoscopic sensor that includes, for example, a plurality of 2-D image sensors. Image sensor may be associated with a lens for focusing a particular area of light onto the image sensor. Images captured by image sensor may be digitized by image sensor and input to a processor, or may be input to a processor in analog form and digitized by processor. By way of example, FIG. 1 depicts image sensor 120 as part of device 100. However, in alternative embodiments, image sensor 120 may be external to device 100.
  • Embodiments may also involve a proximity sensor. Exemplary proximity sensors may include, among other things, one or more of a capacitive sensor, a capacitive displacement sensor, a laser rangefinder, a sensor that uses time-of-flight (TOF) technology, an IR sensor, a sensor that detects magnetic distortion, or any other sensor that is capable of generating information indicative of the presence of an object in proximity to the proximity sensor. In some embodiments, the information generated by proximity sensor may include a distance of the object to the proximity sensor. A proximity sensor may be a single sensor or may be a set of sensors. FIG. 1 depicts proximity sensor 130 as part of device 100. However, in alternative embodiments, proximity sensor 130 may be external to device 100.
  • Embodiments may also include memory. Memory may include, for example, one or more of persistent memory, ROM, EEPROM, EAROM, flash memory devices, magnetic disks, magneto optical disks, CD-ROM, DVD-ROM, Blu-ray, and the like, and may contain instructions (i.e., software or firmware) or other data. By way of example, FIG. 1 depicts memory 140 connected to processor 110. Generally, processor 110 may receive instructions and data stored by memory 140. Thus, in some embodiments, processor 110 executes the software or firmware to perform functions by operating on input data and generating output. However, processor 110 may also be, for example, dedicated hardware or an application-specific integrated circuit (ASIC) that performs processes by operating on input data and generating output. Processor 110 may be any combination of dedicated hardware, one or more ASICs, one or more general purpose processors, one or more DSPs, one or more GPUs, or one or more other processors capable of processing digital information. FIG. 1 depicts memory 140 as part of device 100. However, in alternative embodiments, memory 140 may be external to device 100.
  • Embodiments may also include a display. A display may include, for example, one or more of a television set, computer monitor, head-mounted display, broadcast reference monitor, a liquid crystal display (LCD) screen, a light-emitting diode (LED) based display, an LED-backlit LCD display, a cathode ray tube (CRT) display, an electroluminescent (ELD) display, an electronic paper/ink display, a plasma display panel, an organic light-emitting diode (OLED) display, thin-film transistor display (TFT), High-Performance Addressing display (HPA), a surface-conduction electron-emitter display, a quantum dot display, an interferometric modulator display, a swept-volume display, a carbon nanotube display, a varifocal mirror display, an emissive volume display, a laser display, a holographic display, a light field display, a projector and surface upon which images are projected, or any other electronic device for outputting visual information. A display may include or be part of a touch screen. FIG. 1 depicts display 150 as part of device 100. However, in alternative embodiments, display 150 may be external to device 100. Moreover, in some embodiments, display 150 may be omitted entirely.
  • FIG. 2 provides one example of a graphical representation of device 100. As depicted in FIG. 2, image sensor 120 may be positioned to capture images of an area that may be viewed on display 150. Likewise, proximity sensor 130 may be positioned to capture proximity data associated with at least some display-viewable locations. In the example in FIG. 2, image sensor 120 is positioned in a central horizontal location and proximity sensor 130 is positioned to the side of image sensor 120. In other embodiments, however, both image sensor 120 and proximity sensor 130 may be positioned in a central horizontal location or both image sensor 120 and proximity sensor 130 may be positioned in non-central horizontal locations. Moreover, in some embodiments proximity sensor 130 is non-directional (that is, it may be configured to provide proximity data associated with objects in all directions); in such embodiments, proximity sensor 130 may be positioned anywhere on or in device 100, or outside of device 100.
  • FIG. 3 provides another graphical representation of device 100. Broken lines 310 may be representative of a field of view of image sensor 120. Broken lines 320 may be representative of a region within which proximity sensor 130 may detect the presence of an object and/or detect the distance to an object. As can be seen in FIG. 3, in some embodiments the field of view of image sensor 120 may be narrower than the region within which proximity sensor 130 may detect the presence of an object and/or detect the distance to an object. However, in other embodiments, the field of view of proximity sensor 130 may be narrower than, or the same as, or pointing in a different direction than the field of view of image sensor 120.
  • FIG. 4 illustrates an exemplary process 400 that processor 110 may be configured to perform. For example, as discussed above, processor 110 of device 100 may be configured to perform these operations by executing software or firmware stored in memory 140, or may be configured to perform these operations using dedicated hardware or one or more ASICs.
  • In some embodiments, at least one processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state. For example, in connection with operation 410 in FIG. 4, processor 110 may be configured to receive first data from proximity sensor 130 while image sensor 120 is in a first state.
  • The first data, which may be received by processor 110, may include any information indicative of a presence of an object to be detected. For example, the first data may include one-dimensional data reflective of a presence of an object within the region defined by broken lines 320 within a certain distance of proximity sensor 130. For example, in some embodiments, the one-dimensional data may indicate whether or not an object is within a predefined distance (e.g., 20 centimeters) from proximity sensor 130. In other embodiments, the one-dimensional data may indicate a distance from proximity sensor 130 to a nearest object. In other embodiments, the one-dimensional data may indicate whether or not any object is within a field of view of proximity sensor 130 (e.g., within broken lines 320). For example, in some embodiments, proximity sensor 130 may include a processor that determines whether or not any object is within a field of view of proximity sensor 130 and outputs the determination to processor 110.
  • In some embodiments, proximity sensor 130 may be configured to output to processor 110 an analog signal representative of the presence and/or distance to the object. In such embodiments, processor 110 may be configured to convert the analog signal to digital data indicative of the presence and/or distance to the object. For example, processor 110 may apply one or more thresholds to the analog signal to determine the presence and/or distance to the object. In other embodiments, processor 110 may receive digital data indicative of the presence and/or distance to the object directly from proximity sensor 130.
  • In some embodiments, the first data is associated with a first resolution. For example, the first resolution may be a single indication of a presence and/or distance of an object detected by a single sensor (e.g., one-dimensional data). In other embodiments, the first resolution may be a plurality of indications of a presence and/or distance of an object. For example, proximity sensor 130 may include a set of proximity sensors (e.g., a set of 16 proximity sensors) that each determine a presence and/or distance to an object within a region defined by broken lines 320.
  • The first state of the image sensor may include any mode of operation of the image sensor. As described in more detail below, image sensor 120 may be configured to operate in a first state and a second state, wherein the second state is different than the first state. For example, in some embodiments, the first state of image sensor 120 may include at least one of an off state and a first power state. The first power state may be, for example, a state in which image sensor 120 does not consume significant processor and/or battery resources as a byproduct of one or more settings associated with image sensor 120. For example, the first power state may include a stand-by mode. As another example, the first power state may include at least one of a first sampling rate, a first resolution, and a first exposure length. With a first sampling rate, image sensor 120 may be configured to capture only a limited number of images. For example, image sensor 120 may only capture images after an express user request (e.g., after a user selects a picture capture button). A first sampling rate may also include, for example, pictures captured at a low frequency (e.g., image sensor 120 may automatically capture an image once a minute). Similarly, a first resolution may include a scenario in which some sensors associated with image sensor 120 are deactivated. For example, image sensor 120 may be composed of a grid of 1024 sensors. When operating at the first resolution, image sensor 120 may be configured to only use 16 of the sensors. A first exposure length may include, for example, a long exposure length (e.g., if processor resources are being conserved) or a short exposure length (e.g., if battery resources are being conserved).
  • The first state of image sensor 120 is not necessarily limited to power conservation states. For example, image sensor 120 may be used for a non-gesture detection related purpose (e.g., for photography). The first state of image sensor 120 may be a setting in which image sensor 120 is optimized for the non-gesture detection related purpose. For example, the resolution, frame rate, and exposure length of image sensor 120 may be optimized for the non-gesture detection related purpose. As one example, if image sensor 120 is used for photography, a higher resolution may be used than may be needed for gesture detection.
  • In some embodiments, processor 110 is further configured to determine whether the proximity sensor and the image sensor are facing a user based on information received from at least one of an accelerometer and a gyroscope. For example, device 100 may include at least one of an accelerometer and a gyroscope (not shown in FIG. 1). Processor 110 may be configured to receive data from the accelerometer and/or gyroscope. Processor 110 may be configured to use the data received from the accelerometer and/or gyroscope to determine a position and movement of device 100. Memory 140 may store data regarding various possible positions and/or motions and associated possible actions. For example, processor 110 may analyze the determined position and movement of device 100 to determine whether device 100 is moving back and forth (e.g., in a user's pocket) or moved upwards and then stationary (e.g., picked up by a user and then facing the user). Processor 110 may be configured to ignore the first data if a determination is made that the proximity sensor and the image sensor are not facing the user.
  • In some embodiments, at least one processor may be configured to determine, using the first data, a presence of an object in proximity to the proximity sensor. Consistent with embodiments of the invention, a processor may be configured to determine object presence by analyzing, in any way that determines object presence, data from a proximity sensor. For example, in operation 420 in FIG. 4, processor 110 may be configured to determine, using the first data, a presence of an object in proximity to proximity sensor 130.
  • The object may include, for example, one or more hands, one or more fingers, one or more fingertips, one or more other parts of a hand, or one or more hand-held objects associated with a user. The presence of the object may be determined based on the first data received from proximity sensor 130. In some embodiments, processor 110 may determine the presence of the object anytime the first data indicates an object is within the region defined by broken lines 320. In other embodiments, processor 110 may determine the presence of the object if the first data indicates that an object is within the region defined by broken lines 320 within a predetermined distance of proximity sensor 130.
  • As one example of the analysis of the first data, proximity sensor 130 may generate an analog signal that has a voltage level indicative of a distance to an object. If no object is in proximity to proximity sensor 130, the voltage level may be at, or close to, zero. If an object is located adjacent to the proximity sensor 130, the voltage level may be a maximum value such as, for example, 1 volt. If an object is within a predetermined distance to proximity sensor 130, the voltage level may reflect how close the object is to the proximity sensor 130 (e.g., if the object is at a midway point between the predetermined distance and proximity sensor 130, the voltage level may be at a middle value (e.g., 0.5 volts)). Proximity sensor 130 may include a processor that converts the analog value to distance data. For example, proximity sensor 130 may access a database that maps analog values to distance values. Alternatively, the analog value may be transmitted to processor 110, which may convert the analog value to a distance value.
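  • A minimal Python sketch of such a conversion is shown below, assuming a linear mapping between voltage and distance; the voltage range, sensing range, and presence threshold are hypothetical placeholders.

    def voltage_to_distance(voltage, max_voltage=1.0, max_range_m=0.20):
        # 0 V (or less) means no object in range; max_voltage means the object is adjacent
        # to the sensor; intermediate voltages are mapped linearly to distance.
        if voltage <= 0.0:
            return None
        voltage = min(voltage, max_voltage)
        return (1.0 - voltage / max_voltage) * max_range_m

    def object_present(voltage, presence_threshold_m=0.20):
        distance = voltage_to_distance(voltage)
        return distance is not None and distance <= presence_threshold_m

    print(voltage_to_distance(0.5))   # 0.1 -> midway point of a 0.20 m sensing range
    print(object_present(0.05))       # True: a weak signal still maps inside the range
    print(object_present(0.0))        # False: no signal, no object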
  • Processor 110 may be configured to enable gesture control based on the determined presence of the object in proximity to the proximity sensor. For example, as described in more detail below, processor 110 may be configured to change a state of image sensor 120 to permit gesture detection. Moreover, processor 110 may execute, for example, a gesture detection program stored in memory 140. In some embodiments, processor 110 is also configured to turn on display 150 based on the determined presence of the object in proximity to proximity sensor 130.
  • In some embodiments, at least one processor may be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. For example, in connection with operation 430 in FIG. 4, processor 110 may be configured to output, based on the determined presence of the object in proximity to proximity sensor 130, a signal to image sensor 120 to cause image sensor 120 to enter a second state, different from the first state.
  • For example, processor 110 may output the signal to image sensor 120 if processor 110 determines the presence of the object anywhere within a field of view of a proximity sensor. For example, processor 110 may output the signal to image sensor 120 if processor 110 determines the presence of the object within the region defined by broken lines 320 for a predetermined amount of time. As another example, processor 110 may output the signal to image sensor 120 if processor 110 determines the presence of the object within the region defined by broken lines 320 for any amount of time.
  • The second state of image sensor 120 may include, for example, at least one of an on state and a second power state higher than the first power state. For example, if the first state was an off state or stand-by mode, the second state may be an on state. Similarly, if the first state was a first power state, the second state may be a second power state higher than the first power state.
  • In some embodiments, the second state is at least one of a second sampling rate different than the first sampling rate, a second resolution different than the first resolution, and a second exposure length different than the first exposure length. For example, when operating with the second sampling rate, image sensor 120 may be configured to capture images more frequently than the first sampling rate. Similarly, when operating with the second resolution, image sensor 120 may be configured to use more sensors associated with image sensor 120 (e.g., the previously deactivated sensor of image sensor 120 may be reactivated). The second exposure length may include, for example, a short exposure length (e.g., if processor resources are being conserved in the first state) or a long exposure length (e.g., if battery resources are being conserved in the first state).
  • Moreover, as described above, the first state of image sensor 120 may be a setting in which image sensor 120 is optimized for a non-gesture detection related purpose that does not pertain to power conservation. In the second state, the resolution, frame rate, and exposure length of image sensor 120 may be optimized for gesture detection. For example, a resolution sufficient to detect the object, a frame rate sufficient to detect a gesture, and an exposure length sufficient to detect a gesture may be experimentally determined.
  • In some embodiments, processor 110 may be further configured to output the signal to the image sensor to enter the second state before the object reaches a field of view of the image sensor. For example, in some embodiments, processor 110 may output the signal to image sensor 120 immediately, before performing any other operations, after processor 110 determines the presence of the object in proximity to proximity sensor 130. As another example, memory 140 may store data regarding the amount of time a typical hand gesture would take to reach the field of view of image sensor 120 after being determined to be in proximity to proximity sensor 130. Processor 110 may be configured to calculate, using the data in memory 140, when to send the signal so that the signal arrives at image sensor 120 before the hand gesture enters the field of view of image sensor 120.
  • In some embodiments, at least one processor may be configured to receive second data from the image sensor in the second state. For example, in connection with operation 440 in FIG. 4, processor 110 may be configured to receive second data from image sensor 120 in the second state.
  • The second data may include any information received by a processor from an image sensor. For example, the second data may include two-dimensional data. The two-dimensional data may include, for example, image information captured by image sensor 120. The image information may include, for example, one or more of an analog image, a digital image, a subset of the analog image, a subset of the digital image, digital information processed by an image signal processor, a mathematical representation or transformation of information associated with data sensed by image sensor 120, visual information such as one or more frequencies in an image, conceptual information such as presence of one or more objects in the field of view of the image sensor. In some embodiments, the image information may also include, for example, information indicative of the state of image sensor 120, parameters of image sensor 120 such as exposure, frame rate, resolution of an image, color bit resolution, depth resolution, and field of view, information from other sensors during image capture, such as proximity sensor information, accelerometer information, information describing further processing that took place after an image is captured, illumination condition during image capture, features extracted from a digital image by image sensor 120, or any other information associated with data sensed by image sensor 120. The image information may include, for example, a single static image or a set of images (e.g., video).
  • In some embodiments, the second data is associated with a second resolution greater than the first resolution. For example, as discussed above, when operating with the second resolution, image sensor 120 may be configured to use more sensors associated with image sensor 120 (e.g., the previously deactivated sensor of image sensor 120 may be reactivated).
  • The second data may include one or more images of the object. For example, the second data may include image information associated with the field of view defined by broken lines 310. If a user places a hand, finger, or other object within the field of view defined by broken lines 310, image sensor 120 may capture an image of the hand, finger or other object.
  • Processor 110 may be configured to detect an object in the second data based on, for example, the contour and/or location of an object in the second data. For example, processor 110 may access a filter mask associated with the object and apply the filter mask to the second data to determine if the object is present in the second data. That is, for example, the location in the second data most correlated to the filter mask may be determined as the location of the object associated with the filter mask. Processor 110 may also be configured to access a plurality of different filter masks associated with a plurality of different hand poses. Thus, for example, a filter mask from the plurality of different filter masks that has a best correlation to the image information may cause a determination that the hand pose associated with the filter mask is the hand pose of the object. Other techniques for detecting real-world objects in image information (e.g., edge matching, greyscale matching, gradient matching, and other image feature based methods) are known in the art, and may also be used to detect the object in the second data. Moreover, the same techniques may be used to detect a plurality of objects in the second data. For example, processor 110 may be configured to detect a central location of a hand and a central location of a tip of each finger of the hand that is visible in the second data. For example, U.S. Pat. Nos. 7,584,113 and 8,199,115 disclose techniques for performing object detection, both of which are incorporated herein by reference in their entirety.
  • In some embodiments, device 100 is configured to enable gesture control. For example, in some embodiments, processor 110 is further configured to enable gesture control based on the determined presence of the object in proximity to the proximity sensor. The gesture control may be based on, among other things, the second data. In some embodiments, the second data is reflective of a gesture. For example, the object may be performing a gesture at one or more times at which image sensor 120 captures an image of the object. Thus, the second data may be reflective of the object performing the gesture. In some embodiments, a gesture is performed by at least one of a hand and a finger. A gesture may include, for example, a swiping gesture, a pinching gesture of two fingers, a pointing gesture towards an object presented on a display, a left-to-right gesture, a right-to-left gesture, an upwards gesture, a downwards gesture, a pushing gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, a holding gesture associated with an object presented on a display for a predetermined amount of time, a clicking gesture associated with an object presented on a display, a double clicking gesture, a right clicking gesture, a left clicking gesture, a bottom clicking gesture, a top clicking gesture, a grasping gesture, a gesture towards an object presented on a display from a right side, a gesture towards an object presented on a display from a left side, a gesture passing through an object presented on a display, a blast gesture, a tipping gesture, a clockwise or counterclockwise two-finger grasping gesture over an object presented on a display, a click-drag-release gesture, a gesture sliding an icon such as a volume bar, or any other motion associated with a hand, finger, or other object.
  • In some embodiments, processor 110 is further configured to distinguish between a plurality of predefined gestures. For example, processor 110 may be configured to determine one or more locations associated with the object in the second data to determine the gesture performed by the object. For example, processor 110 may be configured to analyze the second data to determine that a hand of the user moves from a left side of the field of view of image sensor 120 to a right side of the field of view of image sensor 120 and, based on the determination, detect a left-to-right gesture. Similarly, processor 110 may be configured to analyze the second data to determine that one or more finger locations are moving back and forth in a circular motion and, based on the determination, detect a waving gesture. In some embodiments, processor 110 is further configured to determine the gesture based, at least in part, on an analysis of the first data and the second data. For example, processor 110 may be configured to determine the gesture in accordance with the operations described in more detail below for process 500.
  • In some embodiments, at least one processor may be configured to output at least one of a message and a command associated with the second data. For example, in connection with operation 450 of FIG. 4, processor 110 may be configured to output at least one of a message and a command associated with the second data. For example, processor 110 may be configured to address the message or command to any type of destination including, but not limited to, an operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
  • A message may include, for example, a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device. Moreover, for example, the message may include a message to an application running on a device, a service running on the device, an operating system running on the device, a process running on the device, one or more applications running on a processor of the device, a software program running in the background of the device, or to one or more services running on the device.
  • The message may also include, for example, responsive to a selection of a graphical element, a message requesting data relating to a graphical element identified in an image from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device. The message may also include, for example, responsive to a selection of a graphical element, a message requesting data relating to a graphical element identified in an image from an application running on a device, a service running on the device, an operating system running on the device, a process running on the device, one or more applications running on a processor of the device, a software program running in the background of the device, or to one or more services running on the device.
  • A command may include, for example, a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a graphical element identified in an image. A message to a device may be a command. The command may be selected, for example, from a command to run an application on the device, a command to stop an application running on the device or website, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send data relating to a graphical element identified in an image.
  • As discussed above, in some embodiments, the second data is reflective of at least a two-dimensional position of the object relative to the image sensor. Moreover, as discussed above, processor 110 may be configured to determine a gesture based on the second data that may be reflective of at least a two-dimensional position. In some embodiments, the at least one of a message and a command is further associated with the two-dimensional position. For example, processor 110 may be configured to determine a particular message or a particular command based on a gesture associated with the two-dimensional position. For example, memory 140 may store a database associating a plurality of different gestures to a plurality of different messages and/or a plurality of different commands. Processor 110 may be configured to select a particular message or command based on the database. For example, a left-to-right gesture may cause processor 110 to select a command to cause a picture shown on display 150 to move to the right. A waving gesture may cause processor 110 to select a command to cause an application running on device 100 to close.
  • As discussed above, in some embodiments the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor. Moreover, as discussed above, processor 110 may be configured to determine a gesture based on the first data. For example, processor 110 may be configured to detect a particular gesture if a user places an object in proximity to proximity sensor 130 in accordance with a predefined pattern (e.g., in proximity for a first predetermined time, then out of proximity, and then in proximity for a second predetermined time). In some embodiments, the at least one of a message and a command is further associated with the one-dimensional position. For example, processor 110 may be configured to select at least one of a message and a command based on the detected particular gesture. For example, processor 110 may be configured to select at least one of a message and a command by choosing at least one of a message and a command associated with the detected particular gesture in the database.
  • As discussed above, in some embodiments processor 110 may be configured to determine a gesture based on the first data, which may include at least a one-dimensional position of the object, and the second data, which may include at least a two-dimensional position of the object. As discussed in more detail below with respect to process 500, processor 110 may be configured to determine a gesture based on the first data and the second data. Moreover, as discussed in more detail below with respect to process 500, in some embodiments the at least one of a message and a command is further associated with the one-dimensional position and the two-dimensional position. For example, processor 110 may be configured to select at least one of a message and a command based on a determined gesture associated with the one-dimensional position and the two-dimensional position.
  • In some embodiments, the at least one of a message and a command is further associated with the first data obtained before the object reaches the field of view of the image sensor and the second data obtained after the object reaches the field of view of the image sensor. For example, the first data from proximity sensor 130 obtained before the object (e.g., a hand or finger) reaches the field of view of image sensor 120 may be combined with the second data from image sensor 120 to determine three-dimensional information about the object. Processor 110 may be configured to perform the combination in the manner described below with respect to process 500. The three-dimensional information may be analyzed to determine a gesture in the manner described below with respect to process 500.
  • In some embodiments, processor 110 is further configured to output a second signal to the image sensor to cause the image sensor to enter a third state after a predefined condition is met. For example, processor 110 may be configured to determine, using data from the proximity sensor, if the object exits a field of view of the proximity sensor, and output, based on the determined exit, a second signal to the image sensor to cause the image sensor to enter the third state. The third state may be, for example, any state that is different than the second state, any of the states described above with respect to the first state, or any state that requires less power than the second state but more power than the first state. The predefined condition may include, for example, a determination that the object exits a field of view of proximity sensor 130. Other examples of the predefined condition include a button selection that turns off device 100, a determination that the object exits a field of view of image sensor 120, a determination that no object has been detected by proximity sensor 130 for a predetermined amount of time, a button selection that turns off display 150, a predetermined gesture, a predetermined state of the device, a position of the device, an application running in the foreground, and an application running in the background.
  • In some embodiments, processor 110 is further configured to output a second signal to the proximity sensor to cause the proximity sensor to change to a different state. For example, the different state of proximity sensor 130 may include at least one of an off state, a state having a changed sampling rate, and a state having a changed power mode. Processor 110 may be configured to output the second signal to proximity sensor 130 to cause proximity sensor 130 to change to a different state based on, for example, a button selection that turns off device 100 or a button selection that turns off display 150.
  • FIG. 5 illustrates an exemplary process 500 that processor 110 may be configured to perform. For example, as discussed above, processor 110 of device 100 may be configured to perform these operations by executing software or firmware stored in memory 140, or may be configured to perform these operations using dedicated hardware or one or more ASICs.
  • In some embodiments, at least one processor may be configured to receive first data, associated with a detected object, from the proximity sensor while the proximity sensor operates at a first level of power consumption, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor. For example, in connection with operation 510 of FIG. 5, processor 110 may be configured to receive first data, associated with a detected object, from proximity sensor 130 while proximity sensor 130 operates at a first level of power consumption. The first data may be reflective of at least a one-dimensional position of the object relative to proximity sensor 130.
  • The detected object may include, for example, any of the objects described above. For example, the detected object may include one or more hands, one or more fingers, one or more fingertips, one or more other parts of a hand, or one or more hand-held objects associated with a user. As described above with reference to process 400, the first data may include, for example, a one-dimensional position reflective of a presence of an object within the region defined by broken lines 320 within a certain distance of proximity sensor 130. For example, in some embodiments, a one-dimensional position may indicate whether or not an object is within a predefined distance (e.g., 20 centimeters) from proximity sensor 130. In other embodiments, the one-dimensional position may indicate a distance from proximity sensor 130 to a nearest object. In some embodiments, the one-dimensional position may reflect the strength of a signal detected by proximity sensor 130.
  • As described above, in some embodiments, proximity sensor 130 may be configured to output to processor 110 an analog signal representative of the presence and/or distance to the object. In such embodiments, processor 110 may be configured to convert the analog signal to digital data indicative of the presence and/or distance to the object. For example, processor 110 may apply one or more thresholds to the analog signal, or map the analog signal to distances using a database, to determine the presence and/or distance to the object. In other embodiments, processor 110 may receive digital data indicative of the presence and/or distance to the object directly from proximity sensor 130. A one-dimensional position may include, for example, any information reflective of a distance to an object, such as an actual distance from proximity sensor 130, a relative distance between the object and another location, a range of possible distances, an indication of whether the object is moving away from proximity sensor 130, an indication of whether the object is moving toward proximity sensor 130, and the like.
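  • One possible way to perform the threshold- or table-based conversion described above is sketched below; the calibration values and presence threshold are assumptions chosen only for illustration.
```python
# Hypothetical calibration table mapping normalized signal level to distance (cm).
CALIBRATION = [(0.9, 5), (0.6, 10), (0.3, 20), (0.1, 40)]

def analog_to_distance(signal_level, presence_threshold=0.1):
    """Convert an analog proximity reading into (present, approximate_distance_cm)."""
    if signal_level < presence_threshold:
        return False, None                      # no object within range
    for level, distance_cm in CALIBRATION:
        if signal_level >= level:
            return True, distance_cm            # stronger signal -> nearer object
    return True, CALIBRATION[-1][1]
```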
  • In some embodiments the first data is associated with a first resolution. For example, the first resolution may be a single indication of a presence and/or distance of an object detected by a single sensor (e.g., one-dimensional data). In other embodiments, the first resolution may be a plurality of indications of a presence and/or distance of an object. For example, proximity sensor 130 may include a set of 16 proximity sensors that each determine a presence and/or distance to an object within a region defined by broken lines 320.
  • The first level of power consumption associated with proximity sensor 130 may include, for example, a relatively low power consumption. For example, as described below, the first level of power consumption may be lower than a power consumption associated with image sensor 120. For example, each of the sensors associated with proximity sensor 130 may require a particular amount of power to operate. Moreover, proximity sensor 130 may be configured to operate at various different power consumptions based on a rate of sensing. For example, proximity sensor 130 may be configured to sense an object in proximity to it once every 10 milliseconds, once every second, once every 10 seconds, once every minute, or at any other frequency. In some embodiments, proximity sensor 130 may operate at a first frequency until the object is detected, and then proximity sensor 130 may operate at a second frequency greater than the first frequency.
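  • The adaptive sampling behavior described above might be approximated as follows; the driver call read_proximity() and the two rates are hypothetical and are not taken from the specification.
```python
import time

def poll_proximity(read_proximity, slow_hz=1.0, fast_hz=100.0):
    """Poll slowly until an object is detected, then switch to a faster sampling rate."""
    object_present = False
    while True:
        present, _distance = read_proximity()   # hypothetical driver call
        object_present = present                # raise or lower the rate accordingly
        time.sleep(1.0 / (fast_hz if object_present else slow_hz))
```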
  • In some embodiments, at least one processor may be configured to receive second data, associated with the detected object, from the image sensor while the image sensor operates at a second level of power consumption greater than the first level, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor. For example, in connection with operation 520 of FIG. 5, processor 110 may be configured to receive second data, associated with the detected object, from image sensor 120 while image sensor 120 operates at a second level of power consumption greater than the first level. The second data may be reflective of at least a two-dimensional position of the object relative to image sensor 120.
  • As described above with reference to process 400, the second data may include, for example, two-dimensional data. The two-dimensional data may include, for example, image information captured by image sensor 120. The image information may include, for example, one or more of an analog image, a digital image, a subset of the analog image, a subset of the digital image, digital information processed by an image signal processor, a mathematical representation or transformation of information associated with data sensed by image sensor 120, visual information such as one or more frequencies in an image, or conceptual information such as the presence of one or more objects in the field of view of the image sensor. In some embodiments, the image information may also include, for example, information indicative of the state of image sensor 120; parameters of image sensor 120 such as exposure, frame rate, image resolution, color bit resolution, depth resolution, and field of view; information from other sensors during image capture, such as proximity sensor information or accelerometer information; information describing further processing that took place after an image was captured; illumination conditions during image capture; features extracted from a digital image by image sensor 120; or any other information associated with data sensed by image sensor 120. The image information may include, for example, a single static image or a set of images (e.g., video).
  • In some embodiments, the second data is associated with a second resolution greater than the first resolution. For example, as discussed above, when operating with the second resolution, image sensor 120 may be configured to use more sensors associated with image sensor 120 (e.g., the previously deactivated sensor of image sensor 120 may be reactivated).
  • The second data may include one or more images of the object. For example, the second data may include image information associated with the field of view defined by broken lines 310. If a user places a hand, finger, or other object within the field of view defined by broken lines 310, image sensor 120 may capture an image of the hand, finger or other object. Whereas the first data may be reflective of a distance to the object, the second data may be reflective of the horizontal and vertical position of the object. Thus, the two-dimensional position may include a horizontal position of the object and a vertical position of the object. The horizontal and vertical positions may be, for example, offsets from the position of image sensor 120, may be relative positions within an image captured by image sensor 120, or may be some other values reflective of a horizontal position and vertical position of the object.
  • Processor 110 may be configured to detect an object in the second data based on, for example, the contour and/or location of an object in the second data. For example, processor 110 may access a filter mask associated with the object and apply the filter mask to the second data to determine if the object is present in the second data. That is, for example, the location in the second data most correlated to the filter mask may be determined as the location of the object associated with the filter mask. Processor 110 may also be configured to access a plurality of different filter masks associated with a plurality of different hand poses. Thus, for example, a filter mask from the plurality of different filter masks that has a best correlation to the image information may cause a determination that the hand pose associated with the filter mask is the hand pose of the object. Other techniques for detecting real-world objects in image information (e.g., edge matching, greyscale matching, gradient matching, and other image feature based methods) are known in the art, and may also be used to detect the object in the second data. Moreover, the same techniques may be used to detect a plurality of objects in the second data. For example, processor 110 may be configured to detect a central location of a hand and a central location of the tip of each finger of the hand that is visible in the second data. For example, U.S. Pat. Nos. 7,584,113 and 8,199,115 disclose techniques for performing object detection, both of which are incorporated herein by reference in their entirety.
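  • As a rough, non-authoritative sketch of the filter-mask approach described above (one possible reading of the text, not the patented method), a normalized cross-correlation can be slid over the image for each stored hand-pose mask and the best-scoring pose and location kept:
```python
import numpy as np

def best_pose_and_location(image, masks):
    """image: 2-D grayscale array; masks: dict mapping pose name -> 2-D template."""
    best_pose, best_loc, best_score = None, None, -np.inf
    for pose, mask in masks.items():
        mh, mw = mask.shape
        m = (mask - mask.mean()) / (mask.std() + 1e-9)       # zero-mean, unit-variance mask
        for r in range(image.shape[0] - mh + 1):
            for c in range(image.shape[1] - mw + 1):
                window = image[r:r + mh, c:c + mw]
                w = (window - window.mean()) / (window.std() + 1e-9)
                score = float((w * m).mean())                # normalized cross-correlation
                if score > best_score:
                    best_pose, best_loc, best_score = pose, (r, c), score
    return best_pose, best_loc, best_score
```
  • In practice a frequency-domain or image-pyramid implementation would be used for speed; the brute-force loops above are only meant to make the correlation step explicit.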
  • The second level of power consumption associated with image sensor 120 may be higher than the first level of power consumption associated with proximity sensor 130. For example, image sensor 120 may include a large number of sensors that each require a relatively large amount of power as compared to the one or more sensors of proximity sensor 130 and/or may operate at a high frequency (requiring more overall power) than proximity sensor 130.
  • In some embodiments, processor 110 is further configured to determine whether the proximity sensor and the image sensor are facing a user based on information received from at least one of an accelerometer and a gyroscope. For example, as described above, device 100 may include at least one of an accelerometer and a gyroscope (not shown in FIG. 1) and may be configured to receive data from the accelerometer and/or gyroscope. Processor 110 may be configured to use the data received from the accelerometer and/or gyroscope to determine a position and movement of device 100. Memory 140 may store data regarding various possible positions and/or motions and associated possible actions. For example, processor 110 may analyze the determined position and movement of device 100 to determine whether device 100 is moving back and forth (e.g., in a user's pocket) or moved upwards and then stationary (e.g., picked up by a user and then facing the user). Processor 110 may be configured to ignore the first data and the second data if a determination is made that the proximity sensor and the image sensor are not facing the user.
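  • A hedged sketch of one such orientation check, assuming recent accelerometer readings with the z-axis pointing out of the display and thresholds chosen purely for illustration:
```python
import math

def likely_facing_user(accel_samples, gravity=9.81):
    """accel_samples: recent (ax, ay, az) readings in m/s^2.
    Heuristic: the device is roughly still and its screen points away from gravity."""
    ax, ay, az = (sum(axis) / len(accel_samples) for axis in zip(*accel_samples))
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    is_still = abs(magnitude - gravity) < 1.5      # not bouncing around in a pocket
    screen_up = az > 0.3 * gravity                 # screen axis roughly opposes gravity
    return is_still and screen_up
```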
  • In some embodiments, at least one processor may be configured to coordinate the first data and the second data to obtain three-dimensional information associated with the detected object. For example, in connection with operation 530 of FIG. 5, processor 110 may be configured to coordinate the first data and the second data to obtain three-dimensional information associated with the detected object. The coordination of the first data and the second data may include, for example, generating, for one or more times, three-dimensional information. For example, the first data may include one or more times at which the first data was captured. Likewise, the second data may include one or more times at which the second data was captured. Processor 110 may be configured, for example, to match information in the first data and information in the second data associated with the same time or substantially the same time (e.g., captured within a time interval in which a hand or finger will exhibit little or no movement). However, processor 110 may also be configured to match information in the first data and information in the second data regardless of time.
  • As discussed above, processor 110 may be configured to determine information regarding the distance to the object from the first data and information regarding the horizontal and vertical position of the object from the second data. Processor 110 may be configured to combine the distance information, the horizontal position information, and the vertical position information to obtain the three-dimensional information. For example, the distance information, horizontal information, and vertical information may each be associated with the location of the object. Moreover, as discussed above, processor 110 may be configured to determine a plurality of objects in the second data (e.g., a plurality of finger tips and a whole hand). In some embodiments, the distance information associated with the first data may be applied to the horizontal and vertical locations of each of the plurality of objects detected in the second data. However, if proximity sensor 130 includes a resolution sufficient to distinguish between a plurality of objects, different portions of the first data representative of the distance to a plurality of the objects may be associated with corresponding horizontal and vertical positions of each of the plurality of objects.
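  • The coordination step might be sketched as follows, pairing each image-based (x, y) sample with the proximity distance captured closest in time; the sample formats and the matching window are assumptions made for the example.
```python
def coordinate(first_data, second_data, max_dt=0.05):
    """first_data: list of (t, distance); second_data: list of (t, x, y).
    Returns (t, x, y, distance) tuples, matching samples captured at nearly the same time."""
    three_d = []
    for t, x, y in second_data:
        nearest = min(first_data, key=lambda sample: abs(sample[0] - t), default=None)
        if nearest is not None and abs(nearest[0] - t) <= max_dt:
            three_d.append((t, x, y, nearest[1]))   # combine 2-D position with distance
    return three_d
```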
  • In some embodiments, processor 110 may be configured to determine a series of three-dimensional positions of the object, determined from a series of information regarding distance, horizontal position, and vertical position captured at a plurality of different times. Each three-dimensional position of the object may be determined, for example, in the manner described above. Moreover, in some embodiments, processor 110 may be configured to determine a series of three-dimensional positions of each of the plurality of objects (e.g., a series of three-dimensional positions associated with a user's finger tips and whole hand).
  • In some embodiments, processor 110 is further configured to determine a gesture using the three-dimensional information. For example, the gesture may be performed by at least one of a hand and a finger. In addition to the gestures described above in process 400, the gesture may include, for example, a zoom-in gesture, a zoom-out gesture, a clockwise circular gesture, a counter-clockwise circular gesture, a gesture parallel to a display, a gesture non-parallel to a display, a directional gesture in which the object starts close to the device and then moves away, or a directional gesture in which the object starts away from the device and then moves close to the device.
  • FIG. 6A graphically depicts an example of a zoom-in gesture. A zoom-in gesture may include, for example, a movement in which two or more fingers move away from each other as a hand moves toward display 150. FIG. 6B graphically depicts an example of a zoom-out gesture. A zoom-out gesture may include, for example, a movement in which two or more fingers move toward each other as a hand moves away from display 150. If using only horizontal and vertical position data, it may be difficult to distinguish between a zoom-in gesture and a zoom-out gesture. For example, a user may initially have a plurality of finger tips separated before bringing them together to start a zoom-in gesture; this initial phase may be confused with a zoom-out gesture. Similarly, a user may initially have a plurality of finger tips touching before separating them to start a zoom-out gesture; this initial phase may be confused with a zoom-in gesture. Processor 110 may be configured to distinguish between a zoom-in gesture and a zoom-out gesture using the three-dimensional information. For example, processor 110 may be configured to determine, using the first data, whether the detected object is moving toward or away from the device, to determine, using the second data, whether one or more fingers associated with the detected object are spread or closed, and to determine at least one of a zoom-in gesture and a zoom-out gesture based on the determination of whether the detected object is moving toward or away from the device and the determination of whether the one or more fingers associated with the detected object are spread or closed. For example, if processor 110 determines that the detected object is moving toward the device and the one or more fingers associated with the detected object are initially closed and then spread as the detected object moves toward the device, processor 110 may detect a zoom-in gesture. If processor 110 determines that the detected object is moving away from the device and the one or more fingers associated with the detected object are initially spread and then closed as the detected object moves away from the device, processor 110 may detect a zoom-out gesture.
  • As another example, processor 110 may be configured to detect a zoom-in gesture if processor 110 detects fingers that are spread while a hand moves toward the device, initially stay open while the hand moves away from the device, and then close while the hand is away from the device. As another example, processor 110 may be configured to detect a zoom-out gesture if processor 110 detects fingers that are spread while a hand moves toward the device, are closed while the hand is near the device, and stay closed as the hand moves backwards away from the device.
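  • A compact sketch of the zoom disambiguation described above, combining the proximity trend (toward or away) with the finger-spread trend from the image data; the sample format and the simple decision rule shown are assumptions rather than the claimed implementation.
```python
def classify_zoom(samples):
    """samples: ordered (distance_cm, finger_spread_cm) tuples covering one motion."""
    if len(samples) < 2:
        return None
    moving_toward = samples[-1][0] < samples[0][0]     # distance to the device decreased
    spreading = samples[-1][1] > samples[0][1]         # fingers opened over the motion
    if moving_toward and spreading:
        return "zoom_in"       # hand approaches while fingers open
    if not moving_toward and not spreading:
        return "zoom_out"      # hand retreats while fingers close
    return None                # ambiguous motion
```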
  • In some embodiments, processor 110 is further configured to determine a circular gesture. A circular gesture may include, for example, a clockwise circular gesture or a counter-clockwise circular gesture. FIG. 6C graphically depicts an example of a clockwise circular gesture. A clockwise circular gesture may include, for example, a gesture in which the object (e.g., a hand or finger) makes a circular, approximately circular, elliptical, or approximately elliptical motion that approaches and moves away from display 150 in a first direction. FIG. 6D graphically depicts an example of a counter-clockwise circular gesture. A counter-clockwise circular gesture may include, for example, a gesture in which the object (e.g., a hand or finger) makes a circular, approximately circular, elliptical, or approximately elliptical motion that approaches and moves away from display 150 in a second direction opposite the first direction. If only two-dimensional horizontal and vertical position data from the second data were analyzed, it may be difficult to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture, since in both scenarios image sensor 120 may see the object moving left-to-right and right-to-left. That is, the same left-to-right and right-to-left motion may be exhibited in both a clockwise circular gesture and a counter-clockwise circular gesture. In some embodiments, however, processor 110 is configured to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture. In particular, processor 110 may be configured to analyze the distance of the object to proximity sensor 130 as the object moves to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture. As one example, if processor 110 determines that the object approaches device 100 moving left-to-right, continues to approach device 100 moving right-to-left, then begins to move away from device 100 still moving right-to-left, and then continues to move away from device 100 moving left-to-right, processor 110 may determine a clockwise circular gesture. If processor 110 determines that the object moves away from device 100 moving left-to-right, continues to move away from device 100 moving right-to-left, then begins to move toward device 100 still moving right-to-left, and then continues to move toward device 100 moving left-to-right, processor 110 may determine a counter-clockwise circular gesture.
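  • One possible way to capture the clockwise/counter-clockwise distinction is to examine the signed area traced by the gesture in the (horizontal position, distance) plane, as sketched below; the sign convention and sample format are assumptions, not part of the disclosure.
```python
def circular_direction(samples):
    """samples: ordered (x_horizontal, z_distance) points along one circular gesture."""
    area = 0.0
    n = len(samples)
    for i in range(n):
        x1, z1 = samples[i]
        x2, z2 = samples[(i + 1) % n]     # close the loop back to the first point
        area += x1 * z2 - x2 * z1         # shoelace term
    if abs(area) < 1e-6:
        return None                        # degenerate or not circular
    return "clockwise" if area < 0 else "counter_clockwise"
```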
  • In some embodiments, processor 110 is further configured to determine, using the first data, whether the object performed a gesture parallel to a display. For example, as depicted in FIG. 6E, the object may move parallel to display 150 in a right-to-left direction. In contrast, as depicted in FIG. 6F, the object may move in a right-to-left direction while moving away from proximity sensor 130. As another example of a non-parallel gesture, the object may move in a right-to-left direction while moving toward proximity sensor 130. Parallel and non-parallel gestures that follow the same horizontal and vertical path, but a different distance path, may be difficult to distinguish using only the second data. However, processor 110 may be configured to determine whether the object performed a gesture parallel to display 150 by analyzing the first data. For example, processor 110 may determine a gesture parallel to display 150 if the distance to proximity sensor 130 remains constant, decreases at a constant rate associated with the offset between proximity sensor 130 and display 150, or increases at a constant rate associated with the offset between proximity sensor 130 and display 150.
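  • For the constant-distance case described above, the parallel-gesture test might look like the following; the tolerance is an assumption, and the constant-rate cases would need an additional linear-fit check not shown here.
```python
def is_parallel_to_display(distance_samples, tolerance_cm=2.0):
    """distance_samples: proximity distances (cm) recorded while the gesture is performed.
    Treat the gesture as parallel if the distance stays essentially constant."""
    return max(distance_samples) - min(distance_samples) <= tolerance_cm
```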
  • In some embodiments, processor 110 is further configured to determine, using the first data, whether the object performed a tapping gesture. For example, processor 110 may be configured to analyze the first data and the second data to determine whether a hand moves toward the device and then away from the device. If processor 110 determines a hand moves toward the device and then away from the device, processor 110 may detect a tapping gesture.
  • In some embodiments, processor 110 is further configured to output at least one of a message and a command associated with the three-dimensional information. A message may include the various messages described above in process 400. Likewise, a command may include the various commands described above in process 400.
  • In some embodiments, processor 110 may be configured to output at least one of a message associated with the determined gesture and a command associated with the determined gesture to thereby enable gesture control. For example, if a zoom-in gesture is detected by processor 110, processor 110 may be configured to output a command to an application displaying a picture on display 150 to enlarge a portion of the picture. The portion of the picture that is enlarged may be based on, for example, a location of the object in relation to display 150. For example, the portion of the picture that is enlarged may correspond to the location on display 150 that is perpendicular to the object. Similarly, if a zoom-out gesture is detected by processor 110, processor 110 may be configured to output a command to an application displaying a picture on display 150 to decrease the size of the picture.
  • As another example, if a clockwise circular gesture is detected by processor 110, processor 110 may be configured to output a command to an application displaying a window on display 150 that is larger than display 150 (e.g., a browser showing a website with several pages of content) to scroll in a first direction (e.g., up, down, left, or right). Similarly, if a counter-clockwise circular gesture is detected by processor 110, processor 110 may be configured to output a command to an application displaying a window on display 150 that is larger than display 150 to scroll in a second direction different than the first direction (e.g., in a direction opposite to the first direction).
  • As another example, if a gesture parallel to display 150 is detected by processor 110, processor 110 may be configured to output a command to an application displaying a picture on display 150 to move the picture in the direction of the gesture. However, if processor 110 detects a non-parallel gesture rather than the parallel gesture, even if in the same direction as the parallel gesture, processor 110 may be configured to act differently. For example, if processor 110 detects a non-parallel gesture in an upwards direction relative to display 150, processor 110 may be configured to output a message to external speakers to increase audio volume. As another example, processor 110 may also be configured to output a command to activate an application or service based on a detected gesture.
  • As another example, if a tapping gesture is detected by processor 110, processor 110 may be configured to output a command to select a button associated with an application that is presented on display 150. As another example, if a tapping gesture is detected by processor 110, processor 110 may be configured to output a command to activate an application associated with the tapping gesture.
  • In some embodiments, processor 110 is further configured to determine, using the first data, a distance of the object from the proximity sensor; determine, using the second data, a movement of the object; and determine, using the distance and the movement, a speed of the object. FIG. 7 depicts two exemplary speed determinations. As a first example, processor 110 may determine that the distance to the object equals z1 and that the object has moved horizontally across a first portion of the field of view of image sensor 120 by an amount x1 during a first time period. Image sensor 120 may provide processor 110 with its field of view, or processor 110 may access data regarding the field of view from memory 140. Based on the field of view data, the distance data, and the data regarding the horizontal motion of the object, processor 110 may be configured to calculate an actual distance travelled by the object. For example, processor 110 may be configured to determine a total length covered by a field of view at a particular distance, and multiply the total length by the percentage of the field of view that the object covered when moving horizontally. Processor 110 may be configured to divide the actual distance travelled by the first time period to determine a speed of the object v1.
  • As a second example, processor 110 may determine that the distance to the object equals z2 and that the object has moved horizontally across a first portion of the field of view of image sensor 120 by an amount x2 during a first time period. As can be seen in FIG. 7, z2 may be a longer distance than z1, and x2 may be an equal portion of the field of view of image sensor 120 as x1. Using the same technique described above, processor 110 may determine that the speed of the object associated with z2 and x2 is v2. Because z2 is greater than z1, in this example v2 is greater than v1. Thus, processor 110 may be configured to determine a speed associated with a gesture.
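  • The calculation described in the two examples above can be written out directly: the width covered by the field of view at distance z is 2·z·tan(fov/2), and multiplying that width by the fraction of the frame the object crossed gives the real distance travelled. The numbers below are illustrative only and are not taken from the specification.
```python
import math

def object_speed(z_cm, fov_degrees, fraction_crossed, dt_seconds):
    """Speed (cm/s) of an object that crossed a fraction of the field of view at distance z."""
    view_width_cm = 2.0 * z_cm * math.tan(math.radians(fov_degrees) / 2.0)
    return (view_width_cm * fraction_crossed) / dt_seconds

# The same 40% sweep of a 60-degree field of view corresponds to a faster hand
# when the hand is farther away (z2 > z1 implies v2 > v1, consistent with FIG. 7).
v1 = object_speed(z_cm=20, fov_degrees=60, fraction_crossed=0.4, dt_seconds=0.5)
v2 = object_speed(z_cm=40, fov_degrees=60, fraction_crossed=0.4, dt_seconds=0.5)
```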
  • For example, as described above, processor 110 may be configured to determine at least one of a left-to-right gesture and right-to-left gesture using the three-dimensional information. In some embodiments, in a left-to-right gesture and right-to-left gesture, the object is a hand. Processor 110 may be configured to determine the speed of the object such that the speed is reflective of a hand speed.
  • Similarly, as described above, processor 110 may be configured to determine a circular gesture using the three-dimensional information. In some embodiments, in a circular gesture the object is a finger. Processor 110 may be configured to determine the speed of the object such that the speed is reflective of a finger speed. The message or command may reflect the speed of the object. For example, a relatively fast clockwise circular gesture (e.g., one rotation in less than half a second) may cause processor 110 to output a command to cause a window on display 150 to exhibit a large scroll (e.g., the window may scroll down an entire page for each rotation). On the other hand, a relatively slow clockwise circular gesture (e.g., one rotation in more than half a second) may cause processor 110 to output a command to cause a window on display 150 to exhibit a small scroll (e.g., the window may scroll one line for each rotation).
  • By using image sensor 120 and proximity sensor 130 in accordance with the disclosed embodiments, various improvements on known techniques for performing touch-free gesture detection can be achieved. For example, in some embodiments, battery and processor resources of device 100 may be conserved by implementing processes 400 and 500. As another example, in some embodiments, improved gesture detection associated with three-dimensional information may be achieved by implementing process 500.
  • Certain features which, for clarity, are described in this specification in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features which, for brevity, are described in the context of a single embodiment, may also be provided in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Particular embodiments have been described. Other embodiments are within the scope of the following claims.

Claims (40)

What is claimed is:
1. A dual sensor control device, comprising:
at least one processor for receiving information from a proximity sensor and an image sensor, the at least one processor being configured to:
receive first data from the proximity sensor while the image sensor is in a first state;
determine, using the first data, a presence of an object in proximity to the proximity sensor;
output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state;
receive second data from the image sensor in the second state; and
output at least one of a message and a command associated with the second data.
2. The device of claim 1, wherein the object is at least one of a hand and a finger, wherein the second data is reflective of a gesture, and wherein the device is configured to enable gesture control.
3. The device of claim 2, wherein the gesture is performed by at least one of a hand and a finger.
4. The device of claim 2, wherein the at least one processor is further configured to determine the gesture based, at least in part, on an analysis of the first data and the second data.
5. The device of claim 2, wherein the at least one processor is further configured to distinguish between a plurality of predefined gestures.
6. The device of claim 2, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor, and wherein the at least one of a message and a command is further associated with the one-dimensional position.
7. The device of claim 2, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor, and wherein the at least one of a message and a command is further associated with the two-dimensional position.
8. The device of claim 2, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor, and wherein the at least one of a message and a command is further associated with the one-dimensional position and the two-dimensional position.
9. The device of claim 1, wherein the at least one processor is further configured to output the signal to the image sensor to enter the second state before the object reaches a field of view of the image sensor.
10. The device of claim 9, wherein the at least one of a message and a command is further associated with the first data obtained before the object reaches the field of view of the image sensor and the second data obtained after the object reaches the field of view of the image sensor.
11. The device of claim 1, wherein the at least one processor is further configured to enable gesture control based on the determined presence of the object in proximity to the proximity sensor.
12. The device of claim 1, wherein the at least one processor is further configured to turn on a display based on the determined presence of the object in proximity to the proximity sensor.
13. The device of claim 1, wherein the first state is at least one of an off state and a first power state, and wherein the second state is at least one of an on state and a second power state higher than the first power state.
14. The device of claim 1, wherein the first state is at least one of a first sampling rate, a first resolution, and a first exposure length, and the second state is at least one of a second sampling rate different than the first sampling rate, a second resolution different than the first resolution, and a second exposure length different than the first exposure length.
15. The device of claim 1, wherein the first data is associated with a first resolution and the second data is associated with a second resolution greater than the first resolution.
16. The device of claim 1, wherein the at least one processor is further configured to determine whether the proximity sensor and the image sensor are facing a user based on information received from at least one of an accelerometer and a gyroscope.
17. The device of claim 1, wherein the at least one processor is further configured to:
determine, using data from the proximity sensor, if the object exits a field of view of the proximity sensor; and
output, based on the determined exit, a second signal to the image sensor to cause the image sensor to enter a third state.
18. The device of claim 1, wherein the at least one processor is further configured to output a second signal to the image sensor to cause the image sensor to enter a third state after a predefined condition is met.
19. The device of claim 1, wherein the at least one processor is further configured to output a second signal to the proximity sensor to cause the proximity sensor to change to a different state.
20. A non-transitory computer-readable medium comprising instructions that, when executed by at least one processor, cause the at least one processor to perform operations including:
receiving first data from a proximity sensor while an image sensor is in a first state;
determining, using the first data, a presence of an object in proximity to the proximity sensor;
outputting, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state;
receiving second data from the image sensor in the second state; and
outputting at least one of a message and a command associated with the second data.
21. A three-dimensional control device, comprising:
at least one processor for receiving information from a proximity sensor and an image sensor, the at least one processor being configured to:
receive first data, associated with a detected object, from the proximity sensor while the proximity sensor operates at a first level of power consumption, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor;
receive second data, associated with the detected object, from the image sensor while the image sensor operates at a second level of power consumption greater than the first level, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor; and
coordinate the first data and the second data to obtain three-dimensional information associated with the detected object.
22. The device of claim 21, wherein the at least one processor is further configured to output at least one of a message and a command associated with the three-dimensional information.
23. The device of claim 21, wherein the at least one processor is further configured to determine a gesture using the three-dimensional information.
24. The device of claim 23, wherein the gesture is performed by at least one of a hand and a finger.
25. The device of claim 23, wherein the at least one processor is further configured to output at least one of a message associated with the determined gesture and a command associated with the determined gesture to thereby enable gesture control.
26. The device of claim 23, wherein the at least one processor is further configured to distinguish between a zoom-in gesture and a zoom-out gesture using the three-dimensional information.
27. The device of claim 23, wherein the at least one processor is further configured to:
determine, using the first data, whether the detected object is moving toward or away from the device;
determine, using the second data, whether one or more fingers associated with the detected object are spread or closed; and
determine at least one of a zoom-in gesture and a zoom-out gesture based on the determination of whether the detected object is moving toward or away from the device and the determination of whether the one or more fingers associated with the detected object are spread or closed.
28. The device of claim 23, wherein the at least one processor is further configured to determine a circular gesture.
29. The device of claim 28, wherein the at least one processor is further configured to distinguish between a clockwise circular gesture and a counter-clockwise circular gesture.
30. The device of claim 21, wherein the first data is associated with a first resolution and the second data is associated with a second resolution that is greater than the first resolution.
31. The device of claim 21, wherein the at least one processor is further configured to determine whether the proximity sensor and the image sensor are facing a user based on information received from at least one of an accelerometer and a gyroscope.
32. The device of claim 21, wherein the at least one processor is further configured to determine, using the first data, whether the object performed a gesture parallel to a display.
33. The device of claim 24, wherein the at least one processor is further configured to:
determine, using the first data, a distance of the object from the proximity sensor;
determine, using the second data, a movement of the object; and
determine, using the distance and the movement, a speed of the object.
34. The device of claim 33, wherein the at least one processor is further configured to determine at least one of a left-to-right gesture and right-to-left gesture using the three-dimensional information, wherein the object is a hand and the speed is reflective of a hand speed.
35. The device of claim 33, wherein the at least one processor is further configured to determine a circular gesture using the three-dimensional information, wherein the object is a finger and the speed is reflective of a finger speed.
36. A non-transitory computer-readable medium comprising instructions that, when executed by at least one processor, cause the at least one processor to perform operations including:
receiving first data, associated with a detected object, from a proximity sensor while the proximity sensor operates at a first level of power consumption, wherein the first data is reflective of at least a one-dimensional position of the object relative to the proximity sensor;
receiving second data, associated with the detected object, from an image sensor while the image sensor operates at a second level of power consumption greater than the first level, wherein the second data is reflective of at least a two-dimensional position of the object relative to the image sensor; and
coordinating the first data and the second data to obtain three-dimensional information associated with the detected object.
37. The non-transitory computer-readable medium of claim 36, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform an additional operation including determining a gesture using the three-dimensional information.
38. The non-transitory computer-readable medium of claim 37, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform additional operations including:
determining, using the first data, whether the detected object is moving toward or away from the device;
determining, using the second data, whether one or more fingers associated with the detected object are spread or closed; and
determining at least one of a zoom-in gesture and a zoom-out gesture based on the determination of whether the detected object is moving toward or away from the device and the determination of whether the one or more fingers associated with the detected object are spread or closed.
39. The non-transitory computer-readable medium of claim 36, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform an additional operation including distinguishing between a clockwise circular gesture and a counter-clockwise circular gesture.
40. The non-transitory computer-readable medium of claim 36, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform an additional operation including determining, using the first data, whether the object performed a gesture parallel to a display.
US17/007,186 2013-03-14 2020-08-31 Vehicle systems and methods for interaction detection Abandoned US20210096651A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/007,186 US20210096651A1 (en) 2013-03-14 2020-08-31 Vehicle systems and methods for interaction detection

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201361782678P 2013-03-14 2013-03-14
PCT/IB2014/000908 WO2014140827A2 (en) 2013-03-14 2014-03-13 Systems and methods for proximity sensor and image sensor based gesture detection
US201514775657A 2015-09-11 2015-09-11
US15/984,999 US10761610B2 (en) 2013-03-14 2018-05-21 Vehicle systems and methods for interaction detection
US17/007,186 US20210096651A1 (en) 2013-03-14 2020-08-31 Vehicle systems and methods for interaction detection

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/984,999 Continuation US10761610B2 (en) 2013-03-14 2018-05-21 Vehicle systems and methods for interaction detection

Publications (1)

Publication Number Publication Date
US20210096651A1 true US20210096651A1 (en) 2021-04-01

Family

ID=51538205

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/775,657 Active US9977507B2 (en) 2013-03-14 2014-03-13 Systems and methods for proximity sensor and image sensor based gesture detection
US15/984,999 Active US10761610B2 (en) 2013-03-14 2018-05-21 Vehicle systems and methods for interaction detection
US17/007,186 Abandoned US20210096651A1 (en) 2013-03-14 2020-08-31 Vehicle systems and methods for interaction detection

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/775,657 Active US9977507B2 (en) 2013-03-14 2014-03-13 Systems and methods for proximity sensor and image sensor based gesture detection
US15/984,999 Active US10761610B2 (en) 2013-03-14 2018-05-21 Vehicle systems and methods for interaction detection

Country Status (3)

Country Link
US (3) US9977507B2 (en)
CN (2) CN105229582B (en)
WO (1) WO2014140827A2 (en)

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10691219B2 (en) * 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US11493998B2 (en) * 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US20230325005A1 (en) * 2012-01-17 2023-10-12 Ultrahaptics IP Two Limited Systems and methods for machine control
US9207759B1 (en) * 2012-10-08 2015-12-08 Edge3 Technologies, Inc. Method and apparatus for generating depth map from monochrome microlens and imager arrays
TW201419036A (en) * 2012-11-06 2014-05-16 Pixart Imaging Inc Sensor array and method of controlling sensing device and related electronic apparatus
US9702977B2 (en) 2013-03-15 2017-07-11 Leap Motion, Inc. Determining positional information of an object in space
KR20150068001A (en) * 2013-12-11 2015-06-19 삼성전자주식회사 Apparatus and method for recognizing gesture using sensor
US10013025B2 (en) 2014-12-11 2018-07-03 Intel Corporation Wearable device with power state control
US9949013B2 (en) * 2015-08-29 2018-04-17 Bragi GmbH Near field gesture control system and method
DE102015226153A1 (en) * 2015-12-21 2017-06-22 Bayerische Motoren Werke Aktiengesellschaft Display device and operating device
CN105718056B (en) * 2016-01-19 2019-09-10 北京小米移动软件有限公司 Gesture identification method and device
US10908694B2 (en) * 2016-02-01 2021-02-02 Microsoft Technology Licensing, Llc Object motion tracking with remote device
WO2017184172A1 (en) * 2016-04-22 2017-10-26 Hewlett-Packard Development Company, L.P. Distance determination
KR102524498B1 (en) * 2016-07-06 2023-04-24 삼성전자주식회사 The Electronic Device including the Dual Camera and Method for controlling the Dual Camera
CN107562198A (en) * 2016-11-14 2018-01-09 上海闻泰电子科技有限公司 VR apparatus control methods
JP6914065B2 (en) * 2017-03-17 2021-08-04 シャープ株式会社 Obstacle detection device, traveling device, obstacle detection system and obstacle detection method
EP3574388B1 (en) * 2017-07-25 2024-02-07 Hewlett-Packard Development Company, L.P. Determining user presence based on sensed distance
CN107635057A (en) * 2017-07-31 2018-01-26 努比亚技术有限公司 A kind of virtual reality terminal control method, terminal and computer-readable recording medium
US10599931B2 (en) * 2017-08-21 2020-03-24 2236008 Ontario Inc. Automated driving system that merges heterogenous sensor data
EP3698609B1 (en) * 2017-10-17 2021-02-17 Signify Holding B.V. Occupancy sensor calibration and occupancy estimation
CN108735128A (en) * 2018-04-19 2018-11-02 浙江大学 The self-service introduction system in museum based on gesture control
CN109033985B (en) 2018-06-29 2020-10-09 百度在线网络技术(北京)有限公司 Commodity identification processing method, device, equipment, system and storage medium
US10462425B1 (en) 2018-09-07 2019-10-29 Bank Of America Corporation Processing system for providing a teller assistant experience using enhanced reality interfaces
US10755250B2 (en) 2018-09-07 2020-08-25 Bank Of America Corporation Processing system for providing a teller assistant experience using enhanced reality interfaces
US11681415B2 (en) * 2018-10-31 2023-06-20 Apple Inc. Near-viewing notification techniques
TWI676974B (en) * 2018-11-05 2019-11-11 友達光電股份有限公司 Aquatic creature display system and aquatic creature exhibition method
WO2020100664A1 (en) * 2018-11-13 2020-05-22 ソニー株式会社 Image processing device, image processing method, and program
CN113260951A (en) * 2018-12-27 2021-08-13 美国斯耐普公司 Fade-in user interface display based on finger distance or hand proximity
US10993417B2 (en) * 2019-08-14 2021-05-04 International Business Machines Corporation Detection and management of disease outbreaks in livestock using health graph networks
CN111240472A (en) * 2019-12-31 2020-06-05 Oppo广东移动通信有限公司 Electronic equipment, gesture recognition device and method
CN111580656B (en) * 2020-05-08 2023-07-18 安徽华米信息科技有限公司 Wearable device, and control method and device thereof
CN115190239B (en) * 2022-06-27 2024-04-19 联宝(合肥)电子科技有限公司 Image acquisition method and device, electronic equipment and storage medium

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8323106B2 (en) * 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US7411575B2 (en) * 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US8952895B2 (en) * 2011-06-03 2015-02-10 Apple Inc. Motion-based device operations
US8599132B2 (en) * 2008-06-10 2013-12-03 Mediatek Inc. Methods and systems for controlling electronic devices according to signals from digital camera and sensor modules
CN102822773A (en) * 2010-03-24 2012-12-12 惠普开发有限公司 Gesture mapping for display device
WO2011148607A1 (en) * 2010-05-28 2011-12-01 パナソニック株式会社 Gesture recognition device and gesture recognition method
AU2011202049B2 (en) * 2010-07-27 2014-05-08 Wms Gaming, Inc. Virtual banks for community group bonus games
US20130215027A1 (en) * 2010-10-22 2013-08-22 Curt N. Van Lydegraf Evaluating an Input Relative to a Display
WO2012151471A2 (en) * 2011-05-05 2012-11-08 Net Power And Light Inc. Identifying gestures using multiple sensors
JP5876142B2 (en) * 2011-05-12 2016-03-02 アップル インコーポレイテッド Presence sensor
US20130009875A1 (en) * 2011-07-06 2013-01-10 Fry Walter G Three-dimensional computer interface
US20130022950A1 (en) * 2011-07-22 2013-01-24 Muniz Simas Fernando Moreira Method and system for generating behavioral studies of an individual
CN103890695B (en) * 2011-08-11 2017-10-13 视力移动技术有限公司 Interface system and method based on gesture
US9389690B2 (en) * 2012-03-01 2016-07-12 Qualcomm Incorporated Gesture detection based on information from multiple types of sensors
US10013024B2 (en) * 2012-09-28 2018-07-03 Nokia Technologies Oy Method and apparatus for interacting with a head mounted display

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110018822A1 (en) * 2009-07-21 2011-01-27 Pixart Imaging Inc. Gesture recognition method and touch system incorporating the same
US20110018795A1 (en) * 2009-07-27 2011-01-27 Samsung Electronics Co., Ltd. Method and apparatus for controlling electronic device using user interaction
US20140059489A1 (en) * 2012-08-21 2014-02-27 Amulet Technologies, Llc Rotate Gesture
US20150293595A1 (en) * 2012-10-23 2015-10-15 Lg Electronics Inc. Image display device and method for controlling same

Also Published As

Publication number Publication date
CN105229582B (en) 2020-04-28
US9977507B2 (en) 2018-05-22
US10761610B2 (en) 2020-09-01
US20180356896A1 (en) 2018-12-13
WO2014140827A3 (en) 2015-04-16
WO2014140827A2 (en) 2014-09-18
US20160026255A1 (en) 2016-01-28
CN105229582A (en) 2016-01-06
CN111475059A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
US20210096651A1 (en) Vehicle systems and methods for interaction detection
US10203764B2 (en) Systems and methods for triggering actions based on touch-free gesture detection
US10120454B2 (en) Gesture recognition control device
US20220382379A1 (en) Touch Free User Interface
JP7191714B2 (en) Systems and methods for direct pointing detection for interaction with digital devices
US20180292907A1 (en) Gesture control system and method for smart home
EP2839357B1 (en) Rapid gesture re-engagement
US20130335324A1 (en) Computer vision based two hand control of content
US20200142495A1 (en) Gesture recognition control device
TWI581127B (en) Input device and electrical device
KR102086495B1 (en) Method and device of recognizing user's movement, and electric-using apparatus using the device
Hayes et al. Device Motion via Head Tracking for Mobile Interaction
WO2013168160A1 (en) System and method for computer vision based tracking of a hand
Komuro Vision-based 3D input interface technologies
KR20140016483A (en) Touch type portable device and driving method of the same

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION