US9423879B2 - Systems and methods for controlling device operation according to hand gestures - Google Patents

Systems and methods for controlling device operation according to hand gestures

Info

Publication number
US9423879B2
Authority
US
United States
Prior art keywords
sensor
function key
control
hand gesture
sensing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/318,019
Other versions
US20150002391A1 (en)
Inventor
Chia Ming Chen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US14/318,019
Publication of US20150002391A1
Priority to US15/206,355
Application granted
Publication of US9423879B2
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/125 - Colour sequential image capture, e.g. using a colour wheel
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/61 - Control of cameras or camera modules based on recognised objects
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present inventive concepts generally relate to device control, and more particularly relate to devices, systems, and methods for controlling a machine, instrument, robot, vehicle, or other device or object according to hand gestures.
  • Operation of a machine by a human operator can be summarized as follows. First, an operator can observe or inspect the result of a previous operation. If the desired result is not yet obtained, the operation can continue, or adjustments can be made to the inputs of the next operation. This process can continue until a desired result is obtained.
  • a non-contact sensing device comprising a sensor comprising a plurality of function key sensors.
  • a function key sensor of the plurality of function key sensors has a field of view.
  • the function key sensor is constructed and arranged to detect a hand gesture at the field of view and to generate a function key control signal in response to detecting the hand gesture at the field of view.
  • a processor processes the function key control signal from the function key sensor and outputs a command to a remote apparatus in response to the processed control signal.
  • the non-contact sensing device comprises one or more cameras that recognize the hand gesture, wherein the command is generated from a combination of a function key corresponding to the function key sensor and the recognized hand gesture.
  • the sensor is a staring sensor comprising a detector array that includes a combination of the function key sensors and non-function key sensors, the staring sensor generating all image pixels of the detector array simultaneously.
  • the sensor is a scanning sensor that scans a portion of a field of view at a time.
  • the sensor includes a scan mirror that scans all function key sensors and only the non-function key sensors in the path of the scan to shorten the data acquisition time.
  • the sensor is constructed and arranged as an emissive mode sensor comprising a thermal sensor that collects thermal radiation emitted from the hand gesture.
  • the sensor is constructed and arranged as a reflective mode sensor comprising a color sensor that collects color light reflected from the hand gesture.
  • the non-contact sensing device further comprises a control spot generator that generates a control spot that is aligned with the field of view, and the function key sensor detects a target within the control spot.
  • the sensor detects the hand gesture at the control spot.
  • the non-contact sensing device further comprises a beamsplitter positioned between the sensor and the control spot generator, wherein light output from the control spot generator directed at the beamsplitter coincides with the field of view.
  • a function key corresponding to the function key sensor, distinguished from other function key sensors, is identified by positioning a ground truth target, such as a hand, at the control spot among a plurality of control spots and collecting, by the non-contact sensing device, an image of the ground truth target, wherein a pixel or group of pixels at the sensor having the highest detector output is identified as the function key.
  • control spot generator comprises a white light emitting diode (LED), a control spot generator plate having a plurality of color filters, and a lens, wherein color light is generated from the color filters when the white LED illuminates, and wherein a plurality of control spots are generated.
  • each color control spot is aligned with a field of view of a function key sensor.
  • control spot generator comprises a plurality of color LEDs, light pipes, a light pipe mounting plate, and a lens, wherein the color LEDs are placed at the input ends of light pipes and the output ends of light pipes are placed at the focal plane of the lens, thereby generating control spots of different colors that each illuminate a field of view of a different function key sensor, and wherein the light pipe plate holds the light pipes together at the focal plane of the lens.
  • the remote apparatus comprises a plurality of devices
  • the processor generates a device number for a device of the plurality of devices, each device number corresponding to a hand gesture at a designated control spot, thereby allowing a user to choose what device to operate.
  • a function key corresponding to the function key pixel becomes inactive when the hand gesture is placed in the control spot, and the function key is reactivated when the hand gesture is placed in the control spot again.
  • the sensor is a color sensor comprising color filters on a rotating wheel in front of the sensor, wherein an image of a scene is taken for each color filter, and wherein function key pixels of color images are processed to determine if a skin color spectrum is detected.
  • the sensor is a color sensor comprising a color camera, and wherein function key pixels of color images are processed to determine if a skin color spectrum is detected.
  • the processor distinguishes the function key sensor from other sensors of the plurality of sensors based on the function key positions stored in the processor during a function key identification calibration process.
  • the non-contact sensing device further comprises a head mounted display collocated with the sensor, the display providing visual information regarding an operation of the remote apparatus.
  • the remote apparatus is a drone
  • the non-contact sensing device is constructed and arranged to control the operation of the drone
  • the head mounted display displays a combination of flight information, onboard camera images, and other information regarding the operation of the drone.
  • a camera captures image data corresponding to the hand gesture and the processor converts the captured image data into a cursor command signal that controls a cursor at a display.
  • a hand gesture control system comprising: a control spot generator that forms a control spot at a surface; a sensor that senses the presence of a hand in the control spot; at least one hand gesture sensor that provides a field of view for capturing images of a hand gesture at the control spot; a processor that converts the captured images of the hand gesture into a command signal; and a transmitter that outputs the command signal to an apparatus that translates the command signal to an action performed by the apparatus.
  • the sensor is at least one of a color sensor or a thermal sensor.
  • the hand gesture control system further comprises a display for operating the apparatus, and a plurality of applications that are displayed on the display and activated in response to the command signal corresponding to the hand gesture.
  • the sensor comprises a visible-thermal dual band camera or multiple cameras for recognizing the hand gesture.
  • the processor converts the captured images of the hand gesture into a cursor command signal that controls a cursor at a display.
  • the processor converts the captured images of the hand gesture into a joystick command signal that controls the apparatus without interaction with the display.
  • the hand gesture control system is constructed and arranged for mounting to a ceiling, a mount, or a head band.
  • a method for providing non-contact switching and controlling an apparatus comprising: sensing a hand signal in the field of view of a function key sensor of an imaging sensor lit by color light from a control spot generator; capturing hand gesture images with one or more cameras at a control spot; issuing, by a processor, a control command based on the assigned task of the function key sensor, the assigned task of the hand gesture, or both; sampling the image area to form an image pixel; converting the image pixel to a function key pixel; and controlling an apparatus by blocking a field of view of the function key pixel.
  • a diffuser-based control system comprising: a diffuser for smearing background details while preserving details of a target near it; an imaging sensor for capturing hand gestures; one or more function key sensors constructed from selected pixels of the imaging sensor; a light source for illuminating the target; control spots for identifying fields of view of the function key sensors; a processor for processing a combination of hand gesture images and function key sensor signals and converting them into commands; and a transceiver for sending commands to and receiving information from the controlled apparatus.
  • FIG. 1 is a flowchart illustrating an operation requiring manual control.
  • FIG. 2A is a flowchart illustrating the operation of an apparatus, in accordance with some embodiments.
  • FIG. 2B is a block diagram illustrating an operation of a hand gesture control system, in accordance with some embodiments.
  • FIGS. 2C and 2D are views of a relationship between a hand gesture and a cursor and corresponding display views, in accordance with some embodiments.
  • FIG. 3A is a diagram illustrating the operation of an apparatus controlled by a hand gesture control module that provides an interaction between a hand gesture and a wearable computer display, in accordance with some embodiments.
  • FIGS. 3B and 3C are diagrams illustrating views of a wearable computer display, in accordance with some embodiments.
  • FIGS. 4A to 4C illustrate different hand gestures and corresponding commands for performing cursor operations, in accordance with embodiments.
  • FIGS. 5A to 5F illustrate different hand gestures and corresponding commands for performing joystick operations, in accordance with embodiments.
  • FIG. 6 is a diagram illustrating a hand gesture control module remotely controlling a light emitting diode (LED) lamp, in accordance with some embodiments.
  • FIG. 7 is a diagram illustrating a hand gesture control module remotely controlling a television set, in accordance with some embodiments.
  • FIG. 8 is a block diagram of a hand gesture control module, in accordance with some embodiments.
  • FIG. 9 is a diagram of a flashlight camera coupled to a gimbal of a beam steering mechanism, in accordance with some embodiments.
  • FIG. 10 is a diagram of a dual-axis pitch-roll gimbal of a beam steering mechanism, in accordance with some embodiments.
  • FIG. 11A is a top view of a visible-thermal dual-band flashlight camera, in accordance with some embodiments.
  • FIG. 11B is a side view of the visible-thermal dual-band flashlight camera of FIG. 11A .
  • FIG. 12 is a diagram of a hand gesture control module, in accordance with some embodiments.
  • FIG. 13 is a diagram of a control spot generator, in accordance with some embodiments.
  • FIG. 14 is a view of a visible-thermal dual-band camera of a hand gesture sensor, in accordance with some embodiments.
  • FIGS. 15A-15B are views of various headsets, each including a three-dimensional hand gesture control module and a predetermined number of cameras, in accordance with some embodiments.
  • FIGS. 15C and 15D are views of a control module outputting color control spots at two different heights before and after a hand gesture is positioned, respectively, in accordance with some embodiments.
  • FIG. 16 is a view illustrating an operation of elements of a non-contact switch system, in accordance with some embodiments.
  • FIG. 17 is a view of a function key of a non-contact switch, in accordance with some embodiments.
  • FIG. 18 is an illustration of multiple non-contact keys constructed from the pixels of a sensor image, in accordance with some embodiments.
  • FIGS. 19A and 19B are views of a staring-type non-contact sensor and a scanning-type non-contact sensor, respectively, in accordance with some embodiments.
  • FIGS. 19C and 19D are views of a thermal sensor and a skin color sensor, respectively, in accordance with some embodiments.
  • FIG. 20 is a view of a system 2000 for aligning a function key pixel and a control spot, in accordance with some embodiments.
  • FIG. 21A is an illustration of a parallel scanning configuration for a line detector array, in accordance with some embodiments.
  • FIG. 21B is an illustration of a scanning configuration for a single element non-contact sensor, in accordance with some embodiments.
  • FIG. 21C is an illustration of a fast scanning configuration for a single element non-contact sensor, in accordance with some embodiments.
  • FIG. 22 is a view of a control spot generator, in accordance with some embodiments.
  • FIG. 23 is a view of a control spot generator, in accordance with some embodiments.
  • FIG. 24 is a view of a non-contact switch including a scanning non-contact sensor, in accordance with some embodiments.
  • FIG. 25 is a view of a non-contact switch including a staring non-contact sensor, in accordance with some embodiments.
  • FIG. 26 is an illustration of an apparatus controlled by a non-contact switch, in accordance with some embodiments.
  • FIG. 27 is a view of a non-contact switch including a camera, in accordance with some embodiments.
  • FIG. 28 is an illustration of multiple devices controlled by a non-contact switch, in accordance with some embodiments.
  • FIG. 29 is a flowchart illustrating the relationship between elements of a non-contact switch in an operation, in accordance with some embodiments.
  • FIG. 30 is a view of control spot functions of a remote controller for a remote control vehicle, in accordance with some embodiments.
  • FIG. 31 is a view of control spot functions of a game controller for a video game console, in accordance with some embodiments.
  • FIG. 32 is a view of control spot functions for two non-contact controllers controlling an excavator, in accordance with some embodiments.
  • FIG. 33A is a side view of a non-contact controller system for controlling a drone, in accordance with some embodiments.
  • FIG. 33B is a front view of the non-contact controller system of FIG. 33A .
  • FIG. 33C is a view of a drone controlled by the non-contact controller system of FIGS. 33A and 33B .
  • FIG. 34 is another illustration of control spot functions of a non-contact controller system for controlling a drone, in accordance with some embodiments.
  • FIG. 35A is an image generated from an onboard camera of the drone of FIG. 33C when the controller system is configured for a manual mode of operation, in accordance with some embodiments.
  • FIG. 35B is an image generated from the onboard camera of the drone of FIG. 33C when the controller system is configured for an automatic mode of operation, in accordance with some embodiments.
  • FIG. 35C is a view of the flight track of the drone of FIG. 33C .
  • FIG. 36 is an illustration of a controller system controlling an onboard camera, in accordance with some embodiments.
  • FIG. 37 is an illustration of a diffuser scattering light, in accordance with some embodiments.
  • FIGS. 38A-D are images generated when a diffuser is between a camera and an object, in accordance with some embodiments.
  • FIG. 39 is a view of elements of a switch having a diffuser, in accordance with some embodiments.
  • FIG. 40A is a top view of a switch having a diffuser, in accordance with some embodiments.
  • FIG. 40B is a side view of the switch of FIG. 40A .
  • FIG. 41 is an illustration of an operator using a diffuser-based controller, in accordance with some embodiments.
  • FIG. 42 is a set of views of hand gestures representing combination lock integers, in accordance with some embodiments.
  • FIG. 43 is an illustration of a diffuser-based hand gesture lock mechanism, in accordance with some embodiments.
  • FIG. 1 is a flowchart illustrating an operation requiring manual control of a device.
  • a human operator 12 participates in an operation 18 requiring manual control 14 of a device 16 , such as a machine, instrument, robot, vehicle, or other device or object.
  • Manual control 14 can include direct manual control of the apparatus 16 , for example, the operator 12 sitting in a car and turning the car in a desired direction by way of a steering wheel in the car.
  • Manual control 14 can alternatively include indirect control, or remote control, of the apparatus 16 , for example, the operator 12 remotely controlling a drone by way of a joystick in communication with a computer console.
  • a human observation 20 may establish that adjustments are required regarding the control of the apparatus 16 .
  • the human operator 12 may determine that a joystick must be moved in a different direction as part of the operation 18 in response to human observation.
  • when seated in a vehicle, the operator may have to sit in a cramped space and repeat laborious hand and foot motions for hours when performing operations, which can be ergonomically hazardous. In some applications, the operator may be physically close to a hazardous operation area.
  • Remote controls may be used to address these problems. While remote controls have advantages in some situations because they are inexpensive and easily constructed, they add weight and require repeated hand motions, for example, when using a joystick or computer mouse to move a cursor that translates to a movement of the apparatus.
  • Hand gesture control mechanisms do not have extra weight, since hand gesture motions are natural motions of a user's hands and fingers.
  • hand gestures can be used to control a light emitting diode (LED) lamp or related light-emitting device.
  • the hand gesture recognition can be accomplished by a visible camera in conjunction with either a thermal sensor or a radiometric skin detection sensor.
  • a beam steering mechanism can steer the illumination spot and control spot generated by the device as well as the field of view (FOV) of the sensors and visible camera according to the user's hand gestures.
  • embodiments of the present inventive concepts include systems and methods for permitting a hand gesture to be used to remotely control an apparatus such as a mechanical device, instrument, machine, robot, gaming console, and/or other apparatus capable of receiving and processing a signal output in response to the hand gesture.
  • the system generates and outputs a control spot so that a user can determine where to position a hand or related object. In doing so, the user can make a hand gesture prior to or at the control spot to control the apparatus. Accordingly, the hand gesture can represent a command.
  • a hand gesture control module can replace a control panel, joystick, computer mouse, or other conventional peripheral device, or direct manual action commonly used to communicate with one or more machines, robots, devices, instruments, and game consoles.
  • a hand gesture control module can complement a control panel, joystick, computer mouse, or other conventional peripheral device, or direct manual action commonly used to communicate with one or more machines, robots, devices, instruments, and game consoles.
  • FIG. 2A is a flowchart illustrating the operation of an apparatus 16 , in accordance with some embodiments.
  • the apparatus 16 can include an instrument, robot, vehicle device such as a crane shown in FIG. 3 , gaming console, and/or other mechanical device that can receive and process an electronic, optical, RF, or other control signal 23 output to the apparatus in response to a hand gesture or the like performed to control a movement or other action of the apparatus.
  • a control signal 23 is generated by a hand gesture control system 30 instead of conventional manual control 14 .
  • a human observation decision diamond 22 can be performed by human observation with or without a wearable computer, such as a Google Glass computer.
  • the wearable computer, preferably with an optical head-mounted display (OHMD), can be added to monitor an operation and assist in observing the operation and enhancing the hand gesture functions during feedback from the operation 18 performed by the apparatus 16 .
  • a wearable computer is not part of the system.
  • Referring to FIG. 2B , the flowchart of FIG. 2A can be implemented, whereby a hand gesture 71 can be used to control a movement of a cursor 72 in some embodiments, for example, between different displayed icons, buttons, windows, or other display elements known to those of ordinary skill in the art.
  • the cursor 72 is shown as an arrow in the display 44 in FIG. 2C .
  • FIG. 2D illustrates a hand gesture in an image grid of pixels 78 .
  • the pixel location 71 a of the hand gesture in the image 78 is known.
  • the pixel location of the hand gesture 71 is converted into a pixel position of the cursor 72 in the display 44 .
  • the cursor 72 can move to a new position from a current position.
  • FIG. 2B illustrates the process of moving the cursor 72 by the hand gesture 71 .
  • Image 78 a corresponding to the hand gesture is captured.
  • the location 71 a of the hand gesture is extracted from the image 78 a .
  • the location 71 a is converted into a cursor position 72 a in the display 44 .
  • the cursor 72 moves to the new position from the current position. The process repeats until the hand gesture 71 stops moving.
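As a rough illustration of the loop just described (capture an image, extract the hand location, convert it to a cursor position, move the cursor), the following Python sketch can be considered. The helper callables capture_image(), find_hand_location(), and move_cursor() are hypothetical placeholders for the sensor, gesture-extraction, and display interfaces, and the image and display sizes are assumptions for illustration only.

```python
# Illustrative sketch of the FIG. 2B loop: capture an image, locate the hand
# gesture, scale its pixel location to a cursor position, and move the cursor.
# capture_image(), find_hand_location(), and move_cursor() are hypothetical
# placeholders, not interfaces defined by the patent.

IMAGE_SIZE = (400, 800)    # assumed camera image size (rows, cols)
DISPLAY_SIZE = (200, 400)  # assumed display size (rows, cols)

def to_cursor_position(hand_ij, image_size=IMAGE_SIZE, display_size=DISPLAY_SIZE):
    """Scale a hand-gesture pixel location (i, j) to a cursor position."""
    i, j = hand_ij
    return (i * display_size[0] // image_size[0],
            j * display_size[1] // image_size[1])

def track_hand_to_cursor(capture_image, find_hand_location, move_cursor):
    """Repeat the capture-extract-convert-move cycle until the hand stops."""
    last_position = None
    while True:
        image = capture_image()               # image 78a of the hand gesture
        hand_ij = find_hand_location(image)   # location 71a in the image
        if hand_ij is None:                   # hand gesture no longer detected
            break
        cursor = to_cursor_position(hand_ij)  # cursor position 72a
        if cursor != last_position:
            move_cursor(cursor)               # cursor 72 moves to the new position
            last_position = cursor
```

With the assumed sizes above, an image twice as large as the display maps hand position (200, 400) to cursor position (100, 200), matching the scaling example given below.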
  • FIG. 3A is a diagram illustrating the operation of an apparatus 16 controlled by a hand gesture control system 30 that provides an interaction between a hand gesture and a wearable computer display 44 , in accordance with some embodiments.
  • the apparatus 16 is a crane.
  • the inventive concepts are not limited thereto, and can be implemented for an operation of another apparatus known to those of ordinary skill in the art, for example, pilotless airplanes, robots, instruments, game consoles, and so on.
  • the hand gesture control system 30 can be mounted on a permanent platform such as ceilings, walls, or fixtures.
  • the hand gesture control system 30 can be mounted to temporary platforms such as tripods, poles, and other fixtures for field operations.
  • a hand gesture control module can be mounted on a head set worn by the operator.
  • a crane operator can place a hand in an illuminated control spot 46 and make a hand gesture.
  • the control spot 46 can be generated by a light source, for example, an LED light, that illuminates a surface with the control spot 46 .
  • a single control spot 46 is employed.
  • multiple control spots are employed.
  • the control spot 46 is a color in the visible spectrum.
  • the control spot 46 can be produced by a filter, a light pipe, a control spot LED, or a combination thereof, for example, described herein.
  • the hand gesture can be presented as a cursor 72 or the like at the display 44 .
  • the image size corresponding to the hand gesture and the display size corresponding to the cursor can be scaled. For example, if the image is twice the size of the display, then a hand gesture at image position (i, j) = (200, 400) corresponds to cursor position (100, 200) in the display 44 .
  • When the hand gesture moves over a control spot generated by the hand gesture control system 30 , the cursor also moves in the same direction.
  • the display and camera image are preferably aligned so that when the hand gesture moves in the horizontal direction in the image, the cursor also moves in the horizontal direction at the display 44 .
  • the operator can choose the apparatus to operate on by selecting, e.g., double clicking, the icon 51 corresponding to the apparatus 16 , for example, the crane.
  • the clicking is performed by hitting the index finger with the thumb, i.e., a motion whereby the user snaps the fingers together, as shown in FIG. 4(B) .
  • Other hand gestures can translate to different cursor functions. For example, holding down the cursor is accomplished by touching the index finger with the thumb as shown in FIG. 4(C) .
  • the hand gesture control module mounted above the user or worn by the user captures an image of the hand gesture.
  • the processor analyzes the hand gesture type. If it is a cursor hand gesture, then its position is a cursor position. When the hand gesture moves, the cursor also moves to a new location. By moving the hand gesture to the upper left corner of the image, the cursor also moves to the upper left corner of the display.
  • When the cursor arrives at icon 51 , using the clicking hand gesture 70 B shown in FIG. 4(B) , the user can then access the control panel 58 of the crane.
  • One hand is shown being used for gesture control. In some embodiments, two hands can be placed in the control spot for hand gesture control, for example, to enhance hand gesture control capability permitting additional hand gestures to be processed.
  • FIGS. 3B and 3C are diagrams illustrating views of a wearable computer display 44 , in accordance with some embodiments.
  • a computer application screen 44 A displays icons of various devices, robots, equipment, instruments, vehicles, games, and so on. Instead of using a mouse or the like to move a cursor over the icon, a user can activate an icon, for example, icon 55 , by performing a hand gesture. In doing so, an operation screen 44 B can be displayed.
  • the operation screen 44 B includes an instrument status sub-window 56 showing current information of the instrument, for example, device status information; an environmental sub-window 57 showing environmental information, which is also important in crane operation, for example, weather information so that the operator is aware of a rainy day; and icons of various tools for controlling the instrument, such as a button mode icon 58 and a joystick mode icon 60 .
  • hand gesture control can operate in cursor mode, shown in the selection menu display of the control panel of FIG. 3B and FIG. 3C .
  • in cursor mode, the user can click on buttons on the control panel.
  • hand gesture control can be performed in a joystick mode. In this mode, hand gestures for various motions can be used.
  • hand gesture control can be performed in both a cursor and a joystick mode.
  • FIGS. 4A to 4C illustrate different hand gestures 70 A- 70 C and corresponding commands for performing cursor operations, in accordance with embodiments.
  • a hand gesture 70 A relates to a command so that the hand gesture 70 A emulates a cursor, or otherwise performs a function of a cursor.
  • a cursor clicking function can be performed by making a hand gesture 70 B that includes hitting the index finger with the thumb as shown in FIG. 4B .
  • the hand gesture 70 B can be performed to “click” on icon 55 of FIG. 3B , and activate a program corresponding to icon 55 .
  • a function that relates to holding down the cursor is accomplished by touching the index finger with the thumb as illustrated by the hand gesture 70 C shown in FIG. 4C , for example, to drag and drop icons, windows, buttons, on a display, similar to a cursor function.
  • FIGS. 5A to 5F illustrate different hand gestures 80 A- 80 F and corresponding commands for performing joystick operations, in accordance with embodiments.
  • a hand gesture 80 A relates to a command for emulating an up motion or pitching up motion of a joystick, which translates to a motion on a computer screen of an item, character, or other element displayed on the computer screen that moves from one point at the computer screen to another point in response to the hand gesture.
  • the operator does not look at the display, but instead focuses attention on the operation at hand, as with a joystick.
  • a human response in joystick mode, i.e., using hand gestures, is faster than in a cursor mode.
  • in a joystick mode, minimum interaction occurs between the operator and the monitor.
  • the operator memorizes the hand gestures for various motions.
  • a joystick mode can therefore be faster than a cursor mode because the operator can focus attention on the operation.
  • An operation in a joystick mode may therefore be preferred by gamers or the like.
  • a hand gesture 80 B relates to a command for emulating joystick down motions, including a pitching motion.
  • a hand gesture 80 C relates to a command for stopping up and down motions, including pitching motions.
  • Circular motions, for example, for controlling a rotating motion of a steering wheel, can be obtained by making a rotating hand gesture in a circular pattern as illustrated by the hand gesture 80 D shown in FIG. 5D .
  • a linear motion can be obtained by a hand gesture motion along the horizontal plane as illustrated by the hand gesture 80 E shown in FIG. 5E .
  • the hand gesture 80 F on the lower right corner can be for triggering, for example, pulling the trigger on a joystick, gun, or other device.
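As described above for FIGS. 4A to 4C and 5A to 5F, recognized gestures map to cursor commands in cursor mode and to joystick commands in joystick mode. The sketch below is one hedged way such a mapping could be organized; the gesture labels and command names are assumptions for illustration and are not defined by the patent.

```python
# Illustrative dispatch of recognized hand gestures to cursor or joystick
# commands, loosely following FIGS. 4A-4C and 5A-5F. Labels and command names
# are assumed for illustration only.

CURSOR_GESTURES = {
    "point":       "MOVE_CURSOR",  # FIG. 4A: the hand emulates the cursor
    "thumb_tap":   "CLICK",        # FIG. 4B: thumb taps the index finger
    "thumb_touch": "HOLD",         # FIG. 4C: thumb held against the index finger
}

JOYSTICK_GESTURES = {
    "tilt_up":   "PITCH_UP",       # FIG. 5A: up or pitching-up motion
    "tilt_down": "PITCH_DOWN",     # FIG. 5B: down or pitching-down motion
    "flat":      "STOP",           # FIG. 5C: stop up and down motions
    "circle":    "ROTATE",         # FIG. 5D: steering-wheel rotation
    "sweep":     "LINEAR_MOVE",    # FIG. 5E: motion along the horizontal plane
    "trigger":   "TRIGGER",        # FIG. 5F: pulling a trigger
}

def gesture_to_command(gesture_label, mode="cursor"):
    """Map a recognized gesture label to a command for the apparatus."""
    table = CURSOR_GESTURES if mode == "cursor" else JOYSTICK_GESTURES
    return table.get(gesture_label)  # None if the gesture is not assigned
```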
  • hand gesture motions 70 , 80 are two-dimensional hand gestures.
  • the application screen 44 A can display a large number of icons, windows, buttons, or other visual representations of computer applications of various types.
  • a hand gesture control system 30 in accordance with some embodiments can be used to control many different devices using the applications corresponding to the icons 55 displayed at the application screen 44 A.
  • the same hand gesture control system 30 can be used to control a robot or other mechanical device having one or more elements that move relative to each other.
  • the same hand gesture control system 30 can also be used to control one or more other apparatuses.
  • FIG. 6 is a diagram illustrating a hand gesture control system 30 remotely controlling a light emitting diode (LED) lamp 90 , in accordance with some embodiments.
  • Individual elements of the hand gesture control system 30 and/or LED lamp 90 can be similar to or the same as those described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein.
  • the hand gesture control system 30 and the LED lamp 90 are physically separate.
  • the hand gesture control system 30 includes a signal transmitter 31 .
  • the LED lamp 90 includes a signal receiver 91 that communicates with the signal transmitter 31 via wireless communication signals, for example, radio waves, optical signals, or the like. In some embodiments, the signal can be transmitted via a fiber or Ethernet cable. Since the hand gesture control system 30 generates a control spot 46 , and since the LED lamp 90 generates an illumination spot 92 , a hand gesture can be placed under the control spot 46 to control the illumination spot 92 . If a user or other human observer wants to steer the illumination spot 92 to a desired position, he/she can adjust the hand gesture position and compare the current illumination spot 92 position to the desired position. This process can continue and be repeated until the illumination spot 92 reaches its destination.
  • One or both of the hand gesture control system 30 and the LED lamp 90 can have a beam steering mechanism (not shown), for example, described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein.
  • a monitor and corresponding processor are provided for controlling multiple LED lamps through the same hand gesture control system 30 .
  • FIG. 7 is a diagram illustrating a hand gesture control module 130 remotely controlling a television set 102 , in accordance with some embodiments.
  • the control module 130 includes a transmitter 131 that exchanges communication signals with a receiver 103 at the television set 102 , for example, wireless communication signals such as radio waves, optical signals, or the like.
  • An example of a communication signal is a control signal similar or the same as control signal 23 described with reference to FIG. 2A ,
  • Hand gestures 70 , 80 in a control spot 46 generated by the control module 130 can replace a remote control device, or the like, and be used to access a selection menu or the like on the television set 102 , which can be achieved by cursor or joystick hand gestures described herein.
  • FIG. 8 is a block diagram of a hand gesture control system 30 , in accordance with some embodiments.
  • the hand gesture control system 30 comprises a beam steering mechanism 202 , a computation processor 204 , a transceiver 206 , a hand gesture sensor 208 , a control spot generator 210 , and a wearable computer display 212 , some or all of which can be co-located under a same housing as a single unit.
  • the control spot generator 210 generates a control spot at a region at or proximal to that where hand gestures will be sensed.
  • the control spot is sufficiently large to surround at least one human hand, for example, 12 inches in diameter.
  • the control spot is small, for example, about 1 inch in diameter.
  • the control spot generator 210 can include filters, light pipes, LEDs, or a combination thereof for generating a control spot, which can be the same as or similar to that described in U.S. patent Ser. No. 13/826,177 incorporated herein by reference above.
  • the control spot generator 210 comprises one or more LEDs 602 or other light source and a lens 604 .
  • An LED 602 can comprise narrow-beam optics for generating a narrow light beam at the lens 604 so that its diameter is equal to or smaller than the aperture diameter of the lens.
  • the control spot generator 210 can further include a heat sink 606 for dissipating heat generated by the LEDs 602 .
  • other sources that emit light can equally apply.
  • the light output from the LEDs 602 or other light source can be in the visible light spectrum, or other light spectrum.
  • the hand gesture sensor 208 captures hand gesture images taken from a generated control spot, which can be the same as or similar to that described in U.S. patent Ser. No. 13/826,177 incorporated herein by reference above.
  • the hand gesture sensor 208 comprises one or multiple thermal cameras, which capture thermal hand gesture images. In some embodiments, the hand gesture sensor comprises one or multiple visible cameras, which capture only visible hand gesture images. In some embodiments, the hand gesture sensor comprises one or multiple visible-thermal dual-band cameras, for example, as illustrated in FIG. 14 .
  • a visible, thermal dual-band camera 700 comprises a visible camera 710 , a thermal camera 720 , and an infrared window 702 that reflects visible light into the visible camera 710 and transmits thermal light to the thermal camera 720 .
  • the thermal camera 720 can include an infrared focal plane detector array 722 and an optical element such as an infrared lens 724 or the like.
  • the visible camera 710 can include a visible FPA or the like and a visible lens 714 or related optical element.
  • a visible-thermal dual-band flashlight camera 400 comprises an infrared lens 402 shared by a visible FPA detector 426 and a thermopile FPA or detector 428 , and a three-face mirror 404 .
  • the visible, thermal dual-band camera 400 captures both visible and thermal images of a hand gesture.
  • the visible FPA 426 and the thermal FPA 428 can respond to different wavelengths.
  • the thermal FPA 428 can respond to wavelengths corresponding to emitted thermal light of a hand making a gesture, while the visible FPA 426 can respond to wavelengths of light of the hand in the illuminated control spot.
  • a high signal-to-noise ratio thermal image can distinguish a hand gesture from a background, and therefore be used to extract the hand gesture.
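A minimal sketch of how a high signal-to-noise thermal image could be thresholded to separate the warm hand from a cooler background follows; the skin-temperature window, NumPy usage, and synthetic frame are assumptions for illustration and not the patent's stated processing.

```python
import numpy as np

# Illustrative hand extraction from a thermal image: pixels whose apparent
# temperature falls inside an assumed skin-temperature window are kept as the
# hand gesture; everything else is treated as background.

SKIN_TEMP_RANGE_C = (28.0, 38.0)  # assumed apparent skin-temperature window

def extract_hand_mask(thermal_image_c):
    """Return a boolean mask of pixels likely belonging to the hand."""
    low, high = SKIN_TEMP_RANGE_C
    return (thermal_image_c >= low) & (thermal_image_c <= high)

# Example: a synthetic 4x4 thermal frame (degrees C) with a warm "hand" region.
frame = np.array([[22.0, 22.5, 23.0, 22.0],
                  [22.0, 33.0, 34.0, 22.5],
                  [22.5, 33.5, 34.5, 23.0],
                  [22.0, 22.0, 22.5, 22.0]])
mask = extract_hand_mask(frame)  # True where the hand is detected
```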
  • the beam steering mechanism 202 can include a dual-axis gimbal or the like for steering a control spot and a field of view (FOV) provided by the hand gesture sensor 208 .
  • the beam steering mechanism 202 includes a steering mirror on a gimbal, for example, described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein.
  • the hand gesture sensor 208 and the control spot generator 210 are preferably placed in front of the mirror of the beam steering mechanism 202 .
  • the gimbal mirror steers the beam of the control spot generator 210 and the field of view of the hand gesture sensor 208 as the hand gesture moves.
  • the beam steering mechanism 202 comprises a Micro-Electro-Mechanical Systems (MEMS) mirror array as described in U.S. patent application Ser. No. 13/826,177 incorporated by reference herein.
  • the MEMS mirror array can steer the field of view of the hand gesture sensor 208 and the beam output by the control spot generator 210 as the hand gesture moves.
  • the beam steering mechanism 202 comprises two counter-rotating prism wedges as illustrated in U.S. patent application Ser. No. 13/826,177 incorporated by reference herein.
  • the hand gesture sensor 208 and the control spot generator 210 can be placed in front of the counter rotating wedge assembly. Beam steering can be performed by a combination of counter-rotation and co-rotation of the two wedge prisms.
  • the beam steering mechanism 202 comprises a dual-axis gimbal 302 .
  • the hand gesture sensor 208 and the control spot generator 210 can be coupled to a mounting plate 512 on the gimbal 302 as illustrated in FIG. 12 .
  • a flashlight camera 400 is coupled to the gimbal 302 .
  • the flashlight camera can be the same as or similar to the flashlight camera 400 described at FIGS. 11A and 11B .
  • the dual-axis gimbal 302 can be pitch-roll type as illustrated by FIG. 10 or pitch-yaw type as illustrated by described and illustrated in U.S. patent application Ser. No. 13/826,177 incorporated by reference herein.
  • the gimbal 302 can include an outer ring and an inner ring rotating relative to each other by shafts or the like.
  • One or more motors (not shown) can be mounted on the inner ring and outer ring, respectively.
  • Counterweights (not shown) can be mounted on the outer ring and inner ring, respectively, for balancing and stabilizing the gimbal 302 , and for moving the gimbal 302 in pitch, yaw, and/or roll.
  • the hand gesture sensor 208 and a control spot generator unit 210 are constructed and arranged on the mounting plate 512 so that the lines of sight (LOS) of the gesture sensor 208 and a control spot generator unit 210 are parallel. Because of the proximity to each other, the center of a control beam spot generated by the control spot generator unit 210 can coincide or nearly coincide with the center of the imaging region provided by the hand gesture sensor 208 .
  • the computer processor 204 is responsible for hand gesture recognition, tracking, beam steering, and control command signal generation.
  • the computer processor 204 can be a computer, for example, comprising a processor, a memory, and a connector between the processor and the memory and/or other elements of the hand gesture control system 30 .
  • the computer processor 204 is a DSP chip.
  • the computer processor 204 is responsible for hand gesture recognition, tracking, beam steering, control command signal generation, communicating with the wearable computer 212 , e.g., Google Glass or the like, and/or lighting control.
  • icons of applications for controlling various machines, instruments, robots, devices, and playing games can be displayed on the wearable computer display 212 or related display screen.
  • the computer processor 204 in communication with the hand gesture sensor 208 can permit hand gestures to appear as cursors, joysticks, and other symbols on the wearable computer display 212 or related display screen.
  • a cursor can be moved at the display 212 by making a hand gesture as shown in FIG. 4 in a control spot formed by the control spot generator 210 .
  • hand gesture motions can be made that emulate joystick control motions as shown in FIG. 5 .
  • Other software applications, which are stored in memory and executed by one or more processors, can equally apply. Some or all applications can be displayed at the display 212 and executed in response to hand gestures or the like, for example, as described herein.
  • the transceiver 206 of the hand gesture control module sends command signals to an operating instrument controlled according to hand gestures processed by the hand gesture control system 30 .
  • the transceiver 206 can be similar to or the same as the transmitters 31 and 131 of FIGS. 6 and 7 , respectively.
  • a receiver of the operating instrument for example can provide status information or the like regarding the instrument.
  • Information can be displayed on the wearable computer display 212 , for example, a control panel screen 44 B shown in FIG. 3C .
  • the signal and information can be sent wirelessly by radio wave or light beam.
  • the signal can be transmitted via a fiber or Ethernet cable.
  • the hand gesture sensor 208 and the control spot generator 210 are assembled into one unit.
  • the hand gesture control module 20 comprises one or more of such units.
  • the assembled unit is constructed and arranged as a visible, thermal dual-band flashlight camera, for example, described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein, and/or the dual-band flashlight camera 400 illustrated in FIGS. 11A and 11B , which can comprise an infrared lens 408 , a three-face pyramid prism mirror 404 , a visible focal plane array 426 , a thermal focal plane array 428 , and one or more color LEDs or related light source 402 .
  • a hand gesture sensor and the control spot generator are assembled under a single housing, i.e., the flashlight housing.
  • the LEDs 402 , visible FPA 426 , and thermal detector or FPA 428 can share the same lens 408 but split the aperture, and illuminate the same target. Therefore, the illumination area of the color LED 402 is also the imaging area of the visible FPA 426 .
  • the thermal FPA 428 also provides imaging in the same area as the visible FPA 426 .
  • for the thermal FPA 428 , the light source is the actual thermal emission from the target under a control spot.
  • the lens 408 is a transmissive type lens. In other embodiments, reflective Cassegrain optics are provided instead of an infrared lens.
  • the control spot generator 210 is placed next to a hand gesture sensor 208 as shown in FIG. 12 .
  • the flashlight camera functions as the hand gesture sensor.
  • FIGS. 15A-15B are views of various headsets 800 A- 800 B, respectively, each including a three-dimensional hand gesture control module and a predetermined number of cameras, in accordance with some embodiments.
  • a hand gesture sensor and a control spot generator are co-located, and mounted in the front of a platform, in particular, for 2-dimension hand gesture motion sensing.
  • two side cameras 812 A, 812 B are added to a headset 800 C for 3-dimension hand gesture motion sensing.
  • the front 806 and side cameras 802 , 812 can be visible cameras. In some embodiments, they can be visible, thermal dual-band cameras.
  • some embodiments provide for a hand gesture control module that processes three-dimensional hand gesture motions.
  • FIG. 15B includes an example of a hand gesture control module with 3 cameras, for example, two side cameras and a front camera, in accordance with some embodiments.
  • the hand gesture control modules can be worn by an operator on his/her head.
  • To detect 2-dimensional motions only one camera looking straight down at the hand gesture is needed.
  • To detect 3-dimensional motions two side cameras 802 A, 802 B are needed.
  • a hand gesture can be separated from background by using two cameras separated by a distance and a control spot generator.
  • the color control spot 9004 is at two different heights before and after a hand gesture 9005 is inserted.
  • This color control spot 9004 can be used as a reference.
  • the control spot 9004 appears in different locations in the camera images due to separation between cameras 9001 A and 9001 B.
  • the displacement of the control spot between the two images increases when the control spot is near the cameras and decreases when the control spot is further away from the cameras.
  • the control spot is projected onto the background 9003 .
  • hand gesture 9005 in the control spot 9004 can be separated from the background 9003 . This is done by separating pixels into two groups: one with large image separation and one with small image separation between two camera images.
  • the group with the large image separation belongs to the hand gesture 9005 .
  • the group with the small image separation belongs to the background 9003 .
  • Normally 3D stereoscopic imaging requires calibration of the camera system to extract the depth information.
  • the reference positions of the control spot 9004 before and after insertion of the hand gesture make the calibration unnecessary.
  • the two cameras 9001 A and 9001 B can be color cameras or multispectral cameras.
  • the control spot 9004 can be easily seen in the color band whose color matches that of the control spot.
  • the cameras 802 A and 802 B in the headset in FIG. 15B can perform some or all of this technique to separate a hand gesture from a background.
  • a control spot can be used as a reference beam for separating hand gesture from background using the fact that the parallax of the hand gesture is larger than that of the background, which is further away from the cameras.
  • the control spot appears at two different distances before and after inserting a hand into the control spot.
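The two-camera separation described above can be sketched as a simple disparity test: pixels whose left/right displacement is large (near the cameras) are assigned to the hand, and pixels whose displacement is small are assigned to the background, with the control spot's two reference displacements setting the threshold. The disparity map, threshold rule, and example values below are assumptions for illustration.

```python
import numpy as np

# Illustrative separation of a hand gesture from the background using two
# camera images and the control spot as a reference (FIGS. 15C-15D). The
# per-pixel disparity map is assumed to be computed elsewhere.

def separate_hand_from_background(disparity_map, spot_disparity_background,
                                  spot_disparity_with_hand):
    """Split pixels into hand and background groups using the control-spot
    disparities measured before and after the hand is inserted."""
    # Assumed rule: threshold halfway between the two reference disparities.
    threshold = 0.5 * (spot_disparity_background + spot_disparity_with_hand)
    hand_mask = disparity_map > threshold  # large separation: near the cameras
    return hand_mask, ~hand_mask           # small separation: background

# Example with a synthetic disparity map (pixels) and reference disparities.
disparity = np.array([[2.0, 2.1, 9.5],
                      [2.0, 9.8, 9.6],
                      [2.2, 2.1, 2.0]])
hand, background = separate_hand_from_background(disparity, 2.0, 10.0)
```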
  • thermal non-contact switches measure thermal signal changes due to an approaching warm body or object in a field of view (FOV) of a passive infrared sensor.
  • a conventional non-contact switch can comprise a thermal detector such as a pyroelectric detector and a Fresnel lens. The thermal detector is located at the focal plane of the lens.
  • a feature of the thermal non-contact switch is that as a warm body approaches the FOV of the thermal sensor, a change of signal from a cooler background to a hotter body is detected. As a result, the switch is triggered.
  • non-contact switches are limited in functionality. For example, a conventional thermal non-contact switch does not have an amplitude increase or decrease capability.
  • selected pixels from an image are detected by a non-contact sensor, and provided as function keys for a non-contact switch or controller.
  • the projections of these keys on an imaging surface can serve as non-contact buttons.
  • a user can position a hand or other object into the projection, or field of view, of an image selected as a function key, thereby activating the non-contact button without physically contacting it.
  • the image of the non-contact sensor is small because the number of assigned function keys is also small. Very little processing time is required as compared with other hand gesture processing techniques, thereby increasing the switching speed of the non-contact switch.
  • Multiple function keys can be provided, each corresponding to a function key sensor of the non-contact sensor.
  • a non-contact switch in accordance with some embodiments can be constructed and arranged as a controller.
  • color light can be used to illuminate control keys so that users can easily identify the various buttons.
  • a control spot generator can be provided to generate cones of color light.
  • a beamsplitter can be used to align the non-contact keys and the control spots.
  • a switch can include a scanning non-contact sensor with a single element detector.
  • a scanning non-contact sensor can create a partial image consisting of only function key pixels and a minimum number of non-function-key pixels, for example, as shown in FIG. 21C .
  • a fast non-contact switch can be constructed because the scan mirror has only a few positions to go to. Since a single detector is used instead of an array of detectors, the manufacturing cost is significantly reduced, especially with respect to the thermal scan sensor.
  • the scan mirror can be inexpensive too, especially for a MEMS mirror in large volume. Therefore, a fast and inexpensive non-contact switch can be constructed from a scanning non-contact sensor.
  • the scan mirror scans all function key sensors and only those non-function key sensors in the path of the scan, to shorten the data acquisition time.
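One way to read the partial-scan idea is that the scan mirror visits only the angular positions corresponding to the function-key pixels (plus whatever non-key pixels lie along the path), instead of rastering the full field of view. The sketch below assumes hypothetical point_mirror() and read_detector() hardware interfaces and made-up mirror angles.

```python
# Illustrative partial scan for a single-element scanning non-contact sensor:
# the scan mirror is driven only to stored function-key positions, so the
# acquisition samples a handful of pixels rather than a full raster.
# point_mirror() and read_detector() are hypothetical hardware interfaces.

FUNCTION_KEY_POSITIONS = {       # assumed (azimuth, elevation) angles in degrees
    "power": (0.0, 5.0),
    "up":    (2.5, 5.0),
    "down":  (2.5, -5.0),
}

def scan_function_keys(point_mirror, read_detector):
    """Return the detector output at each function-key position."""
    readings = {}
    for key, angles in FUNCTION_KEY_POSITIONS.items():
        point_mirror(*angles)            # steer the scan mirror to the key
        readings[key] = read_detector()  # one sample per function key
    return readings
```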
  • a hand gesture can be positioned in the field of view of the camera, and can therefore enhance the functions of the switch, for example, for recognizing and distinguishing hand gestures.
  • a camera can capture image data corresponding to the hand gesture and a processor can convert the captured image data into a cursor or joystick command signal that controls a cursor or joystick.
  • the function keys become inactive, such that only camera images of the hand gesture are processed.
  • a low resolution staring non-contact sensor with a small detector array can also be used to construct a fast and inexpensive non-contact switch. For staring systems, all image pixels are generated at the same time. Low resolution thermal detector arrays such as 4 ⁇ 4 or 8 ⁇ 8 are relatively inexpensive.
  • the non-contact sensor can be of thermal type or a radiometric calibrated color type.
  • the thermal type sensor uses the heat from hands, while the radiometrically calibrated color sensor uses the color of hands.
  • a hand can therefore be distinguished from another object, for example, a sheet of paper, when positioned in the field of view of the non-contact sensor.
  • a diffuser is a well-known device that scatters light in different directions. Accordingly, light from different scene contents can become mixed after engaging with a diffuser. The net result is a smearing of the image scene or other undesirable effect. The smearing is worse as the scene moves further away from the diffuser.
  • a diffuser based switch or controller can be constructed, for example, described herein.
  • an imaging sensor is positioned behind a diffuser, which can resolve images related to a hand gesture or the like only when the hand or the like is close to or touches the diffuser. By moving the hand gesture near, or touching, a selected control spot on the diffuser, a command based on the control spot is generated and sent to the device under control.
  • While a non-contact switch in accordance with some embodiments works best for remote applications, the diffuser-based switch or controller works best for close-proximity applications, and can complement or otherwise co-exist with a non-contact switch.
  • FIG. 16 is a view illustrating an operation of elements of a non-contact switch system 1600 , in accordance with some embodiments.
  • the system 1600 includes an imaging optics 1602 and a detector array 1604 .
  • the detector array 1604 comprises a plurality of image sensing detectors 1606 , arranged in rows and columns at the array 1604 , which incorporate an array of pixels.
  • Each detector 1606 , also referred to as a function key sensor, collects incident light from a small region of an imaging area 1608 , or ground sample A.
  • Each function key sensor at the detector array 1604 contains a light sensitive photo diode or the like for measuring light, thereby recording an image of the ground sample. Accordingly, the function key sensor can be identified from or formed according to selected pixels of the array 1604 .
  • the output signals produced by the pixels are read out, for example, one row at a time, to form an image.
  • the captured images can be output as an output image 1612 , for example, to a display.
  • the solid angle extending from detector 1606 and imaging optics 1602 is referred to as an instantaneous field of view (IFOV).
  • the projection angle of detector 1606 to ground sample A through the imaging optics can be referred to as an IFOV of detector 1606 .
  • Detector 1606 collects only light within this IFOV.
  • the output of detector 1606 or pixel A′ is the image of ground sample A.
  • the volume within this IFOV can be used as a function key of a non-contact switch in some embodiments.
  • detection of a target, such as a hand, within this IFOV can be assigned a specific meaning or function in some embodiments. Target detection only occurs when the signal level of pixel A′ falls within a certain range. For example, a thermal signal from a hot soldering iron or a book at ambient temperature is not interpreted as a target signal because it is either too high or too low.
  • FIG. 17 is a view of a function key 1708 of a non-contact switch, in accordance with some embodiments. In describing FIG. 17 , reference is made to elements of the non-contact switch system 1600 of FIG. 16 .
  • pixel A′ of the system 1600 shown in FIG. 16 is identified as a function key for the non-contact switch.
  • a collection of pixels can be used as a function key.
  • Detector 1606 can sense an object, such as a hand, carrying a signal entering the IFOV of the image, or ground sample, selected as the function key 1708.
  • the hand can block the IFOV provided by a function key pixel sensor 1706 to activate the function key 1708 .
  • the imaging sensor in an embodiment can include multiple function keys.
  • the function key and the imaging optics form a function key sensor in some embodiments.
  • the imaging sensor therefore can include multiple function key sensors.
  • FIG. 18 is an illustration of multiple non-contact keys constructed from the pixels of a sensor image 1800 , in accordance with some embodiments.
  • some function keys can be configured for switching functions, for example, function key control signals generated by one or more function key pixel sensors 1706 of FIG. 17 , while other function keys can be configured to permit control-related functions to be performed.
  • function keys are identified according to a calibration procedure in FIG. 20 .
  • Non-contact function keys are separated from each other at an image area to avoid unintentionally activating multiple keys at the same time.
  • a non-contact switch in accordance with some embodiments can provide a sufficient number of keys 1808 to perform both switching and controlling functions.
  • images captured at a non-contact sensor have a low resolution comprising a small number of pixels. Accordingly, the collection time and processing time are short and the cost is low.
  • a non-contact switch including non-contact sensors can therefore be fast and inexpensive.
  • images of a non-contact sensor can be of a high resolution.
  • a function key is identified by inserting a ground truth target such as a hand into a control spot and collecting an image of the ground truth target.
  • a pixel or group of pixels with a highest detector output is identified as the function key.
  • Other function keys can be identified in the same way. The positions of the function keys can be stored in the processor. Control functions of one or more devices under control can be assigned to function keys in the processor prior to a control operation.
  • FIGS. 19A and 19B are views of a staring-type non-contact sensor 1900 and a scanning-type non-contact sensor 1940 , in accordance with some embodiments.
  • the staring non-contact sensor 1900 comprises imaging optics 1902 and a two-dimensional detector array 1904, or focal plane detector array.
  • the detector array 1904 can be the same as or similar to the detector array 1604 of FIG. 16 .
  • a full image, for example, of the imaging area 1608 of FIG. 16 can be formed by all detectors in the two-dimensional detector array 1904.
  • an image can be created by focusing light received from the scene onto individual detectors of the two-dimensional detector array 1904.
  • the scanning non-contact sensor 1940 comprises imaging optics 1944, a scan mirror 1942, and a line of detectors or a single element detector 1943.
  • the scan mirror 1942 can be configured for one axis for a line detector array and for two axes for a single element detector.
  • the scan mirror 1942 is a MEMS type or the like.
  • the scan mirror 1942 is a piezoelectric type.
  • the scan mirror 1942 is an electromagnetic type.
  • the scan mirror 1942 is a mechanical type.
  • the scanning non-contact sensor 1940 creates an image by sequentially scanning and focusing light from various parts of the scene onto the detector 1943, i.e., a single detector or a one-dimensional array of detectors.
  • the scan mirror 1942 scans a portion of the field of view (FOV) of the sensor 1940 at a time, generating a partial image. It continues to scan until the full FOV is covered. The full image is then created.
  • the scan mirror 1942 scans one IFOV corresponding to one pixel at a time.
  • the full image is the output of the same detector looking in different directions within the sensor FOV at different times.
  • in FIG. 21A, the scan mirror scans one IFOV for a column of detectors at a time. It continues from left to right until the full image is obtained.
  • in FIG. 21B, the scan mirror scans the FOV from left to right and right to left in a raster pattern until the full image is obtained.
  • FIGS. 19C and 19D are views of a thermal sensor 1950 and a skin color sensor 1960 , respectively, in accordance with some embodiments.
  • the sensing mode of a non-contact sensor can be emissive.
  • Thermal radiation emitted from a body object, for example, a human hand is collected by the thermal sensor 1950 , which can be staring or scanning type.
  • when the thermal non-contact sensor 1950 is part of a staring system, the sensor 1950 comprises thermal imaging optics 1952 and a detector array 1954.
  • when the thermal non-contact sensor 1950 is part of a scanning system, the sensor 1950 comprises imaging optics, a scan mirror, and a detector line array or a single detector, for example, similar to the sensor 1940 of FIG. 19B.
  • the sensing mode can be reflective.
  • Color light reflected off the skin of a body object is collected by the skin color sensor 1960, which can be of a scanning or staring type.
  • a skin color non-contact sensor 1960 comprises imaging optics 1962, color filters (not shown), and a detector array 1964 for a staring system. It comprises imaging optics, color filters, a scan mirror, and a detector array in a scanning system.
  • the skin color non-contact sensor 1960 can include other elements and functions described in U.S. patent application Ser. No. 13/826,177, the entire contents of which are incorporated herein by reference.
  • FIG. 20 is a view of a system 2000 for aligning a function key pixel and a control spot, in accordance with some embodiments.
  • the system 2000 includes a spot generator 2004 and a beamsplitter 2002 . As illustrated in FIG. 20 , the beamsplitter 2002 is placed at the angular bisector between the nadir looking non-contact sensor 2006 and the horizontal pointing control spot generator 2004 , in some embodiments. The reflected light of the control spot generator 2004 at the beamsplitter 2002 coincides with the IFOV of the relevant function key.
  • the alignment of a control spot 2012 and identification of a function key is achieved by inserting a hand or other thermal emitting object into the control spot region 2012 and taking an image of the target in the control spot.
  • the pixel or group of pixels with the highest output is the non-contact key pixel or pixels for this control spot 2012.
  • the other function key pixels for other control spots can be identified in the same way. Once the function key pixels for all control spots are identified, they can be assigned to various functions in some embodiments.
  • the beamsplitter 2002 can be of a coated type or uncoated type. The coating can be applied to one or both substrate surfaces.
  • the uncoated type is usually made of infrared materials such as Si. Here the transmission and reflection are governed by Fresnel reflection.
  • FIG. 21A is an illustration of a parallel scanning configuration for a line detector array 2100 A, in accordance with some embodiments.
  • Each element represents the IFOV of a pixel.
  • the whole array in FIG. 21A is the field of view (FOV) of the sensor.
  • the IFOVs of a line array 2100A are oriented in a vertical position and scanned in one direction, for example, from left to right as shown in FIG. 21A, by a scan mirror or the like, for example, as illustrated in FIG. 19, to create a full image.
  • the array 2100 A can include a mixture of function key pixels and non-function key pixels, described herein.
  • FIG. 21B is an illustration of a scanning configuration for a single element non-contact sensor 2100 B, in accordance with some embodiments.
  • the scanning method applied in FIG. 21B is serial.
  • the IFOV of the single element detector is scanned left to right, right to left, and so on, in a serpentine flow, in a raster scanning path to create a full image. Because only a few pixels out of the whole image are assigned to function key pixels, most of the pixels in the full image are not used. Instead of creating a full image, a partial image comprising all function key pixels and a minimum number of non-function key pixels can be created in some embodiments.
  • FIG. 21C is an illustration of a fast scanning configuration for a single element non-contact sensor 2100 C, in accordance with some embodiments.
  • a scan path is formed that creates a partial image including all function key pixels and a minimum number of non-function key pixels.
  • the scan path can be any route as long as it contains all the function key pixels. Because the number of pixels that need to be scanned is small and only one detector is required, a non-contact switch that incorporates this configuration can be fast and inexpensive as compared to other configurations.
  • FIG. 22 is a view of a control spot generator 2200 , in accordance with some embodiments.
  • the control spot generator 2200 comprises a white LED 2202, a control spot generator plate 2204 with color filters 2205, and a lens 2206.
  • the lens 2206 is separated from the filter plate 2204 by a distance f.
  • Color light is generated from the color filters 2205 when the white LED illuminates them. Images of the color filters are created by the lens.
  • the control spot generator 2200 can project multiple well defined illumination control spots onto a projection surface 2207 .
  • a projection pattern on the surface 2207 can comprise blue (B), green (G), yellow (Y), red (R), and purple (P) control spots.
  • FIG. 23 is a view of a control spot generator 2300 , in accordance with some embodiments.
  • the control spot generator 2300 comprises a plurality of color LEDs 2302 , light pipes 2303 , a light pipe mounting plate 2304 , and a lens 2306 .
  • the lens 2306 is separated from the mounting plate 2304 by a distance f.
  • the light pipes 2303 include exit ports at a focal plane of the lens 2306 .
  • LEDs 2302 and corresponding light pipes 2303 can generate control spots of different colors, for example, blue (B), green (G), yellow (Y), red (R), and/or purple (P) control spots.
  • the control spot light is used to illuminate the IFOVs of function keys shown in FIG. 24-28 .
  • a control spot generator can also be constructed from color lasers as opposed to LEDs, as in FIGS. 22-23.
  • FIG. 24 is a view of a non-contact switch 2400 including a scanning non-contact sensor 2410 , in accordance with some embodiments.
  • the non-contact switch 2400 can be the same as or similar to other embodiments herein. Details thereof are therefore not repeated for brevity.
  • the non-contact switch 2400 comprises the scanning non-contact sensor 2410, a beamsplitter 2420, and a control spot generator 2424.
  • the sensor 2410 includes a detector or a line detector array 2414 and a scan mirror 2412 .
  • a scan mirror 2412 can be a one-axis mirror for a line detector array, or a two-axes mirror for a single element detector, as described herein with reference to other embodiments.
  • the scan mirror 2412 scans the FOV of the sensor continuously in the pattern shown in FIG. 21A or 21C.
  • the scanning sensor 2410 preferably scans a portion of a field of view at a time until the complete field of view is scanned.
  • The detector or detectors continuously send output light signals captured from the IFOVs of function and non-function key pixels to a processor. Light signals from various IFOVs are captured by the same detector for a single element detector, or by the same detectors for a line detector array.
  • the processor only processes signals from function key pixels. If a target is detected in a function key, then a command is generated and output to a device under control 2602 as shown in FIG. 26.
  • FIG. 25 is a view of a non-contact switch 2500 including a staring non-contact sensor 2510 , in accordance with some embodiments.
  • the switch 2500 comprises a beamsplitter 2520 , and a control spot generator 2524 .
  • the non-contact switch 2500 can be the same as or similar to other embodiments herein. Details thereof are therefore not repeated for brevity.
  • the operation of this non-contact switch is similar to that of the scanning non-contact switch, except that the light signals from various IFOVs are captured by different detectors, and that the staring sensor 2510 generates all image pixels simultaneously.
  • the function keys 2408 , 2508 are aligned with control spots 2409 , 2509 , respectively, along the paths of the IFOVs by the beamsplitter 2420 , 2520 , respectively.
  • two of the function keys 2408 , 2508 can correspond to on/off switches.
  • the R function key can function as an off switch when a hand gesture is positioned in the IFOV of the R function key/control spot.
  • the G function key can function as an on switch when a hand gesture is positioned in the IFOV of the G function key/control spot.
  • extra function keys can be used for controlling.
  • a non-contact switch can also be referred to as a non-contact controller.
  • two other function keys 2408 , 2508 can be assigned to amplitude increasing/decreasing functions.
  • the amplitude can be light level if it is a light switch, or can refer to the speed of a fan if it is a fan switch.
  • the B function key can be used to increase an amplitude of an apparatus under control when a hand gesture is positioned in the IFOV of the B function key/control spot.
  • the Y function key can be used to decrease an amplitude of an apparatus under control when a hand gesture is positioned in the IFOV of the Y function key/control spot.
  • the last function key can be assigned to an ambient background measurement function. This is especially important for a thermal non-contact sensor because thermal detectors generate more noise than other detectors. Removing the ambient background can reduce undesirable noise. In embodiments where the thermal sensor is a scanning system with a single element detector, removing the background also removes the dark current of the detector.
  • the user can verify that the IFOV only intercepts an ambient background, and that a hand or other warm targets are not placed in the IFOV.
  • FIG. 26 is an illustration of an apparatus 2602 controlled by a non-contact switch, in accordance with some embodiments.
  • the non-contact switch can be the same as or similar to other embodiments herein. Details thereof are therefore not repeated for brevity.
  • a non-contact sensor 2610 of the non-contact switch detects a hand signature in a control spot 2604 generated by a control spot generator 2624
  • the sensor 2610 outputs the information to a processor 2612 , which can be integral with or physically separate from other elements of the non-contact switch.
  • the processor 2612 can include a transceiver.
  • when the non-contact switch 2610 and the device under control 2602 are separated by a large distance, the transceivers can communicate wirelessly.
  • the processor 2612 determines which function key is in communication with a hand gesture.
  • the locations of control spots correspond to the locations of the function keys. Function keys do not have to fill the control spots completely, but must be at least partially aligned. Accordingly, the control spots can serve as function keys.
  • command C1 can be a switch on command
  • command C2 can be a switch off command
  • command C3 can be an amplitude increase command
  • command C4 can be an amplitude decrease command
  • command C5 can be an ambient background command.
  • FIG. 27 is a view of a non-contact switch 2700 including a camera or cameras 2722 , in accordance with some embodiments.
  • the camera 2722 is a visible band camera.
  • the camera 2722 is a near infrared (NIR) band camera.
  • the camera 2722 is a shortwave infrared (SWIR) band camera. Other spectrums can equally apply.
  • the camera or cameras 2722 are placed next to the spot generator. In some embodiments, the cameras 2722 can be placed in other locations.
  • the non-contact switch 2700 can include a non-contact sensor 2710 or the like, which can be the same as or similar to other embodiments herein.
  • the non-contact switch 2700 can include imaging optics 2702 , scan mirror 2712 , detector 2714 , beamsplitter 2720 , and control spot generator 2724 , which can be the same as or similar to those described in other embodiments herein.
  • the camera 2722 receives hand gesture information that can be used to enhance a control function of the non-contact switch 2700. Accordingly, multiple hand gestures can be processed for a given function key as illustrated by the examples of FIGS. 30, 31, 32, 33, 35, and 36. For example, the hand gestures can be used to control the direction of motion of a drone. In some embodiments, one can control multiple devices by using a combination of function keys and hand gestures. For example, the number of fingers can be assigned to control each of a plurality of lamps, for example, one finger for a first lamp, two fingers for a second lamp, three fingers for a third lamp, and so on. A plurality of function keys 2709 and corresponding control spots 2708 can be provided. To switch from the current device to a different device, two hand gestures at two different control spots simultaneously are needed in some embodiments. FIG. 28 provides more details.
  • FIG. 28 is an illustration of multiple devices 2802a-c (generally, 2802) controlled by a non-contact switch 2800.
  • the non-contact switch 2800 can include a non-contact sensor 2810 or the like, which can be the same as or similar to other embodiments herein.
  • the non-contact switch 2800 can include a non-contact sensor 2810, a camera 2814 or multiple cameras, a beamsplitter 2820, and a control spot generator 2824 and/or other elements that are the same as or similar to those described in other embodiments herein.
  • the cameras 2814 can be mounted next to the control spot generator 2824 or elsewhere.
  • An illumination source 2822 can provide light for the cameras.
  • a hand gesture can be positioned at different control spots to control the current device. To switch to another device, two hands at two different control spots at the same time are needed. For example, when both hands are inserted into the on and off control spots C1, C2 at the same time, a device changing signal, or selection signal, is triggered in the processor 2812.
  • the hand gesture in C2 contains a device number. The number of fingers can be used as the device number, for example.
  • a two-finger hand gesture in C2 signals switching from the current device to device 2.
  • when the processor 2812 analyzes the hand gesture image from the camera 2814, a device number is obtained. The processor 2812 then communicates directly with the selected device 2802.
  • the non-contact switch 2800 can be in a tracking mode. This mode is usually used for slow-moving control, such as non-contact mouse motion for a computer.
  • One control spot C1 can be used to activate the tracking mode using a special hand gesture in some embodiments.
  • the control spots C2-C5 are inactive. Signals from C2-C5 will not be processed.
  • the camera 2814 takes over.
  • the processor now only processes camera images and the signal from control spot C1.
  • a stop hand gesture, for example at a predetermined control spot C1, can be used to end the tracking.
  • FIG. 29 is a flowchart illustrating the relationship between elements of a non-contact switch in an operation, in accordance with some embodiments.
  • the non-contact sensor 2910 can scan the control spots 2909 for a hand heat signature or color information.
  • a hand gesture is inserted into a control spot first.
  • the thermal signal from the hand is captured by the non-contact sensor 2910 .
  • a camera 2914 can capture an image of a region of the control spot 2909. More specifically, the camera captures the hand gesture, control spot, and the background, but only the hand gesture is extracted for processing. If the heat signature or color of the hand gesture 2911 is detected in one control spot 2909, then the operation will control a current device.
  • the camera or cameras 2914 will capture images of the hand gesture.
  • the heat signature or color signal from control spot 2909 and the hand gesture information from the camera 2914 are used by the processor 2912 to generate a command.
  • control spots 2909 become inactive.
  • the processor simply stops processing the control spot information. Hand tracking can continue until a stop hand gesture is given. When tracking ends, the control spots 2909 become active again. The command is sent to the device 2902 under control. When a hand gesture is detected in another control spot 2909, the same process will be repeated. If a heat signature or color signal is detected in two control spots 2909, the on and off control spots for example, simultaneously, a device change mode is activated in the processor 2912. The number of fingers in the image of the hand gesture 2911 determines which device 2902 the operation will control. The processor 2912 now communicates only with the current device 2902. The switching or controlling only occurs at the selected device 2902.
  • a non-contact switch can be mounted on any surface, on the ceiling, the wall, or floor. In some embodiments, a non-contact switch can be mounted on the forehead, for example, by a headband shown in FIG. 15. Accordingly, a video game player can use the non-contact switch as a video game controller in some embodiments.
  • the control spots can serve as “buttons” of a video game controller. A player can play the video game by inserting his/her hand gestures into the control spots.
  • in a car racing video game, the non-contact game controller worn on the user's head can control a remote control car. Because of the simplicity of the process, the processing time is fast.
  • an inexpensive reflection panel can be used to enhance the control spot visibility with respect to such games. This may be helpful for beginner game players. As the player plays more, he/she will remember the positions of the control spots.
  • FIG. 30 is a view of control spot functions of a remote controller 3000 for a remote control vehicle, in accordance with some embodiments.
  • the remote controller 3000 includes elements of a non-contact switch in other embodiments herein. Details thereof are therefore not repeated for brevity.
  • a remote control vehicle is controlled by two non-contact controllers, in particular, a left controller 3010 A for direction control and a right controller 3010 B for speed control.
  • the control spots from the non-contact controllers 3010 A, 3010 B are projected on a surface.
  • the control spots can be distinguished from each other by color, size, or other characteristic.
  • the four control spots displayed by the left controller 3010 A are provided for direction control while the four control spots displayed by the right controller 3010 B are for speed control.
  • the hand gesture is a fist corresponding to a normal operation command
  • the left hand gesture with a thumb is for fast turning
  • the right hand gesture with a thumb is for fast acceleration or hard braking, depending on which control spot the right hand is in.
  • a predefined key can be established for associating each control spot with a particular hand gesture and corresponding command.
  • control spot A corresponds to an accelerate command
  • control spot B corresponds to a backup command
  • control spot C corresponds to a cruising command
  • control spot D corresponds to a decelerating command
  • control spot L corresponds to a left command
  • control spot P corresponds to a park command
  • control spot R corresponds to a right command.
  • FIG. 31 is a view of control spot functions for a game controller 3100 for a video game console, in accordance with some embodiments.
  • the controller 3100 includes elements of a non-contact switch in other embodiments herein. Details thereof are therefore not repeated for brevity.
  • one or multiple non-contact controller units similar to or the same as non-contact switches herein can be used as a video game controller 3110 A, 3110 B (generally, 3110 ), for example to control a martial arts video game such as the KickstarterTM program, but not limited thereto.
  • Other video games or electronic devices can equally apply.
  • the left controller 3110 A controls movements of left limbs of a martial arts character displayed in the video game.
  • the right controller 3110 B controls movements of right limbs of the martial arts character.
  • a plurality of control spots from the non-contact controllers 3110 A, 3110 B are projected on a surface.
  • the control spots can be distinguished from each other by color, size, or other characteristic.
  • the four control spots displayed by the left controller 3110 A are provided for left limb control while the four control spots displayed by the right controller 3110 B are for right limb control.
  • a predefined key can be established for associating each control spot with a particular hand gesture and corresponding command.
  • control spot P corresponds to a punch command
  • control spot K corresponds to a kick command
  • control spot B corresponds to a block command
  • control spot G corresponds to a grab command.
  • Hand gestures can be used as additional features in some embodiments. For example, a hand gesture with an extended thumb at the K control spot generates a signal instructing a sidekick to be executed by the video game character.
  • one or more non-contact switch/controllers can be used to control a mechanical apparatus.
  • the control spots of two non-contact controllers 3210 A, 3210 B (generally, 3210 ) of a controller system 3200 can be used to control an excavator or related apparatus.
  • the two controllers 3210 A, 3210 B can replace two joysticks conventionally located at an excavator.
  • the left controller 3210 A corresponding to a left joystick is for controlling the fore arm and bucket of the excavator.
  • the right controller 3210B corresponding to a right joystick is for controlling the back arm and housing movements of the excavator.
  • a plurality of control spots from the non-contact controllers 3210 A, 3210 B are projected on a surface.
  • the control spots can be distinguished from each other by color, size, or other characteristic.
  • a predefined key can be established for associating each control spot with a particular hand gesture and corresponding command.
  • control spot A1 of the right controller 3210B corresponds to a command whereby the excavator arm moves up.
  • Control spot A2 corresponds to a command whereby the excavator arm moves down.
  • Control spot A3 corresponds to a command whereby the excavator arm rotates.
  • Control spot H corresponds to a rotation of the excavator housing.
  • the excavator housing rotates to the left.
  • the excavator housing rotates to the right.
  • control spot B1 of the left controller 3210A corresponds to a command whereby the excavator bucket makes a scoop motion.
  • Control spot B2 of the left controller 3210A corresponds to a command whereby the excavator bucket makes a dump motion.
  • Control spot F1 of the left controller 3210A corresponds to a command whereby the excavator forearm is extended.
  • Control spot F2 of the left controller 3210A corresponds to a command whereby the excavator forearm is curled or retracted.
  • Hand gestures can be used to enhance the controlling functions. For example, a thumb pointing left on control spot A3 rotates the back arm to the left, while a thumb pointing to the right on control spot A3 rotates the back arm to the right.
  • a closed fist hand gesture can correspond to an instruction to remain at a current status.
  • FIG. 33A is a side view of a non-contact controller system 3300 for controlling a drone 3350 , in accordance with some embodiments.
  • FIG. 33B is a front view of the non-contact controller system 3300 of FIG. 33A .
  • FIG. 33C is a view of a drone 3350 controlled by the non-contact controller system 3300 of FIGS. 33A and 33B .
  • the system 3300 comprises two non-contact controllers 3304 A, 3304 B (generally, 3304 ) and a head mounted display 3302 such as a Google Glass wearable computer.
  • a transceiver 3306 can also be worn on the head or any part of the user's body. Communications between the non-contact controller 3300 and the drone 3350 occur via the transceiver 3306 of the controller 3300 and a transceiver on the drone 3350.
  • the non-contact controllers 3304 are used to control the drone 3350 while the head mounted display 3302 is used to display flight information, onboard camera images, and/or other information related to an operation of the drone 3350 .
  • the two non-contact controllers 3304 are mounted on the user's forehead above the display device 3302 .
  • Two non-contact controllers allow the user to have enough function keys for controlling certain devices.
  • control spot beams 3305 can be projected from the non-contact controllers 3304 in front of the user.
  • the user can position hand gestures along a path of the control spot beams.
  • the control spots 3305 can be formed on and/or about the hand gestures for controlling the drone 3350 .
  • the control spots 3305 can illuminate a surface, and a hand gesture can be made over the surface but at the control spot for controlling the drone 3350 .
  • FIG. 34 is another illustration of a non-contact controller system 3400 for controlling a drone, in accordance with some embodiments.
  • the remote controller 3400 includes elements of a non-contact switch in other embodiments herein such as the controllers 3304 of FIG. 33 . Details thereof are therefore not repeated for brevity.
  • the controllers 3304 A, 3304 B can be left and right controllers, respectively. Only the function keys illuminated by control spot light in the two controllers are shown.
  • the physical controllers can be similar to those described with reference to FIG. 27 .
  • Each controller 3304 can provide different control spot functions with respect to controlling a drone or other remote device.
  • the controller system 3400 performs two modes of operation: manual control and automatic control, which can be established by a hand gesture, described below.
  • a plurality of control spots from the non-contact controllers 3304A, 3304B are projected on a surface.
  • the control spots can be distinguished from each other by color, size, or other characteristic.
  • control spot Y of the right controller 3304 B corresponds to a command that instructs the drone 3350 to move in a yaw direction.
  • Control spot AD corresponds to a command that instructs the drone 3350 to ascend or descend.
  • Control spot F corresponds to a command that instructs the drone 3350 to move forward.
  • Control spot H corresponds to a command that instructs the drone 3350 to hover.
  • When a hand gesture having a thumb extending in a right direction is placed at control spot Y, the drone 3350 is instructed to yaw to the right. When a hand gesture having a thumb extending in a left direction is placed at control spot AD, the drone 3350 is instructed to ascend. When a hand gesture having a thumb extending in a right direction is placed at control spot AD, the drone 3350 is instructed to descend.
  • control spot P corresponds to a command that instructs the drone 3350 to pitch.
  • Control spot LT corresponds to a command that instructs the drone 3350 to land or takeoff.
  • Control spot R corresponds to a command that instructs the drone 3350 to roll.
  • Control spot T corresponds to a command that instructs the drone 3350 to toggle between automatic and manual modes and/or to change a camera configuration.
  • When a hand gesture having a thumb extending in a right direction is placed at control spot LT, the drone 3350 is instructed to take off. When a hand gesture having a thumb extending in a left direction is placed at control spot LT, the drone 3350 is instructed to land. A closed fist hand gesture can correspond to an instruction to remain at a current status. When a hand gesture having a thumb extending in a left direction is placed at control spot R, the drone 3350 is instructed to roll in the positive direction. When a hand gesture having a thumb extending in a right direction is placed at control spot R, the drone 3350 is instructed to roll in the negative direction. The direction of roll obeys the right hand rule.
  • a hand gesture with a thumb pointing right is for a manual mode.
  • a hand gesture with a thumb pointing left is for an automatic mode.
  • FIG. 35A is an image generated from an onboard camera of the drone 3350 of FIG. 33C when the controller system is configured for a manual mode of operation, in accordance with some embodiments.
  • FIG. 35B is an image generated from the onboard camera of the drone 3350 of FIG. 33C when the controller system is configured for an automatic mode of operation, in accordance with some embodiments.
  • FIG. 35C is a view of the flight track of the drone 3350 of FIG. 33C .
  • the display device 3302 can be a Google Glass screen or the like, and can include an onboard camera.
  • the image 3500 A captured by the onboard camera of a drone, for example, described herein, is shown on the top with drone information such as altitude, heading, pitch/yaw, and image mean in percentage of the camera full dynamic range.
  • Other information regarding the camera such as roll, pitch/yaw, and frame rate can also be displayed at the bottom of the display.
  • the upper left corner includes the onboard camera image with drone information.
  • other information can include a flight track and current position of the drone overlaying the map of the flight area. Flight track and altitude are pre-planned using GPS coordinates. The drone 3350 will fly according to the planned track at the prescribed altitude and return to the ground automatically.
  • the user can control the pitch, yaw (heading), roll, forward, and hovering motions of the drone 3350 by putting hand gestures in the appropriate control spots in some embodiments.
  • multiple hand gestures can be employed, for example, fist, hand with thumb pointing left, and hand with thumb pointing right.
  • the left thumb pointing gesture can be changed to right pointing gesture by rotating the arm or vice-versa.
  • the direction of the thumb is the direction of an operation. For example, if the user wants the drone 3350 to turn left from the current direction, he/she will put one hand in control spot Y with thumb pointing left for turning left and another hand on control spot R with thumb pointing right for negative roll.
  • the sign of the roll obeys the right hand rule.
  • to turn, the drone 3350 must roll slightly.
  • the limited number of simple hand gestures allows the algorithm a short processing time. In some embodiments, more complex hand gestures can be employed.
  • FIG. 36 is an illustration of a controller system 3600 controlling an onboard camera, in accordance with some embodiments.
  • the camera can be on the drone 3350 described with respect to FIG. 33C .
  • the controller 3600 includes elements of a non-contact switch in other embodiments herein. Details thereof are therefore not repeated for brevity.
  • the controllers 3604 A, 3604 B (generally, 3604 ) can be left and right controllers, respectively. Each controller 3604 can provide different control spot functions with respect to controlling an onboard camera configuration.
  • a plurality of control spots from the non-contact controllers 3604 A, 3604 B are projected on a surface.
  • the control spots can be distinguished from each other by color, size, or other characteristic.
  • control spot Y of the right controller 3604 B corresponds to a command that instructs the onboard camera to move in a yaw direction.
  • Control spot P corresponds to a command that instructs the onboard camera to pitch.
  • Control spot R corresponds to a command that instructs the onboard camera to roll.
  • Control spot FR corresponds to a command that changes a frame rate of the onboard camera.
  • control spot I corresponds to a command that instructs the camera to be inactive.
  • Control spot LT corresponds to a command that instructs the drone 3350 to land or takeoff.
  • Control spot T corresponds to a command that toggles the camera between automatic and manual modes and/or to change a camera configuration.
  • the onboard camera can be placed in a manual mode.
  • the onboard camera can be placed in an automatic mode.
  • the function keys on the left controller 3604A can become inactive when a fist is inserted into control spot T.
  • FIG. 37 is an illustration of a diffuser 3700 scattering light, in accordance with some embodiments.
  • the diffuser 3700 scatters light into different directions. The scattering can occur in the diffuser's volume or on its surfaces. It can occur in transmission or reflection. Scattering not only scatters light out of the original direction of an incident light ray, it can also scatter light into the original direction from other light rays. As shown in FIG. 37, the unscattered transmitted light rays of incident light rays I0 and I1 are I0t and I1t, respectively. The scattered transmitted light rays of incident light rays I0 and I1 are I0s and I1s, respectively.
  • a ground glass diffuser is positioned between a camera and an object, namely a U.S. map, so that the field of view of the camera is directed at a portion of the map via the diffuser.
  • the map is about 2 feet away from the diffuser.
  • the region of the map behind the diffuser shows a uniform background with no detail due to the smearing of the diffuser.
  • map details are shown as illustrated in image 3800 A of FIG. 38A .
  • in image 3800B shown in FIG. 38B, a hand touches the diffuser, whereby the image of the hand is clearly visible although a bit blurry.
  • the hand is near but does not touch the diffuser.
  • the hand is still visible but more distorted than in the image 3800 B.
  • the hand moves further away from the diffuser, whereby the hand has lost details. How fast the diffuser smears out details also depends on the density of the diffuser grid. It is faster for a denser grid and slower for a less dense grid.
  • FIG. 39 shows a layout for such a switch 3900 .
  • the switch 3900 comprises a diffuser 3904 , an imaging sensor 3906 , a light source 3902 , a processor 3912 , and the device under control 3905 .
  • the light source 3902 can illuminate a target from either side of the diffuser 3904 .
  • a light guide is usually used in conjunction with the diffuser 3904 .
  • the processor 3912 can process the hand gesture image and send a command to the device under control 3905 based on hand gesture information, for example, described in other embodiments herein.
  • two imaging sensors 3906 can be used.
  • FIG. 40A is a top view of a switch 4000 having a diffuser, in accordance with some embodiments.
  • FIG. 40B is a side view of the switch 4000 of FIG. 40A .
  • a plurality of control spots are formed by a diffuser 4004 in some embodiments. When a user's hand is placed on the control spots, this indicates that the user intends to control the device in communication with the switch 4000.
  • color control spots are placed on the diffuser 4004 .
  • the control spots can be generated by color filters 4006 coated or mounted on the surfaces of the diffuser 4004 in some embodiments. In other embodiments, the control spots can be generated by shining color lights on the control spots.
  • the control spots correspond to function key positions, for example, similar to other embodiments herein.
  • the switch 4000 can be useful for machine operation and video gaming in some embodiments.
  • the non-contact controller system 3200 in FIG. 32 can be replaced by two diffuser based controllers: one positioned at the left side of a driver's seat, the other at the right side.
  • the function keys of the switch 4000 can be the same as those of other switches or controllers described herein, except that the control spots are located on the surfaces of the diffuser 4004.
  • The same hand gestures can be used, except that imaging sensors 4010 are implemented instead of cameras.
  • the joysticks described in FIG. 32 can be replaced by diffuser based switch/controllers in some embodiments.
  • Joystick functions can be assigned to control spots and hand gestures.
  • FIG. 41 is an illustration of an operator using the diffuser-based controller 4000 of FIGS. 40A and 40B.
  • a hand gesture 4012 is positioned at the diffuser 4004 , under which one or more control spots are provided corresponding to various function key positions.
  • one of the control spots can be reserved for a tracking mode. When a user's hand is placed over this control spot, all other control spots are inactive.
  • a sensor is provided for tracking a user's hand motion. Tracking will stop when the user inserts a particular hand gesture over the same control spot.
  • Hand tracking can be used in beam steering control in some embodiments. It can also be employed as a mouse for a computer in some embodiments.
  • a combination lock uses a combination of integers to lock and unlock.
  • Hand gestures can be used to represent these integers as illustrated in FIG. 42 in some embodiments.
  • the number panel of a combination lock 4200 is also shown in FIG. 42 .
  • hand gestures can be used to lock and unlock a door or any device.
  • FIG. 43 is an illustration of a diffuser-based hand gesture lock mechanism 4300 , in accordance with some embodiments.
  • the hand gesture lock mechanism 4300 can comprise a light source 4302, an imaging sensor 4304, a diffuser 4306, and a processor 4312, which are the same as or similar to those of other switches and controllers herein. A description thereof is not repeated for brevity.
  • An electromechanical locking/unlocking mechanism 4308 is also provided. For example, if the combination is 5912, then a hand gesture of 5 fingers is placed near or on the diffuser first, followed by a "pinkyless" hand gesture, followed by an index finger hand gesture. An index and middle fingers hand gesture is the last hand gesture.
  • Processor 4312 can process the images and generate a command to the electromechanical locking/unlocking mechanism 4308 to unlock the door. The background shown in FIG. 43 is not seen by the imaging sensor because it is smeared out due to its distance from the diffuser.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Position Input By Displaying (AREA)

Abstract

Provided is a non-contact sensing device that comprises a sensor comprising a plurality of function key sensors. A function key sensor of the plurality of function key sensors has a field of view. The function key sensor is constructed and arranged to detect a hand gesture at the field of view and to generate a function key control signal in response to detecting the hand gesture at the field of view. A processor processes the function key control signal from the function key sensor and outputs a command to a remote apparatus in response to the processed control signal.

Description

RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application No. 61/840,791 filed Jun. 28, 2013, the content of which is incorporated herein by reference in its entirety. This application is related to U.S. patent application Ser. No. 13/826,177 filed on Mar. 14, 2013, U.S. Provisional Patent Application No. 61/643,535 filed on May 7, 2012, U.S. Provisional Patent Application No. 61/684,336 filed on Aug. 17, 2012, U.S. Provisional Patent Application No. 61/760,966 filed on Feb. 5, 2013, PCT Patent Application No. PCT/US13/39666 filed May 6, 2013, U.S. Provisional Patent Application No. 61/696,518 filed on Sep. 4, 2012, U.S. Provisional Patent Application No. 61/846,738 filed on Jul. 16, 2013, U.S. patent application Ser. No. 14/048,505 filed on Oct. 8, 2013, U.S. Provisional Patent Application No. 61/985,762 filed on Apr. 29, 2014, the content of each of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present inventive concepts generally relate to device control, and more particularly relate to devices, systems, and methods for controlling a machine, instrument, robot, vehicle, or other device or object according to hand gestures.
BACKGROUND
Operation of a machine by a human operator can be summarized as follows. First, an operator can observe or inspect the result of a previous operation. If the desired result is not yet obtained, the operation can continue, or adjustments can be made to the inputs of the next operation. This process can continue until a desired result is obtained.
Conventional approaches require an operator to control an operation of a machine, device, and/or instrument by way of mechanical elements in communication with the machine, device, and/or instrument, such as a joystick, steering wheel, or foot pedal.
SUMMARY
In one aspect, provided is a non-contact sensing device, comprising a sensor comprising a plurality of function key sensors. A function key sensor of the plurality of function key sensors has a field of view. The function key sensor is constructed and arranged to detect a hand gesture at the field of view and to generate a function key control signal in response to detecting the hand gesture at the field of view. A processor processes the function key control signal from the function key sensor and outputs a command to a remote apparatus in response to the processed control signal.
In some embodiments, the non-contact sensing device comprises one or more cameras that recognize the hand gesture, wherein the command is generated from a combination of a function key corresponding to the function key sensor and the recognized hand gesture.
In some embodiments, the sensor is a staring sensor comprising a detector array that includes a combination of the function key sensors and non-function key sensors, the staring sensor generating all image pixels of the detector array simultaneously.
In some embodiments, the sensor is a scanning sensor that scans a portion of a field of view at a time.
In some embodiments, the sensor includes a scan mirror that scans all function key sensors and only the non-function key sensors in the path of the scan to shorten the data acquisition time.
In some embodiments, the sensor is constructed and arranged as an emissive mode sensor comprising a thermal sensor that collects thermal radiation emitted from the hand gesture.
In some embodiments, the sensor is constructed and arranged as a reflective mode sensor comprising a color sensor that collects color light reflected from the hand gesture.
In some embodiments, the non-contact sensing device further comprises a control spot generator that generates a control spot that is aligned with the field of view, and the function key sensor detects a target within the control spot.
In some embodiments, the sensor detects the hand gesture at the control spot.
In some embodiments, the non-contact sensing device further comprises a beamsplitter positioned between the sensor and the control spot generator, wherein light output from the control spot generator directed at the beamsplitter coincides with the field of view.
In some embodiments, a function key corresponding to the function key sensor distinguished from other function key sensors is identified by positioning a ground truth target such as a hand at the control spot among a plurality of control spots and collecting by the non-contact sensing device an image of the ground truth target, wherein a pixel or group of pixels at the sensor having a highest detector output is identified as the function key.
In some embodiments, the control spot generator comprises a white light emitting diode (LED), a control spot generator plate having a plurality of color filters, and a lens, wherein color light is generated from the color filters when the white LED illuminates, and wherein a plurality of control spots are generated.
In some embodiments, each color control spot is aligned with a field of view of a function key sensor.
In some embodiments, the control spot generator comprises a plurality of color LEDs, light pipes, a light pipe mounting plate, and a lens, wherein the color LEDs are placed at the input ends of light pipes and the output ends of light pipes are placed at the focal plane of the lens, thereby generating control spots of different colors that each illuminate a field of view of a different function key sensor, and wherein the light pipe plate holds the light pipes together at the focal plane of the lens.
In some embodiments, the remote apparatus comprises a plurality of devices, and the processor generates a device number for a device of the plurality of devices, each device number corresponding to a hand gesture at a designated control spot, thereby allowing a user to choose what device to operate.
In some embodiments, a function key corresponding to the function key pixel becomes inactive when the hand gesture is placed in the control spot, and the function key is reactivated when the hand gesture is placed in the control spot.
In some embodiments, the sensor is a color sensor comprising color filters on a rotating wheel in front of the sensor, wherein an image of a scene is taken for each color filter, and wherein function key pixels of color images are processed to determine if a skin color spectrum is detected.
In some embodiments, the sensor is a color sensor comprising a color camera, and wherein function key pixels of color images are processed to determine if a skin color spectrum is detected.
In some embodiments, the processor distinguishes the function key sensor from other sensors of the plurality of sensors from the function key positions stored in the processor during a function key identification calibration process.
In some embodiments, the non-contact sensing device further comprises a head mounted display collocated with the sensor, the display providing visual information regarding an operation of the remote apparatus.
In some embodiments, the remote apparatus is a drone, the non-contact sensing device is constructed and arranged to control the operation of the drone, and the head mounted display displays a combination of flight information, onboard camera images, and other information regarding the operation of the drone.
In some embodiments, a camera captures image data corresponding to the hand gesture and the processor converts the captured image data into a cursor command signal that controls a cursor at a display.
In another aspect, provided is a hand gesture control system, comprising: a control spot generator that forms a control spot at a surface; a sensor that senses the presence of a hand in the control spot; at least one hand gesture sensor that provides a field of view for capturing images of a hand gesture at the control spot; a processor that converts the captured images of the hand gesture into a command signal; and a transmitter that outputs the command signal to an apparatus that translates the command signal to an action performed by the apparatus.
In some embodiments, the sensor is at least one of a color sensor or a thermal sensor.
In some embodiments, the hand gesture control system further comprising a display for operating the apparatus, and a plurality of applications that are displayed from the display, which are activated in response to the command signal corresponding to the hand gesture.
In some embodiments, the sensor comprises a visible-thermal dual band camera or multiple cameras for recognizing the hand gesture.
In some embodiments, the processor converts the captured images of the hand gesture into a cursor command signal that controls a cursor at a display.
In some embodiments, the processor converts the captured images of the hand gesture into a joystick command signal that controls the apparatus without interaction with the display.
In some embodiments, the hand gesture control system is constructed and arranged for mounting to a ceiling, a mount, or a head band.
In another aspect, provided is a method for providing non-contact switching and controlling an apparatus, comprising: sensing a hand signal in the field of view of a function key sensor of an imaging sensor lit up by color light of a control spot generator; capturing hand gesture images by one or multiple cameras in a control spot; issuing, by a processor, a control command based on the assigned task of the function key sensor, the assigned task of the hand gesture, or both; sampling the image area to form an image pixel; converting the image pixel to a function key pixel; and controlling an apparatus by blocking a field of view of the function key pixel.
In another aspect, provided is a diffuser based control system comprising: a diffuser for smearing background details while keeping details for a target near it; an imaging sensor for capturing hand gestures; one or more function key sensors constructed from selected pixels of the imaging sensor; a light source for illuminating the target; control spots for identifying fields of view of function key sensors; a processor for processing a combination of hand gesture images and function key sensor signals and converting them into commands; and a transceiver for sending commands to and receiving information from the control apparatus.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features and advantages of embodiments of the present inventive concepts will be apparent from the more particular description of preferred embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same elements throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the preferred embodiments.
FIG. 1 is a flowchart illustrating an operation requiring manual control.
FIG. 2A is a flowchart illustrating the operation of an apparatus, in accordance with some embodiments.
FIG. 2B is a block diagram illustrating an operation of a hand gesture control system, in accordance with some embodiments.
FIGS. 2C and 2D are views of a relationship between a hand gesture and a cursor and corresponding display views, in accordance with some embodiments.
FIG. 3A is a diagram illustrating the operation of an apparatus controlled by a hand gesture control module that provides an interaction between a hand gesture and a wearable computer display, in accordance with some embodiments.
FIGS. 3B and 3C are diagrams illustrating views of a wearable computer display, in accordance with some embodiments.
FIGS. 4A to 4C illustrate different hand gestures and corresponding commands for performing cursor operations, in accordance with embodiments.
FIGS. 5A to 5F illustrate different hand gestures and corresponding commands for performing joystick operations, in accordance with embodiments.
FIG. 6 is a diagram illustrating a hand gesture control module remotely controlling a light emitting diode (LED) lamp, in accordance with some embodiments.
FIG. 7 is a diagram illustrating a hand gesture control module remotely controlling a television set, in accordance with some embodiments.
FIG. 8 is a block diagram of a hand gesture control module, in accordance with some embodiments.
FIG. 9 is a diagram of a flashlight camera coupled to a gimbal of a beam steering mechanism, in accordance with some embodiments.
FIG. 10 is a diagram of a dual-axis pitch-roll gimbal of a beam steering mechanism, in accordance with some embodiments.
FIG. 11A is a top view of a visible-thermal dual-band flashlight camera, in accordance with some embodiments.
FIG. 11B is a side view of the visible-thermal dual-band flashlight camera of FIG. 11A.
FIG. 12 is a diagram of a hand gesture control module, in accordance with some embodiments.
FIG. 13 is a diagram of a control spot generator, in accordance with some embodiments.
FIG. 14 is a view of a visible-thermal dual-band camera of a hand gesture sensor, in accordance with some embodiments.
FIGS. 15A-15B are views of various headsets, each including a three-dimensional hand gesture control module and a predetermined number of cameras, in accordance with some embodiments.
FIGS. 15C and 15D are views of a control module outputting color control spots at two different heights before and after a hand gesture is positioned, respectively, in accordance with some embodiments.
FIG. 16 is a view illustrating an operation of elements of a non-contact switch system, in accordance with some embodiments.
FIG. 17 is a view of a function key of a non-contact switch, in accordance with some embodiments.
FIG. 18 is an illustration of multiple non-contact keys constructed from the pixels of a sensor image, in accordance with some embodiments.
FIGS. 19A and 19B are views of a staring-type non-contact sensor and a scanning-type non-contact sensor, respectively, in accordance with some embodiments.
FIGS. 19C and 19D are views of a thermal sensor and a skin color sensor, respectively, in accordance with some embodiments.
FIG. 20 is a view of a system 2000 for aligning a function key pixel and a control spot, in accordance with some embodiments.
FIG. 21A is an illustration of a parallel scanning configuration for a line detector array, in accordance with some embodiments.
FIG. 21B is an illustration of a scanning configuration for a single element non-contact sensor, in accordance with some embodiments.
FIG. 21C is an illustration of a fast scanning configuration for a single element non-contact sensor, in accordance with some embodiments.
FIG. 22 is a view of a control spot generator, in accordance with some embodiments.
FIG. 23 is a view of a control spot generator, in accordance with some embodiments.
FIG. 24 is a view of a non-contact switch including a scanning non-contact sensor, in accordance with some embodiments.
FIG. 25 is a view of a non-contact switch including a staring non-contact sensor, in accordance with some embodiments.
FIG. 26 is an illustration of an apparatus controlled by a non-contact switch, in accordance with some embodiments.
FIG. 27 is a view of a non-contact switch including a camera, in accordance with some embodiments.
FIG. 28 is an illustration of multiple devices controlled by a non-contact switch, in accordance with some embodiments.
FIG. 29 is a flowchart illustrating the relationship between elements of a non-contact switch in an operation, in accordance with some embodiments.
FIG. 30 is a view of control spot functions of a remote controller for a remote control vehicle, in accordance with some embodiments.
FIG. 31 is a view of control spot functions of a game controller for a video game console, in accordance with some embodiments.
FIG. 32 is a view of control spot functions for two non-contact controllers controlling an excavator, in accordance with some embodiments.
FIG. 33A is a side view of a non-contact controller system for controlling a drone, in accordance with some embodiments.
FIG. 33B is a front view of the non-contact controller system of FIG. 33A.
FIG. 33C is a view of a drone controlled by the non-contact controller system of FIGS. 33A and 33B.
FIG. 34 is another illustration of control spot functions of a non-contact controller system for controlling a drone, in accordance with some embodiments.
FIG. 35A is an image generated from an onboard camera of the drone of FIG. 33C when the controller system is configured for a manual mode of operation, in accordance with some embodiments.
FIG. 35B is an image generated from the onboard camera of the drone of FIG. 33C when the controller system is configured for an automatic mode of operation, in accordance with some embodiments.
FIG. 35C is a view of the flight track of the drone of FIG. 33C.
FIG. 36 is an illustration of a controller system controlling an onboard camera, in accordance with some embodiments.
FIG. 37 is an illustration of a diffuser scattering light, in accordance with some embodiments.
FIGS. 38A-D are images generated when a diffuser is between a camera and an object, in accordance with some embodiments.
FIG. 39 is a view of elements of a switch having a diffuser, in accordance with some embodiments.
FIG. 40A is a top view of a switch having a diffuser, in accordance with some embodiments.
FIG. 40B is a side view of the switch of FIG. 40A.
FIG. 41 is an illustration of an operator using a diffuser-based controller, in accordance with some embodiments.
FIG. 42 shows views of hand gestures representing combination lock integers, in accordance with some embodiments.
FIG. 43 is an illustration of a diffuser-based hand gesture lock mechanism, in accordance with some embodiments.
DETAILED DESCRIPTION OF EMBODIMENTS
The terminology used herein is for the purpose of describing particular embodiments and is not intended to be limiting of the inventive concepts. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that, although the terms first, second, third etc. may be used herein to describe various limitations, elements, components, regions, layers and/or sections, these limitations, elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one limitation, element, component, region, layer or section from another limitation, element, component, region, layer or section. Thus, a first limitation, element, component, region, layer or section discussed below could be termed a second limitation, element, component, region, layer or section without departing from the teachings of the present application.
It will be further understood that when an element is referred to as being “on” or “connected” or “coupled” to another element, it can be directly on or above, or connected or coupled to, the other element or intervening elements can be present. In contrast, when an element is referred to as being “directly on” or “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.). When an element is referred to herein as being “over” another element, it can be over or under the other element, and either directly coupled to the other element, or intervening elements may be present, or the elements may be spaced apart by a void or gap.
FIG. 1 is a flowchart illustrating an operation requiring manual control of a device.
A human operator 12 participates in an operation 18 requiring manual control 14 of a device 16, such as a machine, instrument, robot, vehicle, or other device or object. Manual control 14 can include direct manual control of the apparatus 16, for example, the operator 12 sitting in a car and turning the car in a desired direction by way of a steering wheel in the car. Manual control 14 can alternatively include indirect control, or remote control, of the apparatus 16, for example, the operator 12 remotely controlling a drone by way of a joystick in communication with a computer console. When the operation 18 is performed, a human observation 20 may establish that adjustments are required regarding the control of the apparatus 16. For example, the human operator 12 may determine that a joystick must be moved in a different direction as part of the operation 18 in response to human observation.
In some applications, for example, when seated in a vehicle, the operator may have to sit in a cramped space and repeat laborious hand and foot motions for hours when performing operations, which can be ergonomically hazardous. In some applications, the operator may be physically close to a hazardous operation area.
Remote controls may be used to address these problems. While remote controls have advantages in some situations because they are inexpensive and easily constructed, they add extra weight and require repeated hand motions, for example, when using a joystick or computer mouse to move a cursor that translates to a movement of the apparatus.
Hand gesture control mechanisms do not have extra weight, since hand gesture motions are natural motions of a user's hands and fingers. As described in U.S. patent application Ser. No. 13/826,177, incorporated herein by reference in its entirety, hand gestures can be used to control a light emitting diode (LED) lamp or related light-emitting device. Here, the hand gesture recognition can be accomplished by a visible camera in conjunction with either a thermal sensor or a radiometric skin detection sensor. A beam steering mechanism can steer the illumination spot and control spot generated by the device as well as the field of view (FOV) of the sensors and visible camera according to the user's hand gestures.
In brief summary, embodiments of the present inventive concepts include systems and methods for permitting a hand gesture to be used to remotely control an apparatus such as a mechanical device, instrument, machine, robot, gaming console, and/or other apparatus capable of receiving and processing a signal output in response to the hand gesture. The system generates and outputs a control spot so that a user can determine where to position a hand or related object. In doing so, the user can make a hand gesture prior to or at the control spot to control the apparatus. Accordingly, the hand gesture can represent a command. In some embodiments, a hand gesture control module can replace a control panel, joystick, computer mouse, or other conventional peripheral device, or direct manual action commonly used to communicate with one or more machines, robots, devices, instruments, and game consoles. In other embodiments, a hand gesture control module can complement a control panel, joystick, computer mouse, or other conventional peripheral device, or direct manual action commonly used to communicate with one or more machines, robots, devices, instruments, and game consoles.
FIG. 2A is a flowchart illustrating the operation of an apparatus 16, in accordance with some embodiments.
The apparatus 16 can include an instrument, robot, vehicle device such as a crane shown in FIG. 3, gaming console, and/or other mechanical device that can receive and process an electronic, optical, RF, or other control signal 23 output to the apparatus in response to a hand gesture or the like performed to control a movement or other action of the apparatus.
As shown in FIG. 2A, a control signal 23 is generated by a hand gesture control system 30 instead of conventional manual control 14. Also, in FIG. 2A, the decision at human observation decision diamond 22 can be made by human observation with or without a wearable computer, such as a Google Glass computer. The wearable computer, preferably with an optical head-mounted display (OHMD), can be added to monitor an operation and assist in observing the operation and enhancing the hand gesture functions during feedback from the operation 18 performed by the apparatus 16. In some embodiments where only a few hand gestures are needed, a wearable computer is not part of the system.
As shown in FIG. 2B, the flowchart of FIG. 2A can be implemented whereby a hand gesture 71 can be used to control a movement of a cursor 72 in some embodiments, for example, between different displayed icons, buttons, windows, or other display elements known to those of ordinary skill in the art. The cursor 72 is shown as an arrow in the display 44 in FIG. 2C. FIG. 2D illustrates a hand gesture in an image grid of pixels 78. The pixel location 71 a of the hand gesture in the image 78 is known. The pixel location of the hand gesture 71 is converted into a pixel position of the cursor 72 in the display 44, and the cursor 72 can move from a current position to a new position. FIG. 2B illustrates the process of moving the cursor 72 with the hand gesture 71: an image 78 a corresponding to the hand gesture is captured; the location 71 a of the hand gesture is extracted from the image 78 a; the location 71 a is converted into a cursor position 72 a in the display 44; and the cursor 72 moves from its current position to the new position. The process repeats until the hand gesture 71 stops moving.
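By way of illustration only, the following Python sketch outlines the cursor-update loop described above; it is not part of the original disclosure, and the image size, display size, and locate_hand() helper are hypothetical placeholders chosen for the example.

from typing import Optional, Tuple

IMAGE_SIZE = (800, 1600)    # (rows, cols) of the gesture camera image -- assumed
DISPLAY_SIZE = (400, 800)   # (rows, cols) of the display -- assumed

def to_cursor(pixel: Tuple[int, int]) -> Tuple[int, int]:
    """Scale an image pixel (i, j) to a display position; e.g. (200, 400) -> (100, 200)."""
    scale_i = DISPLAY_SIZE[0] / IMAGE_SIZE[0]
    scale_j = DISPLAY_SIZE[1] / IMAGE_SIZE[1]
    return (int(pixel[0] * scale_i), int(pixel[1] * scale_j))

def track_cursor(frames, locate_hand) -> None:
    """Repeat until the hand gesture is no longer detected in the captured frames."""
    cursor: Optional[Tuple[int, int]] = None
    for frame in frames:
        hand = locate_hand(frame)        # returns (i, j) or None -- hypothetical helper
        if hand is None:
            break                        # hand gesture no longer present
        cursor = to_cursor(hand)         # new cursor position on the display
        # a real system would redraw the cursor at this position
        print("cursor moved to", cursor)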
FIG. 3A is a diagram illustrating the operation of an apparatus 16 controlled by a hand gesture control system 30 that provides an interaction between a hand gesture and a wearable computer display 44, in accordance with some embodiments. In FIG. 3A, the apparatus 16 is a crane. However, the inventive concepts are not limited thereto, and can be implemented for an operation of another apparatus known to those of ordinary skill in the art, for example, pilotless airplanes, robots, instruments, game consoles, and so on. In some embodiments, the hand gesture control system 30 can be mounted on a permanent platform such as ceilings, walls, or fixtures. In some embodiments, the hand gesture control system 30 can be mounted to temporary platforms such as tripods, poles, and other fixtures for field operations. In some embodiments, as shown in FIGS. 15A and 15B, a hand gesture control module can be mounted on a head set worn by the operator.
During operation, a crane operator can place a hand in an illuminated control spot 46 and make a hand gesture. The control spot 46 can be generated by a light source, for example, an LED light, that illuminates a surface with the control spot 46. In some embodiments, a single control spot 46 is employed. In other embodiments, multiple control spots are employed. In some embodiments, the control spot 46 is a color in the visible spectrum. The control spot 46 can be produced by a filter, a light pipe, a control spot LED, or a combination thereof, for example, described herein.
The hand gesture can be presented as a cursor 72 or the like at the display 44. The image size corresponding to the hand gesture and the display size corresponding to the cursor can be scaled. For example, if the image size is twice the size of the display, then a hand gesture position at (i, j) = (200, 400) corresponds to cursor position (100, 200) in the display 44.
When the hand gesture moves over a control spot generated by the hand gesture control system 30, the cursor also moves in the same direction. The display and camera image are preferably aligned so that when the hand gesture moves in the horizontal direction in the image, the cursor also moves in the horizontal direction at the display 44. At the display screen 44A, the operator can choose the apparatus to operate on by selecting, e.g., double clicking, the icon 51 corresponding to the apparatus 16, for example, the crane. In some embodiments, the clicking is performed by hitting the index finger with the thumb, i.e., a motion whereby the user snaps the fingers together as shown in FIG. 4B. Other hand gestures can translate to different cursor functions. For example, holding down the cursor is accomplished by touching the index finger with the thumb as shown in FIG. 4C.
During operation, the hand gesture control module mounted above the user or worn by the user captures images of the hand gesture. The processor analyzes the hand gesture type. If it is a cursor hand gesture, then its position corresponds to a cursor position. When the hand gesture moves, the cursor also moves to a new location. By moving the hand gesture to the upper left corner of the image, the cursor also moves to the upper left corner of the display. When the cursor arrives at icon 51, using the clicking hand gesture 70B shown in FIG. 4B, the user can then access the control panel 58 of the crane. One hand is shown being used for gesture control. In some embodiments, two hands can be placed in the control spot for hand gesture control, for example, to enhance hand gesture control capability permitting additional hand gestures to be processed.
FIGS. 3B and 3C are diagrams illustrating views of a wearable computer display 44, in accordance with some embodiments.
At display 44, a computer application screen 44A displays icons of various devices, robots, equipment, instruments, vehicles, games, and so on. Instead of using a mouse or the like to move a cursor over the icon, a user can activate an icon, for example, icon 55, by performing a hand gesture. In doing so, an operation screen 44B can be displayed. In some embodiments, the operation screen 44B includes an instrument status sub-window 56 showing current information of the instrument, for example, device status information; an environmental sub-window 57 showing environmental information, which is also important in crane operation, for example, weather information so that the operator is aware of a rainy day; and icons of various tools for controlling the instrument, such as a button mode icon 58 and a joystick mode icon 60. In some embodiments, hand gesture control can operate in a cursor mode, shown in the selection menu display of the control panel of FIG. 3B and FIG. 3C. In a cursor mode, the user can click on buttons on the control panel. In some embodiments, hand gesture control can be performed in a joystick mode. In this mode, hand gestures for various motions can be used. In some embodiments, hand gesture control can be performed in both a cursor and a joystick mode.
FIGS. 4A to 4C illustrate different hand gestures 70A-70C and corresponding commands for performing cursor operations, in accordance with embodiments. In FIG. 4A, a hand gesture 70A relates to a command so that the hand gesture 70A emulates a cursor, or otherwise performs a function of a cursor. For example, a cursor clicking function can be performed by making a hand gesture 70B that includes hitting the index finger with the thumb as shown in FIG. 4B. The hand gesture 70B can be performed to “click” on icon 55 of FIG. 3B, and activate a program corresponding to icon 55. In another example, a function that relates to holding down the cursor is accomplished by touching the index finger with the thumb as illustrated by the hand gesture 70C shown in FIG. 4C, for example, to drag and drop icons, windows, buttons, on a display, similar to a cursor function.
FIGS. 5A to 5F illustrate different hand gestures 80A-80F and corresponding commands for performing joystick operations, in accordance with embodiments. In FIG. 5A, a hand gesture 80A relates to a command for emulating an up motion or pitching-up motion of a joystick, which translates to a motion of an item, character, or other element displayed on a computer screen moving from one point to another in response to the hand gesture. During a typical joystick mode operation, the operator does not look at the display, but instead focuses attention on the operation at hand, as when operating a physical joystick. A human response in joystick mode, i.e., using hand gestures, is faster than in a cursor mode. In other words, in a joystick mode, minimal interaction occurs between the operator and the monitor. The operator memorizes the hand gestures for various motions. A joystick mode can therefore be faster than a cursor mode because the operator can focus attention on the operation. An operation in a joystick mode may therefore be preferred by gamers or the like.
In FIG. 5B, a hand gesture 80B relates to a command for emulating joystick down motions, including a pitching motion. In FIG. 5C, a hand gesture 80C relates to a command for stopping up and down motions, including pitching motions. Circular motions, for example, for controlling a rotating motion of a steering wheel, can be obtained by making a rotating hand gesture in a circular pattern as illustrated by the hand gesture 80D shown in FIG. 5D. A linear motion can be obtained by a hand gesture motion along the horizontal plane as illustrated by the hand gesture 80E shown in FIG. 5E. The hand gesture 80F in the lower right corner can be used for triggering, for example, pulling the trigger on a joystick, gun, or other device. In some embodiments, as shown in FIGS. 4 and 5, hand gesture motions 70, 80 are two-dimensional hand gestures.
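As a hedged sketch only, a joystick-mode controller can be reduced to a direct lookup from recognized gesture labels to motion commands, since no cursor or display interaction is needed; the label and command names below are assumptions, not terms from the disclosure.

JOYSTICK_COMMANDS = {
    "up_gesture": "PITCH_UP",          # cf. FIG. 5A
    "down_gesture": "PITCH_DOWN",      # cf. FIG. 5B
    "stop_gesture": "STOP",            # cf. FIG. 5C
    "circular_gesture": "ROTATE",      # cf. FIG. 5D
    "horizontal_gesture": "TRANSLATE", # cf. FIG. 5E
    "trigger_gesture": "TRIGGER",      # cf. FIG. 5F
}

def joystick_command(gesture_label: str) -> str:
    """Translate a recognized gesture label into a motion command; unknown labels are ignored."""
    return JOYSTICK_COMMANDS.get(gesture_label, "NO_OP")

# Example: a recognized circular hand gesture produces a rotation command.
assert joystick_command("circular_gesture") == "ROTATE"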
Referring again to FIG. 3B, the application screen 44A can display a large number of icons, windows, buttons, or other visual representations of computer applications of various types. A hand gesture control system 30 in accordance with some embodiments can be used to control many different devices using the applications corresponding to the icons 55 displayed at the application screen 44A. For example, the same hand gesture control system 30 can be used to control a robot or other mechanical device having one or more elements that move relative to each other. The same hand gesture control system 30 can also be used to control one or more other apparatuses.
FIG. 6 is a diagram illustrating a hand gesture control system 30 remotely controlling a light emitting diode (LED) lamp 90, in accordance with some embodiments. Individual elements of the hand gesture control system 30 and/or LED lamp 90 can be similar to or the same as those described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein.
As shown in FIG. 6, the hand gesture control system 30 and the LED lamp 90 are physically separate. The hand gesture control system 30 includes a signal transmitter 31. The LED lamp 90 includes a signal receiver 91 that communicates with the signal transmitter 31 via wireless communication signals, for example, radio waves, optical signals, or the like. In some embodiments, the signal can be transmitted via a fiber or Ethernet cable. Since the hand gesture control system 30 generates a control spot 46, and since the LED lamp 90 generates an illumination spot 92, a hand gesture can be placed under the control spot 46 to control the illumination spot 92. If a user or other human observer wants to steer the illumination spot 92 to a desired position, he/she can adjust the hand gesture position and compare the current illumination spot 92 position to the desired position. This process can be repeated until the illumination spot 92 reaches its destination.
One or both of the hand gesture control system 30 and the LED lamp 90 can have a beam steering mechanism (not shown), for example, described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein. In some embodiments, a monitor and corresponding processor are provided for controlling multiple LED lamps through the same hand gesture control system 30.
FIG. 7 is a diagram illustrating a hand gesture control module 130 remotely controlling a television set 102, in accordance with some embodiments. Although a television set 102 is described, related vision displays can equally apply, for example, a computer monitor. The control module 130 includes a transmitter 131 that exchanges communication signals with a receiver 103 at the television set 102, for example, wireless communication signals such as radio waves, optical signals, or the like. An example of a communication signal is a control signal similar to or the same as the control signal 23 described with reference to FIG. 2A.
Hand gestures 70, 80 in a control spot 46 generated by the control module 130 can replace a remote control device, or the like, and be used to access a selection menu or the like on the television set 102, which can be achieved by cursor or joystick hand gestures described herein.
FIG. 8 is a block diagram of a hand gesture control system 30, in accordance with some embodiments. The hand gesture control system 30 comprises a beam steering mechanism 202, a computation processor 204, a transceiver 206, a hand gesture sensor 208, a control spot generator 210, and a wearable computer display 212, some or all of which can be co-located under a same housing as a single unit.
The control spot generator 210 generates a control spot at a region at or proximal to that where hand gestures will be sensed. In some embodiments, the control spot is sufficiently large to surround at least one human hand, for example, 12″ diameter. In other embodiments, the control spot is small, for example, about a 1″ diameter. The control spot generator 210 can include filters, light pipes, LEDs, or a combination thereof for generating a control spot, which can be the same as or similar to that described in U.S. patent Ser. No. 13/826,177 incorporated herein by reference above.
In some embodiments, as shown in FIG. 13, the control spot generator 210 comprises one or more LEDs 602 or other light source and a lens 604. An LED 602 can comprise narrow-beam optics for generating a narrow light beam at the lens 604 so that its diameter is equal to or smaller than the aperture diameter of the lens. The control spot generator 210 can further include a heat sink 606 for dissipating heat generated by the LEDs 602. Although not shown, other sources that emit light can equally apply. The light output from the LEDs 602 or other light source can be in the visible light spectrum, or other light spectrum.
The hand gesture sensor 208 captures hand gesture images taken from a generated control spot, which can be the same as or similar to that described in U.S. patent Ser. No. 13/826,177 incorporated herein by reference above.
In some embodiments, the hand gesture sensor 208 comprises one or multiple thermal cameras, which capture thermal hand gesture images. In some embodiments, the hand gesture sensor comprises one or multiple visible cameras, which capture only visible hand gesture images. In some embodiments, the hand gesture sensor comprises one or multiple visible, thermal dual-band cameras, for example, illustrated in FIG. 14. In some embodiments, a visible, thermal dual-band camera 700 comprises a visible camera 710, a thermal camera 720, and an infrared window 702 that reflects visible light into the visible camera 710 and transmits thermal light to the thermal camera 720. The thermal camera 720 can include an infrared focal plane detector array 722 and an optical element such as an infrared lens 724 or the like. The visible camera 710 can include a visible FPA or the like and a visible lens 714 or related optical element.
In other embodiments, as shown in FIGS. 11A and 11B, a visible-thermal dual-band flashlight camera 400 comprises an infrared lens 402 shared by a visible FPA detector 426 and a thermopile FPA or detector 428, and a three-face mirror 404. The visible, thermal dual-band camera 400 captures both visible and thermal images of a hand gesture. The visible FPA 426 and the thermal FPA 428 can respond to different wavelengths. In particular, the thermal FPA 428 can respond to wavelengths corresponding to emitted thermal light of a hand making a gesture, while the visible FPA 426 can respond to wavelengths of light of the hand in the illuminated control spot. A high signal-to-noise ratio thermal image can distinguish a hand gesture from a background, and therefore be used to extract the hand gesture.
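As an illustrative sketch only (not the disclosed implementation), a warm hand stands out against a cooler background in a high signal-to-noise thermal image, so a simple threshold can produce a gesture mask; the threshold value and synthetic frame below are assumptions for the example.

import numpy as np

def extract_hand_mask(thermal_image: np.ndarray, threshold: float = 0.6) -> np.ndarray:
    """Return a boolean mask of pixels warm enough to belong to the hand."""
    lo, hi = float(thermal_image.min()), float(thermal_image.max())
    if hi == lo:                       # flat image: nothing to segment
        return np.zeros(thermal_image.shape, dtype=bool)
    normalized = (thermal_image - lo) / (hi - lo)   # normalize to [0, 1]
    return normalized > threshold

# Example: a synthetic 8x8 "thermal" frame with a warm 3x3 patch standing in for a hand.
frame = np.zeros((8, 8))
frame[2:5, 2:5] = 10.0
hand_mask = extract_hand_mask(frame)   # True only over the warm patch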
Returning to FIG. 8, the beam steering mechanism 202 can include a dual-axis gimbal or the like for steering a control spot and a field of view (FOV) provided by the hand gesture sensor 208. In some embodiments, the beam steering mechanism 202 includes a steering mirror on a gimbal, for example, described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein. The hand gesture sensor 208 and the control spot generator 210 are preferably placed in front of the mirror of the beam steering mechanism 202. The gimbal mirror steers the beam of the control spot generator 210 and the field of view of the hand gesture sensor 208 as the hand gesture moves.
In some embodiments, the beam steering mechanism 202 comprises a Micro-Electro-Mechanical Systems (MEMS) mirror array as described in U.S. patent application Ser. No. 13/826,177 incorporated by reference herein. The MEMS mirror array can steer the field of view of the hand gesture sensor 208 and the beam output by the control spot generator 210 as the hand gesture moves.
In some embodiments, the beam steering mechanism 202 comprises two counter-rotating prism wedges as illustrated in U.S. patent application Ser. No. 13/826,177 incorporated by reference herein. The hand gesture sensor 208 and the control spot generator 210 can be placed in front of the counter rotating wedge assembly. Beam steering can be performed by a combination of counter-rotation and co-rotation of the two wedge prisms.
In some embodiments, the beam steering mechanism 202 comprises a dual-axis gimbal 302. In some embodiments, the hand gesture sensor 208 and the control spot generator 210 can be coupled to a mounting plate 512 on the gimbal 302 as illustrated in FIG. 12. In other embodiments, as shown in FIG. 9, a flashlight camera 400 is coupled to the gimbal 302. The flashlight camera can be the same as or similar to the flashlight camera 400 described at FIGS. 11A and 11B. In some embodiments, the dual-axis gimbal 302 can be a pitch-roll type as illustrated by FIG. 10 or a pitch-yaw type as described and illustrated in U.S. patent application Ser. No. 13/826,177 incorporated by reference herein. Accordingly, the gimbal 302 can include an outer ring and an inner ring rotating relative to each other by shafts or the like. One or more motors (not shown) can be mounted on the inner ring and outer ring, respectively. Counterweights (not shown) can be mounted on the outer ring and inner ring, respectively, for balancing and stabilizing the gimbal 302, and moving the gimbal 302, i.e., in pitch, yaw, and/or roll.
The hand gesture sensor 208 and a control spot generator unit 210 are constructed and arranged on the mounting plate 512 so that the lines of sight (LOS) of the gesture sensor 208 and a control spot generator unit 210 are parallel. Because of the proximity to each other, the center of a control beam spot generated by the control spot generator unit 210 can coincide or nearly coincide with the center of the imaging region provided by the hand gesture sensor 208.
Returning to FIG. 8, the computer processor 204 is responsible for hand gesture recognition, tracking, beam steering, and control command signal generation. In some embodiments, the computer processor 204 can be a computer, for example, comprising a processor, a memory, and a connector between the processor and the memory and/or other elements of the hand gesture control system 30. In other embodiments, the computer processor 204 is a DSP chip. The computer processor 204 is also responsible for communicating with the wearable computer 212, e.g., Google Glass or the like, and/or lighting control. In some embodiments, icons of applications for controlling various machines, instruments, robots, devices, and playing games can be displayed on the wearable computer display 212 or related display screen. In some embodiments, the computer processor 204 in communication with the hand gesture sensor 208 can permit hand gestures to appear as cursors, joysticks, and other symbols on the wearable computer display 212 or related display screen. For example, a cursor can be moved at the display 212 by making a hand gesture as shown in FIG. 4 in a control spot formed by the control spot generator 210. In another example, hand gesture motions can be made that emulate joystick control motions as shown in FIG. 5. Other software applications, stored in memory and executed by one or more processors, can equally apply. Some or all applications can be displayed at the display 212, and executed in response to hand gestures or the like, for example, described herein.
The transceiver 206 of the hand gesture control module sends command signals to an operating instrument controlled according to hand gestures processed by the hand gesture control system 30. The transceiver 206 can be similar to or the same as the transmitters 31 and 131 of FIGS. 6 and 7, respectively. A receiver of the operating instrument, for example, can provide status information or the like regarding the instrument. Information can be displayed on the wearable computer display 212, for example, at the control panel screen 44B shown in FIG. 3C. In some embodiments, the signal and information can be sent wirelessly by radio wave or light beam. In some embodiments, the signal can be transmitted via a fiber or Ethernet cable.
In some embodiments, the hand gesture sensor 208 and the control spot generator 210 are assembled into one unit. In some embodiments, the hand gesture control module comprises one or more of such units. In some embodiments, the assembled unit is constructed and arranged as a visible, thermal dual-band flashlight camera, for example, described in U.S. patent application Ser. No. 13/826,177, incorporated by reference herein, and/or the dual-band flashlight camera 400 illustrated in FIGS. 11A and 11B, which can comprise an infrared lens 408, a three-face pyramid prism mirror 404, a visible focal plane array 426, a thermal focal plane array 428, and one or more color LEDs or related light source 402. In the flashlight camera 400, a hand gesture sensor and the control spot generator are assembled under a single housing, i.e., the flashlight housing. The LEDs 402, visible FPA 426, and thermal detector or FPA 428 can share the same lens 408 but split the aperture, and illuminate the same target. Therefore, the illumination area of the color LED 402 is also the imaging area of the visible FPA 426. The thermal FPA 428 also provides imaging in the same area as the visible FPA 426. However, for the thermal FPA 428, the light source is the actual thermal emission from the target under a control spot. In some embodiments, the lens 408 is a transmissive type lens. In other embodiments, reflective Cassegrain optics are provided instead of an infrared lens. In particular, the control spot generator 210 is placed next to a hand gesture sensor 208 as shown in FIG. 12. The flashlight camera functions as the hand gesture sensor.
FIGS. 15A-15B are views of various headsets 800A-800B, respectively, each including a three-dimensional hand gesture control module and a predetermined number of cameras, in accordance with some embodiments.
In some embodiments, shown in FIG. 12, a hand gesture sensor and a control spot generator are co-located and mounted at the front of a platform, in particular for two-dimensional hand gesture motion sensing. In other embodiments, as shown in FIG. 15B, two side cameras 812A, 812B are added to a headset 800C for three-dimensional hand gesture motion sensing. In some embodiments, the front 806 and side cameras 802, 812 can be visible cameras. In some embodiments, they can be visible, thermal dual-band cameras.
As described above, some embodiments provide for a hand gesture control module that processes three-dimensional hand gesture motions.
To obtain three-dimensional hand gesture motions, multiple cameras at various pointing angles are required. FIG. 15B includes an example of a hand gesture control module with three cameras, for example, two side cameras and a front camera, in accordance with some embodiments. The hand gesture control module can be worn by an operator on his/her head. To detect two-dimensional motions, only one camera looking straight down at the hand gesture is needed. To detect three-dimensional motions, two side cameras 802A, 802B are needed.
In some embodiments, a hand gesture can be separated from the background by using two cameras separated by a distance and a control spot generator. As illustrated in FIGS. 15C and 15D, the color control spot 9004 is at two different heights before and after a hand gesture 9005 is inserted. This color control spot 9004 can be used as a reference. The control spot 9004 appears in different locations in the camera images due to the separation between cameras 9001A and 9001B. The separation of the control spot between the two images increases for a control spot near the cameras and decreases for a control spot further away from the cameras. Before the hand gesture 9005 is inserted into the control spot 9004, the control spot is projected onto the background 9003. When the hand gesture 9005 is inserted into the control spot 9004, the control spot is at the hand, which is closer to the cameras than the background. Using this information and a 3D stereoscopic method, the hand gesture 9005 in the control spot 9004 can be separated from the background 9003. This is done by separating pixels into two groups: one with large image separation and one with small image separation between the two camera images. The group with the large image separation belongs to the hand gesture 9005. The group with the small image separation belongs to the background 9003. Normally 3D stereoscopic imaging requires calibration of the camera system to extract the depth information. The reference positions of the control spot 9004 before and after insertion of the hand gesture make the calibration unnecessary. In some embodiments, the two cameras 9001A and 9001B can be color cameras or multispectral cameras. The control spot 9004 can be easily seen in the color band whose color matches that of the control spot. The cameras 802A and 802B in the headset in FIG. 15B can perform some or all of this technique to separate a hand gesture from a background.
In some embodiments where two or more cameras are provided, a control spot can be used as a reference beam for separating a hand gesture from the background, using the fact that the parallax of the hand gesture is larger than that of the background, which is further away from the cameras. The control spot appears at two different distances before and after inserting a hand into the control spot.
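A minimal sketch of the grouping step, assuming per-pixel disparities between the two camera images have already been computed (for example by block matching): large-disparity pixels are assigned to the nearer hand and small-disparity pixels to the background, with the control spot's disparities before and after hand insertion as the reference. The function and parameter names are illustrative assumptions.

import numpy as np

def separate_hand_from_background(disparity: np.ndarray,
                                  spot_disparity_background: float,
                                  spot_disparity_hand: float) -> np.ndarray:
    """Return a boolean mask that is True for hand pixels and False for background pixels."""
    # Split pixels at the midpoint between the control spot disparity measured on the
    # background (before the hand is inserted) and on the hand (after insertion).
    split = 0.5 * (spot_disparity_background + spot_disparity_hand)
    return disparity > split

# Example with synthetic per-pixel disparities: the hand region shows the larger separation.
disparity = np.full((6, 6), 2.0)
disparity[1:4, 1:4] = 8.0
hand_mask = separate_hand_from_background(disparity,
                                          spot_disparity_background=2.0,
                                          spot_disparity_hand=8.0)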
Conventional thermal non-contact switches measure thermal signal changes due to an approaching warm body or object in a field of view (FOV) of a passive infrared sensor. A conventional non-contact switch can comprise a thermal detector such as a pyroelectric detector and a Fresnel lens. The thermal detector is located at the focal plane of the lens. A feature of the thermal non-contact switch is that as a warm body approaches the FOV of the thermal sensor, a change of signal from a cooler background to a hotter body is detected. As a result, the switch is triggered. However, such non-contact switches are limited in functionality. For example, a conventional thermal non-contact switch does not have an amplitude increase or decrease capability.
In accordance with embodiments of the present inventive concepts, selected pixels from an image are detected by a non-contact sensor, and provided as function keys for a non-contact switch or controller. The projections of these keys on an imaging surface can serve as non-contact buttons. A user can position a hand or other object into the projection, or field of view, of an image selected as a function key, thereby activating the non-contact button without physically contacting it. The image of the non-contact sensor is small because the number of assigned function keys is also small. Very little processing time is required as compared with other hand gesture processing techniques, thereby increasing a switching time of the non-contact switch. Multiple function keys can be provided, each corresponding to a function key sensor of the non-contact sensor. Therefore, keys other than an on/off function key can be used to control other functions of the device. Therefore, a non-contact switch in accordance with some embodiments can be constructed and arranged as a controller. In some embodiments, color light can be used to illuminate control keys so that users can easily identify the various buttons. A control spot generator can be provided to generate cones of color light. A beamsplitter can be used to align the non-contact keys and the control spots.
In other embodiments, a switch can include a scanning non-contact sensor with a single element detector. Instead of forming a whole image, a scanning non-contact sensor can create a partial image consisting of only function key pixels and a minimum number of non-function pixels, for example, as shown in FIG. 21C. In doing so, a fast non-contact switch can be constructed because the scan mirror has only a few positions to go to. Since a single detector is used instead of an array of detectors, the manufacturing cost is significantly reduced, especially with respect to the thermal scan sensor. The scan mirror can be inexpensive too, especially for a MEMS mirror in large volume. Therefore, a fast and inexpensive non-contact switch can be constructed from a scanning non-contact sensor. Also, in some embodiments, the scan covers all function key sensors and only the non-function key sensors in the path of the scan, to shorten the data acquisition time. In configurations where a camera is part of the system, a hand gesture can be positioned in the field of view of the camera, and can therefore enhance the functions of the switch, for example, for recognizing and distinguishing hand gestures. For example, referring to the cursor or joystick application above, a camera can capture image data corresponding to the hand gesture and a processor can convert the captured image data into a cursor or joystick command signal that controls a cursor or joystick. The function keys become inactive, such that only camera images of the hand gesture are processed.
A low resolution staring non-contact sensor with a small detector array can also be used to construct a fast and inexpensive non-contact switch. For staring systems, all image pixels are generated at the same time. Low resolution thermal detector arrays such as 4×4 or 8×8 are relatively inexpensive.
The non-contact sensor can be of a thermal type or a radiometric calibrated color type. The thermal type sensor uses the heat from hands while the radiometric calibrated color sensor uses the color of hands. A hand can therefore be distinguished from another object, for example, a sheet of paper, when positioned in the field of view of the non-contact sensor.
A diffuser is a well-known device that scatters light in different directions. Accordingly, light from different scene contents can become mixed after engaging with a diffuser. The net result is a smearing of the image scene or other undesirable effect. The smearing is worse as the scene moves further away from the diffuser. However, by adding color control spots at the diffuser, a diffuser based switch or controller can be constructed, for example, described herein. In some embodiments, an imaging sensor is positioned behind a diffuser, and can resolve images related to a hand gesture or the like only when the hand or the like is close to or touches the diffuser. By moving the hand gesture near or touching a selected control spot on the diffuser, a command based on the control spot is generated and sent to the device under control. A non-contact switch in accordance with some embodiments works best for remote applications, while the diffuser based switch or controller works best for close proximity applications, and can complement or otherwise co-exist with a non-contact switch.
FIG. 16 is a view illustrating an operation of elements of a non-contact switch system 1600, in accordance with some embodiments. The system 1600 includes an imaging optics 1602 and a detector array 1604. The detector array 1604 comprises a plurality of image sensing detectors 1606, arranged in rows and columns at the array 1604, which incorporate an array of pixels. Each detector 1606, also referred to as a function key sensor, collects incident light from a small region of an imaging area 1608, or ground sample A. Each function key sensor at the detector array 1604 contains a light sensitive photo diode or the like for measuring light, thereby recording an image of the ground sample. Accordingly, the function key sensor can be identified from or formed according to selected pixels of the array 1604. The output signals produced by the pixels are read out, for example, one row at a time, to form an image. The captured images can be output as an output image 1612, for example, to a display.
The solid angle extending from detector 1606 and imaging optics 1602 is referred to as an instantaneous field of view (IFOV). The projection angle of detector 1606 to ground sample A through the imaging optics can be referred to as the IFOV of detector 1606. Detector 1606 collects only light within this IFOV. The output of detector 1606, or pixel A′, is the image of ground sample A. The volume within this IFOV can be used as a function key of a non-contact switch in some embodiments. Detection of a target such as a hand within this IFOV can be assigned a specific meaning or function in some embodiments. Target detection only occurs when the signal level of pixel A′ falls within a certain range. For example, a thermal signal from a hot soldering iron or a book at ambient temperature is not interpreted as a target signal because it is either too high or too low.
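By way of a hedged sketch only, the range test described above can be expressed as a simple band check on the function key pixel signal, so that a hot soldering iron (too high) or a book at ambient temperature (too low) is rejected; the band limits below are illustrative assumptions, not values from the disclosure.

HAND_SIGNAL_MIN = 0.30   # assumed lower bound, normalized units
HAND_SIGNAL_MAX = 0.80   # assumed upper bound, normalized units

def target_in_ifov(pixel_value: float) -> bool:
    """True only when the pixel signal falls within the expected hand-signal range."""
    return HAND_SIGNAL_MIN <= pixel_value <= HAND_SIGNAL_MAX

assert target_in_ifov(0.55)          # hand-like signal triggers the function key
assert not target_in_ifov(0.95)      # too high, e.g. a hot soldering iron
assert not target_in_ifov(0.10)      # too low, e.g. a book at ambient temperature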
FIG. 17 is a view of a function key 1708 of a non-contact switch, in accordance with some embodiments. In describing FIG. 17, reference is made to elements of the non-contact switch system 1600 of FIG. 16.
In some embodiments, pixel A′ of the system 1600 shown in FIG. 16 is identified as a function key for the non-contact switch. In some embodiments, a collection of pixels can be used as a function key. Detector 1606 can sense a signal-carrying object, such as a hand, entering the IFOV of the image, or ground sample, selected as the function key 1708. For example, the hand can block the IFOV provided by a function key pixel sensor 1706 to activate the function key 1708. Accordingly, the location and/or movement of the hand activates the function key 1708 without physically contacting it. The imaging sensor in an embodiment can include multiple function keys. The function key and the imaging optics form a function key sensor in some embodiments. The imaging sensor therefore can include multiple function key sensors.
FIG. 18 is an illustration of multiple non-contact keys constructed from the pixels of a sensor image 1800, in accordance with some embodiments.
As described above, some function keys can be configured for switching functions, for example, function key control signals generated by one or more function key pixel sensors 1706 of FIG. 17, while other function keys can be configured to permit control-related functions to be performed. In some embodiments, function keys are identified according to a calibration procedure in FIG. 20. Non-contact function keys are separated from each other at an image area to avoid unintentionally activating multiple keys at the same time. A non-contact switch in accordance with some embodiments can provide a sufficient number of keys 1808 to perform both switching and controlling functions. In some embodiments, images captured at a non-contact sensor have a low resolution comprising a small number of pixels. Accordingly, the collection time and processing time are short and the cost is low. A non-contact switch including non-contact sensors can therefore be fast and inexpensive. In some embodiments, images of a non-contact sensor can be of a high resolution. In some embodiments, a function key is identified by inserting a ground truth target such as a hand into a control spot and collecting an image of the ground truth target. A pixel or group of pixels with a highest detector output is identified as the function key. Other function keys can be identified in the same way. The positions of the function keys can be stored in the processor. Control functions of one or more devices under control can be assigned to function keys in the processor prior to a control operation.
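A sketch of the calibration step described above, under stated assumptions rather than as the disclosed procedure: with a ground truth target such as a hand inserted into one control spot, the pixel with the highest detector output in the captured frame is recorded as that spot's function key pixel, and repeating per spot builds the key map stored in the processor. The spot names and synthetic frames are hypothetical.

import numpy as np

def identify_function_key(calibration_frame: np.ndarray) -> tuple:
    """Return the (row, col) of the highest-output pixel as the function key pixel."""
    return np.unravel_index(int(np.argmax(calibration_frame)), calibration_frame.shape)

def build_key_map(frames_by_spot: dict) -> dict:
    """Map each control spot (e.g. its color) to the function key pixel identified for it."""
    return {spot: identify_function_key(frame) for spot, frame in frames_by_spot.items()}

# Example with synthetic 4x4 calibration frames, one per control spot.
blue_frame = np.zeros((4, 4)); blue_frame[0, 1] = 5.0
red_frame = np.zeros((4, 4)); red_frame[3, 2] = 5.0
key_map = build_key_map({"blue": blue_frame, "red": red_frame})  # {'blue': (0, 1), 'red': (3, 2)}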
FIGS. 19A and 19B are views of a staring-type non-contact sensor 1900 and a scanning-type non-contact sensor 1940, in accordance with some embodiments.
In some embodiments, as shown in FIG. 19A, the staring non-contact sensor 1900 comprises imaging optics 1902 and a two-dimensional detector array 1904, or focal plane detector array. The detector array 1904 can be the same as or similar to the detector array 1604 of FIG. 16. A full image, for example, of the imaging area 1608 of FIG. 16, can be formed by all detectors in the two-dimensional detector array 1904. In particular, an image can be created by focusing light received from the scene onto individual detectors of the two-dimensional detector array 1904.
In some embodiments, as shown in FIG. 19B, the scanning non-contact sensor 1940 comprises imaging optics 1944, a scan mirror 1942, and a line of detectors or a single element detector 1943. The scan mirror 1942 can be configured with one axis for a line detector array and with two axes for a single element detector. In some embodiments, the scan mirror 1942 is a MEMS type or the like. In some embodiments, the scan mirror 1942 is a piezoelectric type. In some embodiments, the scan mirror 1942 is an electromagnetic type. In some embodiments, the scan mirror 1942 is a mechanical type.
The scanning non-contact sensor 1940 creates an image by sequentially scanning and focusing light from various parts of the scene onto the detector 1943, i.e., a single detector or a one-dimensional array of detectors. The scan mirror 1942 scans a portion of the field of view (FOV) of the sensor 1940 at a time, generating a partial image, and continues to scan until the full FOV is covered, at which point the full image is created. For a single element detector system, the scan mirror 1942 scans one IFOV, corresponding to one pixel, at a time; the full image is the output of the same detector looking in different directions within the sensor FOV at different times. In FIG. 21A, the scan mirror scans one IFOV for a column of detectors at a time and continues from left to right until the full image is obtained. In FIG. 21B, the scan mirror scans the FOV from left to right and right to left in a raster pattern until the full image is obtained.
FIGS. 19C and 19D are views of a thermal sensor 1950 and a skin color sensor 1960, respectively, in accordance with some embodiments.
In some embodiments, the sensing mode of a non-contact sensor can be emissive. Thermal radiation emitted from a body object, for example, a human hand, is collected by the thermal sensor 1950, which can be of a staring or scanning type. In embodiments where the thermal non-contact sensor 1950 is part of a staring system, the sensor 1950 comprises thermal imaging optics 1952 and a detector array 1954. In embodiments where the thermal non-contact sensor 1950 is part of a scanning system, the sensor 1950 comprises imaging optics, a scan mirror, and a detector line array or a single detector, for example, similar to the sensor 1940 of FIG. 19B.
In other embodiments, the sensing mode can be reflective. Color light reflected off the skin of a body object is collected by the skin color sensor 1960, which can be of a scanning or staring type. A skin color non-contact sensor 1960 comprises imaging optics 1962, color filters (not shown), and a detector array 1964 in a staring system, and comprises imaging optics, color filters, a scan mirror, and a detector array in a scanning system. The skin color non-contact sensor 1960 can include other elements and functions described in U.S. patent application Ser. No. 13/826,177, the entire contents of which are incorporated herein by reference.
FIG. 20 is a view of a system 2000 for aligning a function key pixel and a control spot, in accordance with some embodiments.
In order to permit a user to identify locations of non-contact keys, light of different colors can be emitted to illuminate the IFOVs of one or more non-contact keys. Each non-contact key is assigned a color, in some embodiments. In order to shine light along the path of the IFOV for each non-contact key, the system 2000 includes a spot generator 2004 and a beamsplitter 2002. As illustrated in FIG. 20, the beamsplitter 2002 is placed at the angular bisector between the nadir-looking non-contact sensor 2006 and the horizontally pointing control spot generator 2004, in some embodiments. The reflected light of the control spot generator 2004 at the beamsplitter 2002 coincides with the IFOV of the relevant function key. In some embodiments, the alignment of a control spot 2012 and the identification of a function key are achieved by inserting a hand or other thermally emitting object into the control spot region 2012 and taking an image of the target in the control spot. The pixel or group of pixels with the highest output is the non-contact key pixel or pixels for this control spot 2012. Employing the same procedure, the function key pixels for the other control spots can be identified. Once the function key pixels for all control spots are identified, they can be assigned to various functions in some embodiments. In some embodiments, the beamsplitter 2002 can be of a coated type or uncoated type. The coating can be applied to one or both substrate surfaces. The uncoated type is usually made of infrared materials such as Si. Here the transmission and reflection are governed by Fresnel reflection.
FIG. 21A is an illustration of a parallel scanning configuration for a line detector array 2100A, in accordance with some embodiments. Each element represents the IFOV of a pixel. The whole array in FIG. 21A is the field of view (FOV) of the sensor. As shown in FIG. 21A, the IFOVs of a line array 2100A are oriented in a vertical position and scanned in one direction, for example, from left to right, by a scan mirror or the like, for example, as illustrated in FIG. 19B, to create a full image. The array 2100A can include a mixture of function key pixels and non-function key pixels, described herein.
FIG. 21B is an illustration of a scanning configuration for a single element non-contact sensor 2100B, in accordance with some embodiments. The scanning method applied in FIG. 21B is serial. Here, the IFOV of the single element detector is scanned left to right, right to left, and so on, in a serpentine flow, along a raster scanning path to create a full image. Because only a few pixels out of the whole image are assigned to function key pixels, most of the pixels in the full image are not used. Instead of creating a full image, a partial image comprising all function key pixels and a minimum number of non-function key pixels can be created in some embodiments.
FIG. 21C is an illustration of a fast scanning configuration for a single element non-contact sensor 2100C, in accordance with some embodiments. Here, a scan path is formed that creates a partial image including all function key pixels and a minimum number of non-function key pixels. The scan path can be any route as long as it contains all the function key pixels. Because the number of pixels that need to be scanned is small and only one detector is required, a non-contact switch that incorporates this configuration can be fast and inexpensive as compared to other configurations.
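As an assumption-laden sketch of such a reduced scan, the mirror can visit only the function key pixels plus whatever pixels lie along a route between them; a simple stepwise route between consecutive keys is assumed here, whereas the disclosure only requires that the path contain all function key pixels.

def scan_path(function_key_pixels):
    """Return an ordered list of (row, col) positions whose route visits every key pixel."""
    keys = sorted(function_key_pixels)      # visit the keys in a fixed order
    if len(keys) == 1:
        return [keys[0]]
    path = []
    for start, end in zip(keys, keys[1:]):
        if not path or path[-1] != start:
            path.append(start)
        row, col = start
        while (row, col) != end:            # step one pixel at a time toward the next key
            row += (end[0] > row) - (end[0] < row)
            col += (end[1] > col) - (end[1] < col)
            path.append((row, col))
    return path

# Example: three function key pixels in an 8x8 image; the route touches all three
# plus only the pixels lying between them, instead of rastering the full image.
route = scan_path([(1, 1), (4, 5), (6, 2)])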
FIG. 22 is a view of a control spot generator 2200, in accordance with some embodiments. The control spot generator 2200 comprises a white LED 2202, a control spot generator plate 2204 with color filters 2205, and a lens 2206. The lens 2206 is separated from the filter plate 2204 by a distance f. Color light is generated by the color filters 2205 when the white LED illuminates them, and images of the color filters are created by the lens. As shown in FIG. 22, the control spot generator 2200 can project multiple well-defined illumination control spots onto a projection surface 2207. For example, a projection pattern on the surface 2207 can comprise blue (B), green (G), yellow (Y), red (R), and purple (P) control spots.
FIG. 23 is a view of a control spot generator 2300, in accordance with some embodiments. In some embodiments, the control spot generator 2300 comprises a plurality of color LEDs 2302, light pipes 2303, a light pipe mounting plate 2304, and a lens 2306. The lens 2306 is separated from the mounting plate 2304 by a distance f. The light pipes 2303 have exit ports at a focal plane of the lens 2306. As shown in FIG. 23, the LEDs 2302 and corresponding light pipes 2303 can generate control spots of different colors, for example, blue (B), green (G), yellow (Y), red (R), and/or purple (P) control spots. The control spot light is used to illuminate the IFOVs of the function keys shown in FIGS. 24-28. In some embodiments, a control spot generator can also be constructed from color lasers, as opposed to the LEDs of FIGS. 22-23.
FIG. 24 is a view of a non-contact switch 2400 including a scanning non-contact sensor 2410, in accordance with some embodiments. The non-contact switch 2400 can be the same as or similar to other embodiments herein; details thereof are therefore not repeated for brevity. In some embodiments, the non-contact switch 2400 comprises the scanning non-contact sensor 2410, a beamsplitter 2420, and a control spot generator 2224. The sensor 2410 includes a detector or a line detector array 2414 and a scan mirror 2412. The scan mirror 2412 can be a one-axis mirror for a line detector array, or a two-axis mirror for a single element detector, as described herein with reference to other embodiments. The scan mirror 2412 scans the FOV of the sensor continuously in the pattern shown in FIG. 21A or 21C. The scanning sensor 2410 preferably scans a portion of the field of view at a time until the complete field of view is scanned. The detector or detectors continuously send output light signals captured from the IFOVs of function and non-function key pixels to a processor. Light signals from the various IFOVs are captured by the same detector for a single element detector, or by the same detectors for a line detector array. The processor only processes signals from function key pixels. If a target is detected in a function key, then a command is generated and output to a device under control 2602 as shown in FIG. 26.
FIG. 25 is a view of a non-contact switch 2500 including a staring non-contact sensor 2510, in accordance with some embodiments. In addition to the staring non-contact sensor 2510, the switch 2500 comprises a beamsplitter 2520 and a control spot generator 2524. The non-contact switch 2500 can be the same as or similar to other embodiments herein; details thereof are therefore not repeated for brevity. The operation of this non-contact switch is similar to that of the scanning non-contact switch, except that the light signals from the various IFOVs are captured by different detectors, and that the staring sensor 2510 generates all image pixels simultaneously.
In FIGS. 24 and 25, the function keys 2408, 2508, respectively, are aligned with control spots 2409, 2509, respectively, along the paths of the IFOVs by the beamsplitters 2420, 2520, respectively. In FIGS. 24 and 25, five function keys 2408, 2508 are shown, each aligned with a control spot having a different color, e.g., blue (B), green (G), yellow (Y), red (R), and purple (P) control spots. In some embodiments, two of the function keys 2408, 2508 can correspond to on/off switches. For example, the R function key can function as an off switch when a hand gesture is positioned in the IFOV of the R function key/control spot, and the G function key can function as an on switch when a hand gesture is positioned in the IFOV of the G function key/control spot. Because there are more than two function keys, the extra function keys can be used for additional control functions. A non-contact switch can therefore also be referred to as a non-contact controller. In some embodiments, two other function keys 2408, 2508 can be assigned to amplitude increasing/decreasing functions. For example, the amplitude can be the light level for a light switch, or the speed of a fan for a fan switch. For example, the B function key can be used to increase an amplitude of an apparatus under control when a hand gesture is positioned in the IFOV of the B function key/control spot, and the Y function key can be used to decrease the amplitude when a hand gesture is positioned in the IFOV of the Y function key/control spot. In some embodiments, the last function key can be assigned to an ambient background measurement function. This is especially important for a thermal non-contact sensor, because thermal detectors generate more noise than other detectors, and removing the ambient background can reduce undesirable noise. In embodiments where the thermal sensor is a scanning system with a single element detector, removing the background also removes the dark current of the detector. This is done by subtracting the ambient key pixel from the other function key pixels, thereby enhancing the signal to noise ratio. When using the other function keys, the user should verify that the ambient key IFOV intercepts only the ambient background, and that a hand or other warm target is not placed in that IFOV.
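The ambient-key subtraction described above amounts to removing a common background (and, for a single element detector, its dark current) from every other function key reading. A minimal sketch, with illustrative key names and made-up numbers:

```python
def subtract_ambient(key_readings, ambient_key="AMBIENT"):
    """Subtract the ambient function key reading from the other function keys
    to suppress the ambient background and detector dark current."""
    ambient = key_readings[ambient_key]
    return {k: v - ambient for k, v in key_readings.items() if k != ambient_key}

# Example: a warm hand over the G (on) key stands out after subtraction.
readings = {"R": 0.52, "G": 1.37, "B": 0.55, "Y": 0.53, "AMBIENT": 0.50}
print(subtract_ambient(readings))  # roughly {'R': 0.02, 'G': 0.87, 'B': 0.05, 'Y': 0.03}
```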
FIG. 26 is an illustration of an apparatus 2602 controlled by a non-contact switch, in accordance with some embodiments. The non-contact switch can be the same as or similar to other embodiments herein; details thereof are therefore not repeated for brevity.
When a non-contact sensor 2610 of the non-contact switch detects a hand signature in a control spot 2604 generated by a control spot generator 2624, the sensor 2610 outputs the information to a processor 2612, which can be integral with or physically separate from other elements of the non-contact switch. Although not shown, the processor 2612 can include a transceiver. In cases where the non-contact switch 2610 and the device under control 2602 are separated by a large distance, the transceivers can communicate wirelessly. The processor 2612 determines which function key is in communication with a hand gesture. The locations of the control spots correspond to the locations of the function keys. The function keys do not have to fill the control spots completely, but must be at least partially aligned with them. Accordingly, the control spots can serve as function keys. In response, the processor 2612 generates a command C1-C5 that is output to the apparatus under control 2602. For example, command C1 can be a switch-on command, command C2 a switch-off command, command C3 an amplitude increase command, command C4 an amplitude decrease command, and command C5 an ambient background command.
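A sketch of this dispatch step, mapping a detected function key to one of the commands C1-C5. The color-to-command pairing below combines the examples given above; the threshold, the `send` callable, and the assignment of the P key to the ambient command are assumptions:

```python
COMMANDS = {
    "G": "C1_SWITCH_ON",
    "R": "C2_SWITCH_OFF",
    "B": "C3_AMPLITUDE_INCREASE",
    "Y": "C4_AMPLITUDE_DECREASE",
    "P": "C5_AMBIENT_BACKGROUND",
}

def dispatch(key_readings, send, threshold=0.3):
    """Send the command of each function key whose background-subtracted signal
    exceeds the detection threshold, i.e. a hand is present in its control spot."""
    for key, value in key_readings.items():
        if value > threshold and key in COMMANDS:
            send(COMMANDS[key])
```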
FIG. 27 is a view of a non-contact switch 2700 including a camera or cameras 2722, in accordance with some embodiments. In some embodiments, the camera 2722 is a visible band camera. In some embodiments, the camera 2722 is a near infrared (NIR) band camera. In some embodiments, the camera 2722 is a shortwave infrared (SWIR) band camera. Other spectral bands can equally apply. In FIG. 27, the camera or cameras 2722 are placed next to the spot generator. In some embodiments, the cameras 2722 can be placed in other locations.
The non-contact switch 2700 can include a non-contact sensor 2710 or the like, which can be the same as or similar to other embodiments herein. For example, the non-contact switch 2700 can include imaging optics 2702, scan mirror 2712, detector 2714, beamsplitter 2720, and control spot generator 2724, which can be the same as or similar to those described in other embodiments herein.
The camera 2722 receives hand gesture information that can be used to enhance a control function of the non-contact switch 2700. Accordingly, multiple hand gestures can be processed for a given function key, as illustrated by the examples of FIGS. 30, 31, 32, 33, 35, and 36. For example, hand gestures can be used to control the direction of motion of a drone. In some embodiments, one can control multiple devices by using a combination of function keys and hand gestures. For example, the number of fingers can be assigned to select each of a plurality of lamps, for example, one finger for a first lamp, two fingers for a second lamp, three fingers for a third lamp, and so on. A plurality of function keys 2709 and corresponding control spots 2708 can be provided. To switch from the current device to a different device, two hand gestures at two different control spots at the same time are needed in some embodiments. FIG. 28 provides more details.
FIG. 28 is an illustration of multiple devices 2802a-c (generally, 2802) controlled by a non-contact switch 2800. The non-contact switch 2800 can include a non-contact sensor 2810 or the like, which can be the same as or similar to other embodiments herein. For example, the non-contact switch 2800 can include a non-contact sensor 2810, a camera 2814 or multiple cameras, a beamsplitter 2820, and a control spot generator 2824, and/or other elements that are the same as or similar to those described in other embodiments herein. The cameras 2814 can be mounted next to the control spot generator 2824 or elsewhere. An illumination source 2822 can provide light for the cameras.
A hand gesture can be positioned at different control spots to control the current device. To switch to another device, two hands at two different control spots at the same time are needed. For example, when both hands are inserted into the on and off control spots C1, C2 at the same time, a device changing signal, or selection signal, is triggered in the processor 2812. The hand gesture in C2 contains the device number; the number of fingers can be used as the device number, for example. A two-finger hand gesture in C2 signals switching from the current device to device 2. After the processor 2812 analyzes the hand gesture image from the camera 2814, a device number is obtained. The processor 2812 then communicates directly with the selected device 2802.
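The device-selection rule can be sketched as below, where `count_fingers` is a placeholder for the camera-image analysis and the detection threshold is an assumption:

```python
def select_device(spot_signals, c2_image, count_fingers, threshold=0.3):
    """Return a device number when hands occupy control spots C1 and C2 at the
    same time; otherwise return None and keep controlling the current device."""
    if spot_signals["C1"] > threshold and spot_signals["C2"] > threshold:
        return count_fingers(c2_image)  # e.g. a two-finger gesture selects device 2
    return None
```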
In some embodiments, the non-contact switch 2800 can be in a tracking mode. This is usually for slow-moving control, such as non-contact mouse motion for a computer. One control spot C1 can be used to activate the tracking mode using a special hand gesture in some embodiments. During tracking, the control spots C2-C5 are inactive, and signals from C2-C5 will not be processed. The camera 2814 takes over; the processor now only processes camera images and the signal from the control spot C1. A stop hand gesture, for example at a predetermined control spot C1, can be used to end the tracking.
FIG. 29 is a flowchart illustrating the relationship between elements of a non-contact switch in operation, in accordance with some embodiments.
The non-contact sensor 2910 can scan the control spots 2909 for a hand heat signature or color information. During operation, a hand gesture is first inserted into a control spot. The thermal signal from the hand is captured by the non-contact sensor 2910. A camera 2914 can capture an image of the region of the control spot 2909. More specifically, the camera captures the hand gesture, the control spot, and the background, but only the hand gesture is extracted for processing. If the heat signature or color of the hand gesture 2911 is detected in one control spot 2909, then the operation will control the current device. The camera or cameras 2914 will capture images of the hand gesture. The heat signature or color signal from the control spot 2909 and the hand gesture information from the camera 2914 are used by the processor 2912 to generate a command. If a tracking mode is detected, then all but one of the control spots 2909 become inactive; the processor simply stops processing the control spot information. Hand tracking can continue until a stop hand gesture is given. When tracking ends, the control spots 2909 become active again. The command is sent to the device 2902 under control. When a hand gesture is detected in another control spot 2909, the same process is repeated. If a heat signature or color signal is detected in two control spots 2909 simultaneously, the on and off control spots for example, a device change mode is activated in the processor 2912. The number of fingers in the image of the hand gesture 2911 determines which device 2902 the operation will address. The processor 2912 then communicates only with the selected device 2902, and the switching or controlling only occurs at the selected device 2902.
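The flow just described can be summarized as a small processing loop. Everything below (names, gesture labels, the gesture-to-device mapping, the threshold) is an illustrative assumption rather than the patent's implementation:

```python
GESTURE_TO_DEVICE = {"ONE_FINGER": 1, "TWO_FINGERS": 2, "THREE_FINGERS": 3}

def control_loop(read_spots, read_gesture, send, threshold=0.3):
    """One pass through the FIG. 29 logic per loop iteration (illustrative only)."""
    tracking = False
    device = 1
    while True:
        spots = read_spots()      # background-subtracted control spot signals
        gesture = read_gesture()  # gesture label extracted from the camera image
        if tracking:
            # Only C1 is processed while tracking; a stop gesture ends tracking.
            if spots["C1"] > threshold and gesture == "STOP":
                tracking = False
            continue
        active = sorted(k for k, v in spots.items() if v > threshold)
        if {"C1", "C2"} <= set(active) and gesture in GESTURE_TO_DEVICE:
            device = GESTURE_TO_DEVICE[gesture]   # device change mode
        elif active == ["C1"] and gesture == "TRACK":
            tracking = True                       # enter tracking mode
        elif len(active) == 1:
            send(device, active[0], gesture)      # command for the current device
```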
In some embodiments, a non-contact switch can be mounted on any surface, such as the ceiling, a wall, or the floor. In some embodiments, a non-contact switch can be mounted on the forehead, for example by a headband as shown in FIG. 15. Accordingly, a video game player can use the non-contact switch as a video game controller in some embodiments. The control spots can serve as "buttons" of a video game controller. A player can play the video game by inserting his/her hand gestures into the control spots. In a car racing video game, the non-contact game controller, worn on the user's head, can control a remote control car. Because of the simplicity of the process, the processing time is fast. This makes fast-action electronic games or devices, such as the remote control car described herein, enjoyable to play. In some embodiments, an inexpensive reflection panel can be used to enhance the control spot visibility with respect to such games. This may be helpful for beginner game players. As the player plays more, he/she will remember the positions of the control spots.
FIG. 30 is a view of control spot functions of a remote controller 3000 for a remote control vehicle, in accordance with some embodiments. The remote controller 3000 includes elements of a non-contact switch in other embodiments herein. Details thereof are therefore not repeated for brevity.
In an embodiment, a remote control vehicle is controlled by two non-contact controllers, in particular, a left controller 3010A for direction control and a right controller 3010B for speed control.
The control spots from the non-contact controllers 3010A, 3010B are projected on a surface. The control spots can be distinguished from each other by color, size, or other characteristic. The four control spots displayed by the left controller 3010A are provided for direction control, while the four control spots displayed by the right controller 3010B are for speed control. In some embodiments, a fist hand gesture corresponds to a normal operation command, the left hand gesture with an extended thumb is for fast turning, and the right hand gesture with an extended thumb is for fast acceleration or hard braking, depending on which control spot the right hand is in. A predefined key can be established for associating each control spot with a particular hand gesture and corresponding command. For example, control spot A corresponds to an accelerate command, control spot B corresponds to a backup command, control spot C corresponds to a cruising command, control spot D corresponds to a decelerating command, control spot L corresponds to a left command, control spot P corresponds to a park command, and control spot R corresponds to a right command. Accordingly, if a user wants to remotely control a vehicle, or a vehicle displayed in a video game, and desires for the vehicle to accelerate in a straight line, the user can position a left fist on control spot S and a right fist with extended thumb on control spot A for fast acceleration.
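A sketch of such a predefined key, combining the control spot with the fist/thumb gesture. The spot letters follow the example above; the gesture labels and the pairing of a thumb on spot D with hard braking are assumptions:

```python
# (control spot, hand gesture) -> vehicle command, following the example key above.
VEHICLE_KEY = {
    ("A", "FIST"): "accelerate",       ("A", "THUMB"): "fast_accelerate",
    ("B", "FIST"): "backup",           ("C", "FIST"): "cruise",
    ("D", "FIST"): "decelerate",       ("D", "THUMB"): "hard_brake",
    ("L", "FIST"): "turn_left",        ("L", "THUMB"): "fast_turn_left",
    ("R", "FIST"): "turn_right",       ("R", "THUMB"): "fast_turn_right",
    ("P", "FIST"): "park",
}

def vehicle_command(spot, gesture):
    """Look up the command for a detected control spot and hand gesture."""
    return VEHICLE_KEY.get((spot, gesture), "hold_current_state")
```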
FIG. 31 is a view of control spot functions for a game controller 3100 for a video game console, in accordance with some embodiments. The controller 3100 includes elements of a non-contact switch in other embodiments herein. Details thereof are therefore not repeated for brevity.
In some embodiments, one or multiple non-contact controller units similar to or the same as the non-contact switches herein can be used as a video game controller 3110A, 3110B (generally, 3110), for example to control a martial arts video game such as the Kickstarter™ program, but not limited thereto. Other video games or electronic devices can equally apply.
The left controller 3110A controls movements of left limbs of a martial arts character displayed in the video game. The right controller 3110B controls movements of right limbs of the martial arts character.
A plurality of control spots from the non-contact controllers 3110A, 3110B are projected on a surface. The control spots can be distinguished from each other by color, size, or other characteristic. The four control spots displayed by the left controller 3110A are provided for left limb control while the four control spots displayed by the right controller 3110B are for right limb control.
A predefined key can be established for associating each control spot with a particular hand gesture and corresponding command. For example, control spot P corresponds to a punch command, control spot K corresponds to a kick command, control spot B corresponds to a block command, and control spot G corresponds to a grab command.
Hand gestures can be used as additional features in some embodiments. For example, a hand gesture with an extended thumb at the K control spot generates a signal instructing a sidekick to be executed by the video game character.
In some embodiments, one or more non-contact switch/controllers can be used to control a mechanical apparatus. For example, as illustrated in FIG. 32, the control spots of two non-contact controllers 3210A, 3210B (generally, 3210) of a controller system 3200 can be used to control an excavator or related apparatus. The two controllers 3210A, 3210B can replace the two joysticks conventionally located at an excavator. The left controller 3210A, corresponding to a left joystick, is for controlling the forearm and bucket of the excavator. The right controller 3210B, corresponding to a right joystick, is for controlling the back arm and housing movements of the excavator.
A plurality of control spots from the non-contact controllers 3210A, 3210B are projected on a surface. The control spots can be distinguished from each other by color, size, or other characteristic.
A predefined key can be established for associating each control spot with a particular hand gesture and corresponding command. For example, control spot A1 of the right controller 3210B corresponds to a command whereby the excavator arm moves up. Control spot A2 corresponds to a command whereby the excavator arm moves down. Control spot A3 corresponds to a command whereby the excavator arm rotates. When a hand gesture having a thumb extending in a left direction is placed at control spot A3, the excavator arm rotates to the left. When a hand gesture having a thumb extending in a right direction is placed at control spot A3, the excavator arm rotates to the right. When a hand gesture in the form of a fist is placed at control spot A3, the excavator arm stops rotating. Control spot H corresponds to a rotation of the excavator housing. When a hand gesture having a thumb extending in a left direction is placed at control spot H, the excavator housing rotates to the left. When a hand gesture having a thumb extending in a right direction is placed at control spot H, the excavator housing rotates to the right. When a hand gesture in the form of a fist is placed at control spot H, the excavator housing stops rotating.
In other examples, control spot B1 of the left controller 3210A corresponds to a command whereby the excavator bucket makes a scoop motion. Control spot B2 of the left controller 3210A corresponds to a command whereby the excavator bucket makes a dump motion. Control spot F1 of the left controller 3210A corresponds to a command whereby the excavator forearm is extended. Control spot F2 of the left controller 3210A corresponds to a command whereby the excavator forearm is curled or retracted.
Hand gestures can be used to enhance the controlling functions. For example, a thumb pointing left on control spot A3 rotates the back arm to the left, while a thumb pointing right on control spot A3 rotates the back arm to the right. A closed fist hand gesture can correspond to an instruction to remain at the current status.
FIG. 33A is a side view of a non-contact controller system 3300 for controlling a drone 3350, in accordance with some embodiments. FIG. 33B is a front view of the non-contact controller system 3300 of FIG. 33A. FIG. 33C is a view of a drone 3350 controlled by the non-contact controller system 3300 of FIGS. 33A and 33B.
In some embodiments, the system 3300 comprises two non-contact controllers 3304A, 3304B (generally, 3304) and a head mounted display 3302, such as a Google Glass wearable computer. A transceiver 3306 can also be worn on the head or any part of the user's body. Communication between the non-contact controller 3300 and the drone 3350 is via the transceiver 3306 of the controller 3300 and a transceiver on the drone 3350. The non-contact controllers 3304 are used to control the drone 3350, while the head mounted display 3302 is used to display flight information, onboard camera images, and/or other information related to an operation of the drone 3350.
In some embodiments, the two non-contact controllers 3304 are mounted on the user's forehead above the display device 3302. Two non-contact controllers allow the user to have enough function keys for controlling certain devices. In this manner, control spot beams 3305 can be projected from the non-contact controllers 3304 in front of the user. The user can position hand gestures along a path of the control spot beams. The control spots 3305 can be formed on and/or about the hand gestures for controlling the drone 3350. The control spots 3305 can illuminate a surface, and a hand gesture can be made over the surface but at the control spot for controlling the drone 3350.
FIG. 34 is another illustration of a non-contact controller system 3400 for controlling a drone, in accordance with some embodiments. The remote controller 3400 includes elements of a non-contact switch in other embodiments herein such as the controllers 3304 of FIG. 33. Details thereof are therefore not repeated for brevity. The controllers 3304A, 3304B can be left and right controllers, respectively. Only the function keys illuminated by control spot light in the two controllers are shown. The physical controllers can be similar to those described with reference to FIG. 27. Each controller 3304 can provide different control spot functions with respect to controlling a drone or other remote device.
In some embodiments, the controller system 3400 performs two modes of operation: manual control and automatic control, which can be established by a hand gesture, described below.
A plurality of control spots from the non-contact controllers 3304A, 3304B are projected on a surface. The control spots can be distinguished from each other by color, size, or other characteristic.
A predefined key can be established for associating each control spot with a particular hand gesture and corresponding command. For example, control spot Y of the right controller 3304B corresponds to a command that instructs the drone 3350 to move in a yaw direction. Control spot AD corresponds to a command that instructs the drone 3350 to ascend or descend. Control spot F corresponds to a command that instructs the drone 3350 to move forward. Control spot H corresponds to a command that instructs the drone 3350 to hover. When a hand gesture having a thumb extending in a left direction is placed at control spot Y, the drone 3350 is instructed to yaw to the left. When a hand gesture having a thumb extending in a right direction is placed at control spot Y, the drone 3350 is instructed to yaw to the right. When a hand gesture having a thumb extending in a left direction is placed at control spot AD, the drone 3350 is instructed to ascend. When a hand gesture having a thumb extending in a right direction is placed at control spot AD, the drone 3350 is instructed to descend.
Referring to the controller 3304A, control spot P corresponds to a command that instructs the drone 3350 to pitch. Control spot LT corresponds to a command that instructs the drone 3350 to land or take off. Control spot R corresponds to a command that instructs the drone 3350 to roll. Control spot T corresponds to a command that instructs the drone 3350 to toggle between automatic and manual modes and/or to change a camera configuration. When a hand gesture having a thumb extending in a right direction is placed at control spot P, the drone 3350 is instructed to pitch up. When a hand gesture having a thumb extending in a left direction is placed at control spot P, the drone 3350 is instructed to pitch down. When a hand gesture having a thumb extending in a right direction is placed at control spot LT, the drone 3350 is instructed to take off. When a hand gesture having a thumb extending in a left direction is placed at control spot LT, the drone 3350 is instructed to land. A closed fist hand gesture can correspond to an instruction to remain at the current status. When a hand gesture having a thumb extending in a left direction is placed at control spot R, the drone 3350 is instructed to roll in the positive direction. When a hand gesture having a thumb extending in a right direction is placed at control spot R, the drone 3350 is instructed to roll in the negative direction. The direction of roll obeys the right hand rule.
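A sketch of this drone key, where the thumb direction selects the sign of the motion. The spot labels and sign conventions follow the description above; the gesture labels, the use of an open hand for the F and H spots, and the function name are assumptions:

```python
# (control spot, gesture) -> drone command for the two controllers of FIG. 34.
DRONE_KEY = {
    ("Y", "THUMB_LEFT"): "yaw_left",        ("Y", "THUMB_RIGHT"): "yaw_right",
    ("AD", "THUMB_LEFT"): "ascend",         ("AD", "THUMB_RIGHT"): "descend",
    ("F", "HAND"): "forward",               ("H", "HAND"): "hover",
    ("P", "THUMB_RIGHT"): "pitch_up",       ("P", "THUMB_LEFT"): "pitch_down",
    ("LT", "THUMB_RIGHT"): "take_off",      ("LT", "THUMB_LEFT"): "land",
    ("R", "THUMB_LEFT"): "roll_positive",   ("R", "THUMB_RIGHT"): "roll_negative",
    ("T", "THUMB_RIGHT"): "manual_mode",    ("T", "THUMB_LEFT"): "automatic_mode",
}

def drone_command(spot, gesture):
    """A closed fist in a control spot means 'remain at current status'."""
    if gesture == "FIST":
        return "hold_current_status"
    return DRONE_KEY.get((spot, gesture), "hold_current_status")
```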
To toggle between the automatic mode and the manual mode, the user can insert his/her hand into the T control spot generated by controller 3304A, in some embodiments. A hand gesture with a thumb pointing right is for a manual mode. A hand gesture with a thumb pointing left is for an automatic mode.
FIG. 35A is an image generated from an onboard camera of the drone 3350 of FIG. 33C when the controller system is configured for a manual mode of operation, in accordance with some embodiments. FIG. 35B is an image generated from the onboard camera of the drone 3350 of FIG. 33C when the controller system is configured for an automatic mode of operation, in accordance with some embodiments. FIG. 35C is a view of the flight track of the drone 3350 of FIG. 33C.
As described above, the display device 3302 can be a Google Glass screen or the like, and the drone can include an onboard camera. The image 3500A captured by the onboard camera of the drone, for example as described herein, is shown at the top together with drone information such as altitude, heading, pitch/yaw, and image mean as a percentage of the camera's full dynamic range. Other information regarding the camera, such as roll, pitch/yaw, and frame rate, can also be displayed at the bottom of the display.
With regard to the image 3500B of FIG. 35B, as in the manual mode, the upper left corner includes the onboard camera image with drone information. Other information regarding the camera, such as roll, pitch/yaw, and frame rate, can also be displayed at the bottom of the display. As shown in FIG. 35C, other information can include a flight track and the current position of the drone overlaying a map of the flight area. The flight track and altitude are pre-planned using GPS coordinates. The drone 3350 will fly according to the planned track at the prescribed altitude and return to the ground automatically.
In a manual mode, for example as illustrated in FIG. 34, the user can control the pitch, yaw (heading), roll, forward, and hovering motions of the drone 3350 by putting hand gestures in the appropriate control spots in some embodiments. In some embodiments, multiple hand gestures can be employed, for example, a fist, a hand with the thumb pointing left, and a hand with the thumb pointing right. The left-pointing thumb gesture can be changed to a right-pointing gesture by rotating the arm, or vice-versa. The direction of the thumb is the direction of the operation. For example, if the user wants the drone 3350 to turn left from the current direction, he/she will put one hand in control spot Y with the thumb pointing left for turning left and another hand on control spot R with the thumb pointing right for negative roll. The sign of the roll obeys the right hand rule; in order to turn smoothly, the drone 3350 must roll slightly. The limited number of simple hand gestures keeps the algorithm's processing time short. In some embodiments, more complex hand gestures can be employed.
FIG. 36 is an illustration of a controller system 3600 controlling an onboard camera, in accordance with some embodiments. The camera can be on the drone 3350 described with respect to FIG. 33C. The controller 3600 includes elements of a non-contact switch in other embodiments herein. Details thereof are therefore not repeated for brevity. The controllers 3604A, 3604B (generally, 3604) can be left and right controllers, respectively. Each controller 3604 can provide different control spot functions with respect to controlling an onboard camera configuration.
A plurality of control spots from the non-contact controllers 3604A, 3604B are projected on a surface. The control spots can be distinguished from each other by color, size, or other characteristic.
A predefined key can be established for associating each control spot with a particular hand gesture and corresponding command. For example, control spot Y of the right controller 3604B corresponds to a command that instructs the onboard camera to move in a yaw direction. Control spot P corresponds to a command that instructs the onboard camera to pitch. Control spot R corresponds to a command that instructs the onboard camera to roll. Control spot FR corresponds to a command that changes a frame rate of the onboard camera. When a hand gesture having a thumb extending in a left direction is placed at control spot Y, the camera is instructed to yaw to the left. When a hand gesture having a thumb extending in a right direction is placed at control spot Y, the camera is instructed to yaw to the right. When a hand gesture having a thumb extending in a left direction is placed at control spot R, the camera is instructed to roll in the positive direction. When a hand gesture having a thumb extending in a right direction is placed at control spot R, the camera is instructed to roll in the negative direction. When a hand gesture having a thumb extending in a left direction is placed at control spot FR, the frame rate increases. When a hand gesture having a thumb extending in a right direction is placed at control spot FR, the frame rate decreases.
Referring to the controller 3604A, control spot I corresponds to a command that instructs the camera to be inactive. Control spot LT corresponds to a command that instructs the drone 3350 to land or take off. Control spot T corresponds to a command that toggles the camera between automatic and manual modes and/or changes a camera configuration. When a hand gesture with a left pointing thumb is positioned at control spot T, the onboard camera can be placed in a manual mode. When a hand gesture with a right pointing thumb is positioned at control spot T, the onboard camera can be placed in an automatic mode. The function keys on the left controller 3604A can become inactive when a fist is inserted into control spot T.
FIG. 37 is an illustration of a diffuser 3700 scattering light, in accordance with some embodiments.
While non-contact switches or controllers in accordance with embodiments are often implemented for remote control applications, a diffuser-based switch in accordance with other embodiments is implemented for close proximity control applications. The diffuser 3700 scatters light into different directions. The scattering can occur in the diffuser's volume or on its surfaces, and it can occur in transmission or reflection. Scattering not only redirects light out of the original direction of an incident light ray; it can also scatter light into that original direction from other light rays. As shown in FIG. 37, the unscattered transmitted light rays of incident light rays I0 and I1 are I0t and I1t, respectively. The scattered transmitted light rays of incident light rays I0 and I1 are I0s and I1s, respectively. As illustrated in FIG. 37, scattered light from I1 is scattered into the I0t direction, and scattered light from I0 is scattered into the I1t direction. The net result is a smearing of target and background. The smearing worsens as the target moves away from the diffuser 3700.
In FIGS. 38A-D, a ground glass diffuser is positioned between a camera and an object, namely a U.S. map, so that the field of view of the camera is directed at a portion of the map via the diffuser. The map is about 2 feet away from the diffuser. The region of the map behind the diffuser shows a uniform background with no detail due to the smearing of the diffuser. In the region unblocked by the diffuser, map details are shown, as illustrated in image 3800A of FIG. 38A. In the image 3800B shown in FIG. 38B, a hand touches the diffuser, whereby the image of the hand is clearly visible although a bit blurry. In the image 3800C shown in FIG. 38C, the hand is near but does not touch the diffuser. Here, the hand is still visible but more distorted than in the image 3800B. In the image 3800D shown in FIG. 38D, the hand moves further away from the diffuser, whereby the hand has lost detail. How quickly the diffuser smears out detail also depends on the density of the diffuser grid: smearing is faster for a denser grid and slower for a less dense grid.
Using the smearing effect of the diffuser, a switch is constructed in some embodiments. This effect smears the details in the background while keeping the details near the diffuser. FIG. 39 shows a layout for such a switch 3900. The switch 3900 comprises a diffuser 3904, an imaging sensor 3906, a light source 3902, a processor 3912, and the device under control 3905. In some embodiments, the light source 3902 can illuminate a target from either side of the diffuser 3904. For side illumination, a light guide is usually used in conjunction with the diffuser 3904. When a hand touches or is near the diffuser 3904, the image only shows the hand; the background is blurred out. The processor 3912 can process the hand gesture image and send a command to the device under control 3905 based on the hand gesture information, for example as described in other embodiments herein. In some embodiments, two imaging sensors 3906 can be used.
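Because only objects at or near the diffuser retain sharp detail, one simple and assumed way for the processor to detect a hand is to threshold local image contrast; the smeared background contributes little gradient energy. A minimal NumPy sketch, with an arbitrary threshold:

```python
import numpy as np

def hand_present(image, contrast_threshold=10.0):
    """Return True if an object at or near the diffuser (sharp detail) is present.

    The distant background is smeared into a nearly uniform field, so its local
    gradient energy is low; a hand at the diffuser restores strong gradients.
    """
    gy, gx = np.gradient(image.astype(float))
    mean_gradient = np.mean(np.hypot(gx, gy))
    return mean_gradient > contrast_threshold
```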
FIG. 40A is a top view of a switch 4000 having a diffuser, in accordance with some embodiments. FIG. 40B is a side view of the switch 4000 of FIG. 40A.
A plurality of control spots are formed by a diffuser 4004 in some embodiments. When a user's hand is placed on the control spots, this indicates that the user intends to control the device in communication with the switch 4000. In some embodiments, color control spots are placed on the diffuser 4004. The control spots can be generated by color filters 4006 coated or mounted on the surfaces of the diffuser 4004 in some embodiments. In other embodiments, the control spots can be generated by shining color lights on the control spots. The control spots correspond to function key positions, for example similar to other embodiments herein.
The switch 4000 can be useful for machine operation and video gaming in some embodiments. For example, the non-contact controllers 3200 in FIG. 32 can be replaced by two diffuser-based controllers: one positioned at the left side of a driver's seat, the other at the right side. The function keys of the switch 4000 can be the same as those of other switches or controllers described herein, except that the control spots are located on the surfaces of the diffuser 4004. The same hand gestures can be used, except that imaging sensors 4010 are implemented instead of cameras. In this example, the joysticks described in FIG. 32 can be replaced by diffuser-based switch/controllers in some embodiments. Joystick functions can be assigned to control spots and hand gestures. FIG. 41 is an illustration of an operator using the diffuser-based controller 4000 of FIG. 40, in accordance with some embodiments. In replacing the controllers 3200 of FIG. 32, two diffuser-based controllers 4000 can be provided, one for each hand. A hand gesture 4012 is positioned at the diffuser 4004, under which one or more control spots are provided corresponding to various function key positions.
In some embodiments, one of the control spots can be reserved for a tracking mode. When a user's hand is placed over this control spot, all other control spots become inactive. A sensor is provided for tracking the user's hand motion. Tracking stops when the user inserts a particular hand gesture over the same control spot. Hand tracking can be used for beam steering control in some embodiments. It can also be employed as a mouse for a computer in some embodiments.
A combination lock uses a combination of integers to lock and unlock. Hand gestures can be used to represent these integers, as illustrated in FIG. 42, in some embodiments. The number panel of a combination lock 4200 is also shown in FIG. 42. In some embodiments, hand gestures can be used to lock and unlock a door or any device.
FIG. 43 is an illustration of a diffuser-based hand gesture lock mechanism 4300, in accordance with some embodiments.
In some embodiments, the hand gesture lock mechanism 4300 can comprise a light source 4302, an imaging sensor 4304, a diffuser 4306, and a processor 4312, which are the same as or similar to those of other switches and controllers herein. A description thereof is not repeated for brevity. An electromechanical locking/unlocking mechanism 4308 is also provided. For example, if the combination is 5912, then a hand gesture of 5 fingers is placed near or on the diffuser first, followed by a "pinkyless" hand gesture, followed by an index finger hand gesture; an index and middle fingers hand gesture is the last hand gesture. The processor 4312 can process the images and generate a command to the electromechanical locking/unlocking mechanism 4308 to unlock the door. The background shown in FIG. 43 is not seen by the imaging sensor because it is smeared out due to its distance from the diffuser.
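A sketch of the lock logic, assuming the processor has already reduced each gesture image to a digit (for example, a finger count); the function name and the digit encoding of the "pinkyless" gesture are illustrative:

```python
def try_unlock(gesture_digits, combination=(5, 9, 1, 2)):
    """Compare the sequence of recognized gesture digits against the stored
    combination and return True if the lock should be opened."""
    return tuple(gesture_digits) == tuple(combination)

# Example: 5 fingers, then "pinkyless" (9), index finger (1), index + middle (2).
if try_unlock([5, 9, 1, 2]):
    print("unlock command sent to the electromechanical mechanism 4308")
```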
While the present inventive concepts have been particularly shown and described above with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art, that various changes in form and detail can be made without departing from the spirit and scope of the present inventive concepts.

Claims (21)

What is claimed is:
1. A non-contact sensing device, comprising:
a sensor comprising a plurality of function key sensors having a plurality of non-overlapping field of views (FOVs), each FOV corresponding to a different function key;
a function key sensor of the plurality of function key sensors constructed and arranged to detect a presence of a body part in its field of view, and to generate a function key control signal in response to detecting the body part in the field of view, wherein the body part of a user activates a function key to which the FOV of the function key sensor corresponds without the user physically touching the function key;
a processor that receives and processes the function key control signal and outputs a command according to an assigned task of the function key sensor to a remote apparatus under control of the non-contact sensing device in response to the processed function key control signal, wherein the processor further stores data of the function key for identifying the function key and associating the function key with the assigned task of the function key sensor; and
a control spot generator that illuminates the non-overlapping FOVs corresponding to the function keys to visually distinguish the different function keys.
2. The non-contact sensing device of claim 1, further comprising one or more cameras that recognize a hand gesture, wherein the command is generated from a combination of a function key control signal corresponding to the function key sensor and the recognized hand gesture.
3. The non-contact sensing device of claim 1, wherein the sensor is a staring sensor comprising a detector array that includes a combination of the function key sensors and non-function key sensors, the staring sensor generating all image pixels of the detector array simultaneously.
4. The non-contact sensing device of claim 1, wherein the sensor is a scanning sensor that scans a portion of a field of view at a time.
5. The non-contact sensing device of claim 1, wherein the sensor includes a scan mirror that scans all function key sensors and only the non-function key sensors in the path of the scan to shorten the data acquisition time.
6. The non-contact sensing device of claim 1, wherein the sensor is constructed and arranged as an emissive mode sensor comprising a thermal sensor that collects thermal radiation emitted from the hand or other body part.
7. The non-contact sensing device of claim 1, wherein the sensor is constructed and arranged as a reflective mode sensor comprising a color sensor that collects color light reflected from a hand or other body part.
8. The non-contact sensing device of claim 1, wherein the control spot generator generates a control spot that is aligned with the field of view of the function key sensor, and wherein the function key sensor detects a hand or body part within the control spot.
9. The non-contact sensing device of claim 1, further comprising a beamsplitter positioned between the sensor and the control spot generator, wherein light beams outputted from the control spot generator directed at the beamsplitter coincide with the field of views of the function key sensors.
10. The non-contact sensing device of claim 1, wherein the function key corresponding to the function key sensor distinguished from other function key sensors is identified by positioning a ground truth target at the control spot among a plurality of control spots and collecting by the non-contact sensing device an image of the ground truth target, wherein a pixel or group of pixels at the sensor having a highest detector output is identified as the current function key.
11. The non-contact sensing device of claim 1, wherein the control spot generator comprises a white light emitting diode (LED), a control spot generator plate having a plurality of color filters, and a lens, wherein color light is generated from the color filters when the white LED illuminates, and wherein a plurality of control spots are generated.
12. The non-contact sensing device of claim 11, wherein each color control spot is aligned with a field of view of a function key sensor.
13. The non-contact sensing device of claim 1, wherein the control spot generator comprises a plurality of color LEDs, light pipes, a light pipe mounting plate, and a lens, wherein the color LEDs are placed at the input ends of light pipes and the output ends of light pipes are placed at the focal plane of the lens, thereby generating control spots of different colors that each illuminate a field of view of a different function key sensor, and wherein the light pipe plate holds the light pipes together at the focal plane of the lens.
14. The non-contact sensing device of claim 1, wherein the remote apparatus comprises a plurality of devices, and wherein the processor generates a device number for a device of the plurality of devices, each device number corresponding to a different hand gesture at a designated control spot, thereby allowing a user to choose what device to operate.
15. The non-contact sensing device of claim 1, wherein a hand gesture at a chosen control spot corresponding to a function key triggers a tracking mode in which a camera tracks hand gesture motion commands for controlling one or multiple apparatuses, wherein in the tracking mode all function keys except the chosen one become inactive, and the processor will not process signals from these function keys, and wherein when exiting the tracking mode, a stop hand gesture can be inserted into the chosen control spot, wherein the other function keys become active again.
16. The non-contact sensing device of claim 1, wherein the sensor is a color sensor comprising color filters on a rotating wheel in front of the sensor, wherein an image of a scene is taken for each color filter, and wherein function key pixels of color images are processed by the processor to determine if a skin color spectrum is detected.
17. The non-contact sensing device of claim 1, wherein the sensor is a color sensor comprising a color camera, and wherein function key pixels of color images are processed to determine if a skin color spectrum is detected.
18. The non-contact sensing device of claim 1, wherein the processor distinguishes the function key sensor from other function key sensors of the plurality of sensors from the function key positions stored in the processor during a function key identification calibration process.
19. The non-contact sensing device of claim 1 further comprising a head mounted display collocated with the sensor, the display providing visual information regarding an operation of the remote apparatus.
20. The non-contact sensing device of claim 19, wherein the remote apparatus is a drone, and wherein the non-contact sensing device is constructed and arranged to control the operation of the drone, and wherein the head mounted display displays a combination of flight information, onboard camera images, and other information regarding the operation of the drone.
21. The non-contact sensing device of claim 1, wherein a camera captures image data corresponding to the hand gesture and the processor converts the captured image data into a cursor command signal that controls a cursor at a display.
US14/318,019 2013-06-28 2014-06-27 Systems and methods for controlling device operation according to hand gestures Active US9423879B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/318,019 US9423879B2 (en) 2013-06-28 2014-06-27 Systems and methods for controlling device operation according to hand gestures
US15/206,355 US20170083103A1 (en) 2013-06-28 2016-07-11 Systems and methods for controlling device operation according to hand gestures

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361840791P 2013-06-28 2013-06-28
US14/318,019 US9423879B2 (en) 2013-06-28 2014-06-27 Systems and methods for controlling device operation according to hand gestures

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/206,355 Continuation US20170083103A1 (en) 2013-06-28 2016-07-11 Systems and methods for controlling device operation according to hand gestures

Publications (2)

Publication Number Publication Date
US20150002391A1 US20150002391A1 (en) 2015-01-01
US9423879B2 true US9423879B2 (en) 2016-08-23

Family

ID=52115073

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/318,019 Active US9423879B2 (en) 2013-06-28 2014-06-27 Systems and methods for controlling device operation according to hand gestures
US15/206,355 Abandoned US20170083103A1 (en) 2013-06-28 2016-07-11 Systems and methods for controlling device operation according to hand gestures

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/206,355 Abandoned US20170083103A1 (en) 2013-06-28 2016-07-11 Systems and methods for controlling device operation according to hand gestures

Country Status (4)

Country Link
US (2) US9423879B2 (en)
EP (1) EP3014407A4 (en)
CN (1) CN105518576B (en)
WO (1) WO2014210502A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160017656A1 (en) * 2013-03-15 2016-01-21 Springs Window Fashions, Llc Window covering motorized lift and control operating system
US20160368382A1 (en) * 2013-06-29 2016-12-22 Audi Ag Motor vehicle control interface with gesture recognition
US9994233B2 (en) * 2014-09-30 2018-06-12 Continental Automotive Systems, Inc. Hands accelerating control system
WO2021038109A1 (en) 2019-08-30 2021-03-04 Metralabs Gmbh Neue Technologien Und Systeme System for capturing sequences of movements and/or vital parameters of a person
US11107236B2 (en) 2019-04-22 2021-08-31 Dag Michael Peter Hansson Projected augmented reality interface with pose tracking for directing manual processes
US11780080B2 (en) 2020-04-27 2023-10-10 Scalable Robotics Inc. Robot teaching with scans and geometries

Families Citing this family (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9587804B2 (en) 2012-05-07 2017-03-07 Chia Ming Chen Light control systems and methods
CN105518576B (en) 2013-06-28 2019-04-16 陈家铭 It is operated according to the control device of gesture
US9717118B2 (en) 2013-07-16 2017-07-25 Chia Ming Chen Light control systems and methods
WO2015168218A2 (en) 2014-04-29 2015-11-05 Chia Ming Chen Light control systems and methods
US9575560B2 (en) * 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9696813B2 (en) * 2015-05-27 2017-07-04 Hsien-Hsiang Chiu Gesture interface robot
US10282057B1 (en) * 2014-07-29 2019-05-07 Google Llc Image editing on a wearable device
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
JP6278565B2 (en) * 2014-08-11 2018-02-14 本田技研工業株式会社 Self-driving vehicle control device
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
WO2016065519A1 (en) * 2014-10-27 2016-05-06 SZ DJI Technology Co., Ltd. Uav flight display
CN113628500A (en) 2014-09-30 2021-11-09 深圳市大疆创新科技有限公司 System and method for supporting analog mobility
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
US11119565B2 (en) 2015-01-19 2021-09-14 Samsung Electronics Company, Ltd. Optical detection and analysis of bone
CN104808675B (en) * 2015-03-03 2018-05-04 广州亿航智能技术有限公司 Body-sensing flight control system and terminal device based on intelligent terminal
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
KR102002112B1 (en) 2015-04-30 2019-07-19 구글 엘엘씨 RF-based micro-motion tracking for gesture tracking and recognition
WO2016176574A1 (en) 2015-04-30 2016-11-03 Google Inc. Wide-field radar-based gesture recognition
KR102327044B1 (en) 2015-04-30 2021-11-15 구글 엘엘씨 Type-agnostic rf signal representations
US20160317909A1 (en) * 2015-04-30 2016-11-03 Barry Berman Gesture and audio control of a pinball machine
KR20160138806A (en) * 2015-05-26 2016-12-06 엘지전자 주식회사 Glass type terminal and method for controlling the same
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
DE102015209900A1 (en) * 2015-05-29 2016-12-01 Kuka Roboter Gmbh Method for determining a path point
US10310617B2 (en) * 2015-06-11 2019-06-04 Intel Corporation Drone controlling device and method
US9616568B1 (en) 2015-08-25 2017-04-11 X Development Llc Generating a grasp affordance for an object based on a thermal image of the object that is captured following human manipulation of the object
US20170068416A1 (en) * 2015-09-08 2017-03-09 Chian Chiu Li Systems And Methods for Gesture Input
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
SE539323C2 (en) 2015-10-19 2017-07-04 Husqvarna Ab Improved control of remote demolition robot
WO2017079484A1 (en) 2015-11-04 2017-05-11 Google Inc. Connectors for connecting electronics embedded in garments to external devices
US10019072B2 (en) 2016-01-01 2018-07-10 International Business Machines Corporation Imagined grid fingertip input editor on wearable device
US10168700B2 (en) 2016-02-11 2019-01-01 International Business Machines Corporation Control of an aerial drone using recognized gestures
WO2017142480A1 (en) * 2016-02-15 2017-08-24 Advanced Material Engineering Pte. Ltd. Modular add-on augmented reality head-up display, interfaces and controls
US11086313B2 (en) * 2016-04-27 2021-08-10 Atlas Dynamic Limited Gesture-based unmanned aerial vehicle (UAV) control
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
TWI585711B (en) * 2016-05-24 2017-06-01 泰金寶電通股份有限公司 Method for obtaining care information, method for sharing care information, and electronic apparatus therefor
US10591988B2 (en) * 2016-06-28 2020-03-17 Hiscene Information Technology Co., Ltd Method for displaying user interface of head-mounted display device
CN106210345B (en) * 2016-07-29 2019-06-07 努比亚技术有限公司 A kind of mobile terminal and its control method
JP6425822B2 (en) * 2016-07-29 2018-11-21 株式会社ソニー・インタラクティブエンタテインメント Unmanned air vehicle and flight control method of unmanned air vehicle
US10095315B2 (en) 2016-08-19 2018-10-09 Otis Elevator Company System and method for distant gesture-based control using a network of sensors across the building
US11287945B2 (en) 2016-09-08 2022-03-29 Chian Chiu Li Systems and methods for gesture input
CN109844476A (en) * 2016-09-21 2019-06-04 优泰机电有限公司 Motion tracking thermopile array sensor and its application
WO2018054831A1 (en) * 2016-09-22 2018-03-29 Philips Lighting Holding B.V. Thermal imaging for space usage analysis
US10659279B2 (en) 2016-10-04 2020-05-19 Htc Corporation Method and device for displaying video corresponding to physical object
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
CN106444843B (en) * 2016-12-07 2019-02-15 北京奇虎科技有限公司 Unmanned plane relative bearing control method and device
CN108214482B (en) * 2016-12-14 2021-02-02 上银科技股份有限公司 Non-contact gesture teaching robot
US10409276B2 (en) * 2016-12-21 2019-09-10 Hangzhou Zero Zero Technology Co., Ltd. System and method for controller-free user drone interaction
US10864633B2 (en) * 2017-04-28 2020-12-15 Southe Autonomy Works, Llc Automated personalized feedback for interactive learning applications
CN107436684A (en) * 2017-07-12 2017-12-05 上海创单电子科技有限公司 A kind of Non-contact man-machine interaction method based on infrared ray
US11219837B2 (en) * 2017-09-29 2022-01-11 Sony Interactive Entertainment Inc. Robot utility and interface device
CN107995416B (en) * 2017-11-14 2019-10-18 维沃移动通信有限公司 A kind of focus adjustment method and mobile terminal
WO2019144295A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Flight control method and device, and aircraft, system and storage medium
CN109074168B (en) * 2018-01-23 2022-06-17 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and device and unmanned aerial vehicle
US11402917B2 (en) * 2018-06-20 2022-08-02 Sony Interactive Entertainment Inc. Gesture-based user interface for AR and VR with gaze trigger
CN109116980A (en) * 2018-06-25 2019-01-01 中山大学新华学院 A kind of gesture identification intelligent robot system
US11190674B2 (en) * 2018-09-07 2021-11-30 John Robert Mortensen Remote camera trigger
WO2020051783A1 (en) * 2018-09-12 2020-03-19 Robert Bosch Gmbh Laser leveling tool with gesture control
CN111221405A (en) * 2018-11-23 2020-06-02 东莞市易联交互信息科技有限责任公司 Gesture control method and device
CN109933203A (en) * 2019-03-21 2019-06-25 福建工程学院 A kind of hydraulic crawler excavator control method and system based on computer vision gesture
US11106200B2 (en) * 2019-06-27 2021-08-31 Baidu Usa Llc Safety mechanism for joystick control for controlling an unmanned vehicle
US20210196152A1 (en) * 2019-12-31 2021-07-01 Align Technology, Inc. Gesture control using an intraoral scanner
CN111967550B (en) * 2020-07-30 2022-08-30 内蒙古智诚物联股份有限公司 Non-contact temperature detector based on artificial intelligence and detection method thereof
JP2022103098A (en) * 2020-12-25 2022-07-07 グローリー株式会社 Automatic transaction device and method for controlling automatic transaction device
US20220244791A1 (en) * 2021-01-24 2022-08-04 Chian Chiu Li Systems And Methods for Gesture Input
CN113900513B (en) * 2021-09-26 2022-10-11 上海交通大学 Air gesture control method based on visible light signals
AU2022439107A1 (en) 2022-02-01 2024-09-19 Landscan Llc Systems and methods for multispectral landscape mapping
US12073020B2 (en) 2022-12-30 2024-08-27 Htc Corporation Head-mounted display, unlocking method and non-transitory computer readable storage medium thereof
CN117021117B (en) * 2023-10-08 2023-12-15 电子科技大学 Mobile robot man-machine interaction and positioning method based on mixed reality

Citations (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4843522A (en) 1984-03-06 1989-06-27 Rosenberg Burton A Vehicular lamp with moving light source
US4985651A (en) 1987-10-19 1991-01-15 Anwar Chitayat Linear motor with magnetic bearing preload
EP0903608A2 (en) 1997-09-20 1999-03-24 Matra Marconi Space Uk Limited Beam steerer
US20030067773A1 (en) 1999-12-02 2003-04-10 Koninklijke Philips Electronics N.V. LED/phosphor-LED hybrid lighting systems
WO2003031923A1 (en) 2001-10-01 2003-04-17 Ud Technology Corporation Simultaneous multi-beam planar array IR (PAIR) spectroscopy
US20050029458A1 (en) 2003-08-04 2005-02-10 Z Jason Geng System and a method for a smart surveillance system
US20050087601A1 (en) 2003-10-24 2005-04-28 Gerst Carl W. III Light pipe illumination system and method
US20050228366A1 (en) 2004-04-09 2005-10-13 Ralf Kessler Beam steering system for corneal laser surgery
US20060119865A1 (en) 2004-12-06 2006-06-08 Hoyt Clifford C Systems and methods for in-vivo optical imaging and measurement
US20070023661A1 (en) 2003-08-26 2007-02-01 Redshift Systems Corporation Infrared camera system
US20080256494A1 (en) 2007-04-16 2008-10-16 Greenfield Mfg Co Inc Touchless hand gesture device controller
US20080294315A1 (en) 1995-06-07 2008-11-27 Intelligent Technologies International, Inc. System and Method for Controlling Vehicle Headlights
US20080294017A1 (en) 2007-05-22 2008-11-27 Gobeyn Kevin M Image data normalization for a monitoring system
US20090027682A1 (en) 1996-04-30 2009-01-29 Hebert Raymond T Method and Device For Measuring Reflected Optical Radiation
US20090040299A1 (en) 2006-05-02 2009-02-12 Telesis Technologies, Inc. Laser safety system with beam steering
US20090103780A1 (en) * 2006-07-13 2009-04-23 Nishihara H Keith Hand-Gesture Recognition Method
US20100066821A1 (en) 2008-09-16 2010-03-18 Plantronics, Inc. Infrared Derived User Presence and Associated Remote Control
US20100128109A1 (en) 2008-11-25 2010-05-27 Banks Paul S Systems And Methods Of High Resolution Three-Dimensional Imaging
US20100151946A1 (en) 2003-03-25 2010-06-17 Wilson Andrew D System and method for executing a game process
US20100176270A1 (en) 2009-01-09 2010-07-15 Lau Kam C Volumetric error compensation system with laser tracker and active target
US20100200753A1 (en) 2007-06-13 2010-08-12 Adrian Lucien Reginald Westaway Directable Light
WO2010138741A1 (en) 2009-05-27 2010-12-02 Analog Devices, Inc. Position measurement systems using position sensitive detectors
US20110102763A1 (en) 2009-10-30 2011-05-05 Microvision, Inc. Three Dimensional Imaging Device, System and Method
US20110103063A1 (en) 2009-09-12 2011-05-05 Robe Lighting S.R.O Optics for an automated luminaire
US20110221599A1 (en) * 2010-03-09 2011-09-15 Flir Systems, Inc. Imager with multiple sensor arrays
US20110242042A1 (en) 2010-04-02 2011-10-06 Amlogic Co., Ltd. Touch Panel Having Joystick Capabilities
US20110301934A1 (en) * 2010-06-04 2011-12-08 Microsoft Corporation Machine based sign language interpreter
WO2012001677A2 (en) 2010-06-29 2012-01-05 Israel Aerospace Industries Ltd. Line of sight stabilization system
US8139935B2 (en) 2010-03-31 2012-03-20 James Cameron 3D camera with foreground object distance sensing
US20120075463A1 (en) 2010-09-23 2012-03-29 Sony Computer Entertainment Inc. User interface system and method using thermal imaging
EP2463751A2 (en) * 2010-12-08 2012-06-13 Omron Corporation Gesture recognition apparatus, gesture recognition method, control program, and recording medium
US20120194083A1 (en) 2009-08-10 2012-08-02 Redwood Systems, Inc. Group creation in auto-commissioning of lighting systems
US20120320092A1 (en) * 2011-06-14 2012-12-20 Electronics And Telecommunications Research Institute Method and apparatus for exhibiting mixed reality based on print medium
US20130063042A1 (en) 2011-03-11 2013-03-14 Swapnil Bora Wireless lighting control system
US20130101276A1 (en) 2011-10-21 2013-04-25 Raytheon Company Single axis gimbal optical stabilization system
US20130120238A1 (en) 2011-11-11 2013-05-16 Osram Sylvania Inc. Light control method and lighting device using the same
US20130131836A1 (en) 2011-11-21 2013-05-23 Microsoft Corporation System for controlling light enabled devices
US20130128334A1 (en) 2011-11-18 2013-05-23 Vuzix Corporation Beam Steering Device
WO2013076606A1 (en) 2011-11-07 2013-05-30 Koninklijke Philips Electronics N.V. User interface using sounds to control a lighting system
US20130208481A1 (en) 2012-02-09 2013-08-15 Danny H. Sooferian Adjustable focus light
US20130293722A1 (en) 2012-05-07 2013-11-07 Chia Ming Chen Light control systems and methods
US8854433B1 (en) * 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US20150002391A1 (en) 2013-06-28 2015-01-01 Chia Ming Chen Systems and methods for controlling device operation according to hand gestures
US20150023019A1 (en) 2013-07-16 2015-01-22 Chia Ming Chen Light control systems and methods
US20150049062A1 (en) * 2012-03-26 2015-02-19 Silicon Communications Technology Co., Ltd Motion gesture sensing module and motion gesture sensing method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8971565B2 (en) * 2008-05-29 2015-03-03 Hie-D Technologies, Llc Human interface electronic device
US8787663B2 (en) * 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
KR101765771B1 (en) * 2011-05-05 2017-08-07 Maxim Integrated Products, Inc. Method for detecting gestures using a multi-segment photodiode and one or fewer illumination sources
US9229581B2 (en) * 2011-05-05 2016-01-05 Maxim Integrated Products, Inc. Method for detecting gestures using a multi-segment photodiode and one or fewer illumination sources
US9002058B2 (en) * 2011-12-01 2015-04-07 Microvision, Inc. Scanned image projection system with gesture control input
US8553235B1 (en) * 2012-01-18 2013-10-08 Wen-Chieh Geoffrey Lee High resolution and high sensitivity optically activated touch sensing device using multiple color light sources

Patent Citations (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4843522A (en) 1984-03-06 1989-06-27 Rosenberg Burton A Vehicular lamp with moving light source
US4985651A (en) 1987-10-19 1991-01-15 Anwar Chitayat Linear motor with magnetic bearing preload
US20080294315A1 (en) 1995-06-07 2008-11-27 Intelligent Technologies International, Inc. System and Method for Controlling Vehicle Headlights
US20090027682A1 (en) 1996-04-30 2009-01-29 Hebert Raymond T Method and Device For Measuring Reflected Optical Radiation
EP0903608A2 (en) 1997-09-20 1999-03-24 Matra Marconi Space Uk Limited Beam steerer
US20030067773A1 (en) 1999-12-02 2003-04-10 Koninklijke Philips Electronics N.V. LED/phosphor-LED hybrid lighting systems
WO2003031923A1 (en) 2001-10-01 2003-04-17 Ud Technology Corporation Simultaneous multi-beam planar array IR (PAIR) spectroscopy
US20100151946A1 (en) 2003-03-25 2010-06-17 Wilson Andrew D System and method for executing a game process
US20050029458A1 (en) 2003-08-04 2005-02-10 Z Jason Geng System and a method for a smart surveillance system
US20070023661A1 (en) 2003-08-26 2007-02-01 Redshift Systems Corporation Infrared camera system
US20050087601A1 (en) 2003-10-24 2005-04-28 Gerst Carl W. III Light pipe illumination system and method
US20050228366A1 (en) 2004-04-09 2005-10-13 Ralf Kessler Beam steering system for corneal laser surgery
US20060119865A1 (en) 2004-12-06 2006-06-08 Hoyt Clifford C Systems and methods for in-vivo optical imaging and measurement
US20090040299A1 (en) 2006-05-02 2009-02-12 Telesis Technologies, Inc. Laser safety system with beam steering
US20090103780A1 (en) * 2006-07-13 2009-04-23 Nishihara H Keith Hand-Gesture Recognition Method
US20080256494A1 (en) 2007-04-16 2008-10-16 Greenfield Mfg Co Inc Touchless hand gesture device controller
US20080294017A1 (en) 2007-05-22 2008-11-27 Gobeyn Kevin M Image data normalization for a monitoring system
US20100200753A1 (en) 2007-06-13 2010-08-12 Adrian Lucien Reginald Westaway Directable Light
US8455830B2 (en) 2007-06-13 2013-06-04 Adrian Lucien Reginald Westaway Directable light
US20100066821A1 (en) 2008-09-16 2010-03-18 Plantronics, Inc. Infrared Derived User Presence and Associated Remote Control
US20100128109A1 (en) 2008-11-25 2010-05-27 Banks Paul S Systems And Methods Of High Resolution Three-Dimensional Imaging
US20100176270A1 (en) 2009-01-09 2010-07-15 Lau Kam C Volumetric error compensation system with laser tracker and active target
WO2010138741A1 (en) 2009-05-27 2010-12-02 Analog Devices, Inc. Position measurement systems using position sensitive detectors
US20120194083A1 (en) 2009-08-10 2012-08-02 Redwood Systems, Inc. Group creation in auto-commissioning of lighting systems
US20110103063A1 (en) 2009-09-12 2011-05-05 Robe Lighting S.R.O Optics for an automated luminaire
US20110102763A1 (en) 2009-10-30 2011-05-05 Microvision, Inc. Three Dimensional Imaging Device, System and Method
US20110221599A1 (en) * 2010-03-09 2011-09-15 Flir Systems, Inc. Imager with multiple sensor arrays
US8139935B2 (en) 2010-03-31 2012-03-20 James Cameron 3D camera with foreground object distance sensing
US20110242042A1 (en) 2010-04-02 2011-10-06 Amlogic Co., Ltd. Touch Panel Having Joystick Capabilities
US20110301934A1 (en) * 2010-06-04 2011-12-08 Microsoft Corporation Machine based sign language interpreter
WO2012001677A2 (en) 2010-06-29 2012-01-05 Israel Aerospace Industries Ltd. Line of sight stabilization system
US20130193315A1 (en) 2010-06-29 2013-08-01 Israel Aerospace Industries Ltd. Line of sight stabilization system
US20120075463A1 (en) 2010-09-23 2012-03-29 Sony Computer Entertainment Inc. User interface system and method using thermal imaging
EP2463751A2 (en) * 2010-12-08 2012-06-13 Omron Corporation Gesture recognition apparatus, gesture recognition method, control program, and recording medium
US20120146903A1 (en) 2010-12-08 2012-06-14 Omron Corporation Gesture recognition apparatus, gesture recognition method, control program, and recording medium
US20130063042A1 (en) 2011-03-11 2013-03-14 Swapnil Bora Wireless lighting control system
US20120320092A1 (en) * 2011-06-14 2012-12-20 Electronics And Telecommunications Research Institute Method and apparatus for exhibiting mixed reality based on print medium
US20130101276A1 (en) 2011-10-21 2013-04-25 Raytheon Company Single axis gimbal optical stabilization system
WO2013076606A1 (en) 2011-11-07 2013-05-30 Koninklijke Philips Electronics N.V. User interface using sounds to control a lighting system
US20150002046A1 (en) 2011-11-07 2015-01-01 Koninklijke Philips N.V. User Interface Using Sounds to Control a Lighting System
US20130120238A1 (en) 2011-11-11 2013-05-16 Osram Sylvania Inc. Light control method and lighting device using the same
US20130128334A1 (en) 2011-11-18 2013-05-23 Vuzix Corporation Beam Steering Device
US20130131836A1 (en) 2011-11-21 2013-05-23 Microsoft Corporation System for controlling light enabled devices
US8854433B1 (en) * 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US20130208481A1 (en) 2012-02-09 2013-08-15 Danny H. Sooferian Adjustable focus light
US20150049062A1 (en) * 2012-03-26 2015-02-19 Silicon Communications Technology Co., Ltd Motion gesture sensing module and motion gesture sensing method
US20130293722A1 (en) 2012-05-07 2013-11-07 Chia Ming Chen Light control systems and methods
US20150002391A1 (en) 2013-06-28 2015-01-01 Chia Ming Chen Systems and methods for controlling device operation according to hand gestures
US20150023019A1 (en) 2013-07-16 2015-01-22 Chia Ming Chen Light control systems and methods

Non-Patent Citations (15)

* Cited by examiner, † Cited by third party
Title
Borah et al.: A review of communication-oriented optical wireless systems. EURASIP Journal on Wireless Communications and Networking 2012, 2012:91.
Elgala et al.: Indoor Optical Wireless Communication: Potential and State-of-the-Art. IEEE Communications Magazine. Sep. 2011, pp. 56-62.
Elgala et al.: OFDM Visible Light Wireless Communication Based on White LEDs. IEEE 2007, pp. 2185-2189.
Herbst et al.: Basics of 3D Digital Image Correlation. Dantec Dynamics Application Note T-Q-400-Basics-3DCORR-002a-EN.
International Preliminary Report on Patentability dated Jan. 7, 2016, issued in corresponding International Patent Application No. PCT/US2014/044643.
International Search Report and the Written Opinion dated Oct. 8, 2015, issued in corresponding International Patent Application No. PCT/US15/28163.
International Search Report and Written Opinion dated Nov. 7, 2014, issued in corresponding International Application No. PCT/US14/44643.
International Search Report and Written Opinion dated Oct. 27, 2014, issued in corresponding International Application No. PCT/US14/46807.
International Search Report and Written Opinion dated Sep. 23, 2013, issued in corresponding International Application No. PCT/US2013/039666.
Kumar et al.: Visible Light Communication Systems Conception and VIDAS. IETE Technical Review, vol. 25 Issue 6. Nov.-Dec. 2008, pp. 359-367.
Office Action dated Aug. 7, 2015 issued in U.S. Appl. No. 13/826,177.
Office Action dated Feb. 5, 2016, issued in U.S. Appl. No. 14/048,505.
Sun: Fast Stereo Matching Using Rectangular Subregioning and 3D Maximum-Surface Techniques. International Journal of Computer Vision, vol. 46 No. 1/2/3, pp. 99-117, May 2002.
Wang, et al.: 12.5 Gbps Indoor Optical Wireless Communication System with Single Channel Imaging Receiver. ECOC Technical Digest 2011 OSA.
Wu, et al.: Modulation based cells distribution for visible light communication. Optics Express, vol. 20, No. 22, Oct. 2012.

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160017656A1 (en) * 2013-03-15 2016-01-21 Springs Window Fashions, Llc Window covering motorized lift and control operating system
US20160368382A1 (en) * 2013-06-29 2016-12-22 Audi Ag Motor vehicle control interface with gesture recognition
US9738158B2 (en) * 2013-06-29 2017-08-22 Audi Ag Motor vehicle control interface with gesture recognition
US9994233B2 (en) * 2014-09-30 2018-06-12 Continental Automotive Systems, Inc. Hands accelerating control system
US11107236B2 (en) 2019-04-22 2021-08-31 Dag Michael Peter Hansson Projected augmented reality interface with pose tracking for directing manual processes
WO2021038109A1 (en) 2019-08-30 2021-03-04 Metralabs Gmbh Neue Technologien Und Systeme System for capturing sequences of movements and/or vital parameters of a person
US11780080B2 (en) 2020-04-27 2023-10-10 Scalable Robotics Inc. Robot teaching with scans and geometries
US11826908B2 (en) 2020-04-27 2023-11-28 Scalable Robotics Inc. Process agnostic robot teaching using 3D scans
US12011827B2 (en) 2020-04-27 2024-06-18 Scalable Robotics Inc. Robot teaching with scans in and out of robot workspace

Also Published As

Publication number Publication date
CN105518576A (en) 2016-04-20
US20150002391A1 (en) 2015-01-01
CN105518576B (en) 2019-04-16
WO2014210502A1 (en) 2014-12-31
EP3014407A1 (en) 2016-05-04
US20170083103A1 (en) 2017-03-23
EP3014407A4 (en) 2017-08-02

Similar Documents

Publication Publication Date Title
US9423879B2 (en) Systems and methods for controlling device operation according to hand gestures
US10191559B2 (en) Computer interface for manipulated objects with an absolute pose detection component
US20210294415A1 (en) External user interface for head worn computing
US20100201808A1 (en) Camera based motion sensing system
US9717118B2 (en) Light control systems and methods
EP2848094B1 (en) Light control systems and methods
US8971565B2 (en) Human interface electronic device
US7826641B2 (en) Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features
US7874681B2 (en) Interactive projector system and method
JP2023085535A (en) Detector for optically detecting at least one object
US20090009469A1 (en) Multi-Axis Motion-Based Remote Control
KR102335132B1 (en) Multi-modal gesture based interactive system and method using one single sensing system
US8587520B2 (en) Generating position information using a video camera
US20170017323A1 (en) External user interface for head worn computing
JP2009037620A (en) Three-dimensional virtual input and simulation device
TW201508561A (en) Speckle sensing for motion tracking
JP2010522922A (en) System and method for tracking electronic devices
CN101655739A (en) Device for three-dimensional virtual input and simulation
KR100532525B1 (en) 3 dimensional pointing apparatus using camera
US20170168592A1 (en) System and method for optical tracking
JP6946345B2 (en) Multi-function sensing system
WO2015009795A1 (en) Light control systems and methods
US20230336944A1 (en) Location and space aware adaptive synchronization
WO2023276058A1 (en) Wearable terminal device for changing display position of partial image

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 8