
US20010042245A1 - Remote control system - Google Patents

Remote control system

Info

Publication number
US20010042245A1
Authority
US
Grant status
Application
Patent type
Prior art keywords
motion
hand
user
circuit
remote
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09170871
Other versions
US6501515B1 (en)
Inventor
Ryuichi Iwamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Sony Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C23/00Non-electrical signal transmission systems, e.g. optical systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/02 - G06F3/16, e.g. facsimile, microfilm
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface
    • G08C2201/32Remote control based on movements, attitude of remote control device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications

Abstract

An electronic appliance remote controller which includes a display screen (which may be part of the appliance, e.g. a TV screen) for displaying icons representing possible operations of the electronic appliance, and a motion detector circuit for detecting a motion within a field of view of the motion detector circuit. The motion detector circuit detects a predetermined motion of a moving object within the field of view as an indication that a remote control operation is to be started and, thereafter, tracks the movement of the object. The motion detector circuit outputs a cursor control signal representative of the motion of the object. A control circuit, connected to the display screen, the electronic appliance, and the motion detector circuit and supplied with the cursor control signal, controls the display screen to display a movable visual indicator, e.g. a cursor, whose own motion tracks the movement of the moving object and the electronic appliance to perform operations corresponding to the icons selected by the user using the visual indicator.

Description

    BACKGROUND OF THE INVENTION
  • [0001]
    1. Field of the Invention
  • [0002]
    This invention relates to a remote control commander for an electronic appliance, such as a television set, and more particularly to an optical motion sensing remote control system for an electronic appliance.
  • [0003]
    2. Related Art
  • [0004]
    An IR (infrared) remote commander is a common means of controlling a TV from a distance. However, existing remote commanders have some drawbacks. They are easy to lose, and the user often mistakes a VCR commander for the TV commander. In fact, a lot of people have a sizable “remote commander collection”. One also has to learn which button is where on the commander, and remote commanders require batteries which have to be replaced periodically. If a TV had camera vision and could read the user's gestures, no remote commander would be necessary. However, it is not easy for a TV to distinguish gestures from other movements in its camera view. One would not want the channel to change each time the user got up to fetch a snack from the kitchen, for example.
  • SUMMARY OF THE INVENTION
  • [0005]
    The above and other problems of prior art electronic appliance remote controllers are overcome by an electronic appliance remote controller according to the present invention which includes a display screen (which may be part of the appliance, e.g. a TV screen) for displaying icons representing possible operations of the electronic appliance, and a motion detector circuit for detecting a motion within a field of view of the motion detector circuit. The motion detector circuit detects a predetermined motion of a moving object within the field of view as an indication that a remote control operation is to be started and, thereafter, tracks the movement of the object. The motion detector circuit outputs a cursor control signal representative of the motion of the object. A control circuit, connected to the display screen, the electronic appliance, and the motion detector circuit and supplied with the cursor control signal, controls the display screen to display a movable visual indicator, e.g. a cursor, whose own motion tracks the movement of the moving object and the electronic appliance to perform operations corresponding to the icons selected by the user using the visual indicator.
  • [0006]
    In a preferred embodiment, the motion detector circuit detects the selection of an icon by the user by detecting a predetermined motion pattern of the object when the visual indicator is coincident on the display screen with a particular icon. For example, the motion detector circuit detects the selection of an icon by the user by detecting a cessation of movement of the object for a predetermined period of time after the visual indicator is coincident on the display screen with a particular icon. The detected object can be, for example, the user's hand. The predetermined motion can be a circular hand movement.
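The cessation-of-movement selection described above can be sketched as follows. This is a minimal illustration, not the patent's circuit; the frame count and pixel tolerance are assumed values:

```python
def detect_dwell_selection(positions, dwell_frames=30, tolerance=2.0):
    """Return True when the tracked object has effectively stopped moving
    for `dwell_frames` consecutive frames (the "button push" gesture).

    `positions` is a list of (x, y) object positions, one per frame.
    `dwell_frames` and `tolerance` are illustrative assumptions, not
    values from the patent text.
    """
    if len(positions) < dwell_frames:
        return False
    recent = positions[-dwell_frames:]
    x0, y0 = recent[0]
    # Movement stays within `tolerance` pixels -> treat as a cessation.
    return all(abs(x - x0) <= tolerance and abs(y - y0) <= tolerance
               for x, y in recent)
```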
  • [0007]
    In the preferred embodiment, the motion detector circuit includes a video camera and calculates a motion vector of each macro block between two adjacent video frames in a video output signal from the video camera. Each video frame includes a plurality of blocks, each containing vectors representative of detected motion of the object. Neighbor vectors with almost the same direction are grouped as one region. For each frame, the motion detector circuit, in determining whether to track an object, checks each region to determine if that region satisfies the conditions (a) that the vector made one rotation clockwise or counterclockwise and (b) the region returned to the start position where it used to be and locks onto that region if conditions (a) and (b) are both satisfied.
  • [0008]
    In order that the same general length of hand movement will control the visual indicator to move a consistent corresponding length of movement, the control circuit includes an automatic cursor sensitivity adjustment feature which automatically scales the extremes of the movement of the visual indicator to the extremes of the predetermined hand motion so that, for example, the same diagonal motion of the user's hand will cause the visual indicator to move just across the diagonal of the display screen regardless of whether the user is close to the motion detector circuit or far away.
  • [0009]
    A remote controlling method for an electronic appliance according to the invention includes the steps of visually displaying on a display screen, such as a TV screen, icons representing possible operations of the electronic appliance (e.g. a TV), detecting a motion within a field of view, including detecting a first predetermined motion of a moving object within the field of view as an indication that a remote control operation is to be started and, thereafter, tracking the movement of the object and outputting a cursor control signal representative of the motion of the object. In response to the control signal, the method includes controlling the display screen to display a movable visual indicator, e.g. a cursor, whose movement tracks the movement of the moving object, and further controlling the electronic appliance to perform operations corresponding to the icons selected by the user using the visual indicator. The first predetermined motion can be any hand movement, such as a circular movement or a diagonal hand movement, for example.
  • [0010]
    The step of detecting the selection of an icon by the user includes detecting a second predetermined motion pattern of the object when the visual indicator is coincident on the display screen with a particular icon. For example, the predetermined motion pattern could be a cessation of movement of the object for a predetermined period of time after the visual indicator is coincident on the display screen with the particular icon.
  • [0011]
    The motion detecting step uses a video camera in the preferred embodiment and includes calculating a motion vector of each macro block between two adjacent video frames in a video output signal from the video camera. Each video frame includes a plurality of blocks each containing vectors representative of detected motion of the object, wherein neighbor vectors with almost the same direction are grouped as one region. For each frame, the determination of whether to track an object is made by checking each region to determine if that region satisfies the conditions (a) that the vector made one rotation clockwise or counterclockwise and (b) the region returned to the start position where it used to be. That region is locked onto if conditions (a) and (b) are both satisfied.
  • [0012]
    In order that the same general length of hand movement will control the visual indicator to move a consistent corresponding length of movement, the remote controlling method according to the invention further includes a step of automatically adjusting the sensitivity of the visual indicator by the steps of automatically scaling the extremes of the movement of the visual indicator to the extremes of the predetermined hand motion so that, for example, the same diagonal motion of the user's hand will cause the visual indicator to move just across the diagonal of the display screen regardless of whether the user is close to the motion detector circuit or far away.
  • [0013]
    The foregoing and other objectives, features and advantages of the invention will be more readily understood upon consideration of the following detailed description of certain preferred embodiments of the invention, taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0014]
    FIG. 1 is a block diagram of the motion sensing remote control system according to the invention.
  • [0015]
    FIG. 2 is a diagrammatic illustration for use in explaining how the user uses a hand motion to cause the remote control system of FIG. 1 to recognize that a motion control signal is about to be made.
  • [0016]
    FIG. 3 is a diagrammatic illustration for use in explaining how the user causes the remote control system of FIG. 1 to move an on-screen cursor to follow the hand motion of the user.
  • [0017]
    FIG. 4 is a diagram of a macro block in a video signal frame in which a motion is calculated by the remote control system of FIG. 1 and further depicts motion vectors as arrows.
  • [0018]
    FIGS. 5-8 are each snapshots of vectors at intervals of one half second.
  • [0019]
    FIG. 9 is a snapshot wherein the remote control system of FIG. 1 has determined that the image is a hand image and locks onto the image.
  • [0020]
    FIG. 10 is a block diagram of an alternative embodiment which makes use of an MPEG encoder of the electronic appliance.
  • [0021]
    FIG. 11 is a diagrammatic illustration for use in explaining how the user uses another type of predetermined hand motion to cause the remote control system of FIG. 1 to recognize that a motion control signal is about to be made.
  • [0022]
    FIGS. 12 and 13 are each snapshots, at intervals of one half second, of the macro blocks and motion vectors detected by the remote control system of FIG. 1 for the hand motion shown in FIG. 11.
  • [0023]
    FIGS. 14 and 15 depict the user's diagonal hand motion as detected by the remote control system of FIG. 1 when the user is close to the TV (FIG. 14) and when the user is far from the TV (FIG. 15).
  • [0024]
    FIG. 16 is an illustration showing how the user cooperates in setting the automatic cursor sensitivity adjustment control.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • [0025]
    The system according to the invention operates on the premise that a user does a special hand motion so that, for example, a TV can easily detect and lock onto an image of the user's hand. Once the hand image is locked, the TV electronically follows the hand's motion and moves a cursor on the TV screen toward the same direction as the hand moves. The user can move the cursor by moving the hand like a PC mouse. Moving the cursor, the user can choose a menu button from a plurality of buttons on the TV display. If the TV loses track of the hand motion after locking, the TV indicates a message to the user and lets the user do a special hand motion to re-lock and trace the motion.
  • [0026]
    To detect hand moves, motion vectors can be employed. A motion vector scheme is common in a motion picture experts group (MPEG) system. If the system has an MPEG encoder, its motion vector circuits can be shared. A large reduction in costs will then be possible.
  • [0027]
    Referring now to FIG. 1, a block diagram of the system is shown. The portion from blocks 1 to 12 is the same as a common digital TV set. The signal received by an antenna 1 is tuned in a tuner 2, demodulated and error-corrected in a demodulation and error correction block 3, and de-multiplexed in demultiplexer 4. Demultiplexed on screen display (OSD) data, video data and audio data are sent to OSD circuit 5, video decoder 6, and audio decoder 7, respectively. OSD data and the decoded video signal are mixed in a superimposer 7 and sent to a cathode ray tube (CRT) circuit 8 and displayed on CRT monitor 9. Decoded audio data is amplified in an amplifier 11 and sent to a loudspeaker 12.
  • [0028]
    Blocks 13 to 16 are the main portion of this invention. A camera 13, which can be mounted on the monitor 9, for example, captures images of a user 18 in front of the TV set and sends its images to a motion detector circuit 15. The motion detector circuit 15 compares the current video frame with a previous video frame stored in a RAM 14 and calculates a motion vector for each macro block of the video frame. A macro block size is, for example, 16×16 pixels. One frame consists of, for example, 22×18 macro blocks.
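The per-macro-block motion vector calculation described above can be sketched as an exhaustive block-matching search. The sum-of-absolute-differences criterion and the search range are assumptions for illustration; the patent does not specify the matching method:

```python
def block_motion_vector(prev, curr, bx, by, block=16, search=4):
    """Estimate the motion vector of one macro block by exhaustive
    sum-of-absolute-differences (SAD) search over a small window.

    `prev` and `curr` are 2-D lists of luminance values; (bx, by) is the
    top-left corner of a `block` x `block` macro block in `curr`. The
    returned (dx, dy) is the offset into `prev` that best matches the
    current block.
    """
    h, w = len(prev), len(prev[0])

    def sad(dx, dy):
        total = 0
        for y in range(block):
            for x in range(block):
                total += abs(curr[by + y][bx + x] - prev[by + y + dy][bx + x + dx])
        return total

    best, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # Skip candidate offsets that fall outside the previous frame.
            if not (0 <= by + dy and by + dy + block <= h and
                    0 <= bx + dx and bx + dx + block <= w):
                continue
            cost = sad(dx, dy)
            if best is None or cost < best:
                best, best_vec = cost, (dx, dy)
    return best_vec
```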
  • [0029]
    When the user 18 wants to control the TV, the user 18 moves his or her hand 20 in a circular motion, so that it draws a circle in the air (FIG. 2). The TV distinguishes this unusual hand motion from other motions and senses that the user 18 wants to communicate. At that time, the TV displays the menu button icons 22 on the CRT display. Once the TV's motion detector circuit 15 captures the hand image, the motion detector circuit 15 locks onto the hand motion and a cursor 24 follows it. If the user 18 moves his or her hand 20 to the right, the cursor 24 on the CRT display moves right (24′). The hand 20 and the cursor 24 behave like a PC mouse and a cursor. Note that the TV does not care about the absolute position of the hand 20. The TV senses only the moving speed and direction of the hand 20 and moves the on-screen cursor 24. When the cursor 24 comes to a menu button icon 22 the user 18 wants, the user 18 stops and holds the hand 20 there for a couple of seconds. The motion detector circuit 15 of the TV recognizes this action as the equivalent of a “button push” and executes the function the button icon 22 indicates. If no movement is detected for a certain time, the menu times out and disappears, and the motion detector circuit 15 begins trying to detect another circular move again.
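The overall interaction flow in this paragraph can be sketched as a small state machine: an idle state waiting for the trigger gesture, and a tracking state that moves the cursor, treats a held pause as a button push, and times out on inactivity. The state names, event flags, and frame counts are illustrative assumptions, not part of the patent:

```python
class RemoteControlFSM:
    """Minimal sketch of the control flow described above. The hold
    duration (60 frames, roughly two seconds at 30 fps) and the timeout
    are arbitrary illustrative values."""

    IDLE, TRACKING = "idle", "tracking"

    def __init__(self, timeout_frames=90):
        self.state = self.IDLE
        self.idle_frames = 0
        self.timeout_frames = timeout_frames
        self.menu_visible = False

    def on_frame(self, circle_detected, hand_moved, hand_on_icon):
        """Feed one frame's worth of detector results; returns an action
        string or None."""
        if self.state == self.IDLE:
            if circle_detected:            # predetermined circular motion seen
                self.state = self.TRACKING
                self.menu_visible = True
                self.idle_frames = 0
                return "show_menu"
            return None
        # TRACKING state: follow the hand like a mouse.
        if hand_moved:
            self.idle_frames = 0
            return "move_cursor"
        self.idle_frames += 1
        if hand_on_icon and self.idle_frames == 60:    # held still = "button push"
            return "execute_button"
        if self.idle_frames >= self.timeout_frames:    # no movement: time out
            self.state = self.IDLE
            self.menu_visible = False
            return "hide_menu"
        return None
```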
  • [0030]
    The motion detector circuit 15 recognizes and locks the hand 20 image as follows. The motion detector circuit 15 calculates a motion vector of each macro block between two adjacent frames. Small vectors below a certain threshold are ignored. FIG. 4 shows whole macro blocks in a frame. For purposes of explanation and to make the figure simpler, the depicted macro blocks are less than the actual number and shown larger. Neighbor vectors with almost the same direction are grouped as one region. In FIG. 4, regions 1 and 2 are grouped. At this time, the motion detector circuit 15 does not know which region is the hand image. The motion detector circuit 15 repeats this procedure for every frame.
  • [0031]
    In the next frame, if there is a region that has almost the same position and vector direction as region 1, that region succeeds to the name region 1. Other regions are named in the same way. A newly appearing region is given a new name. If a region in a previous frame does not find a successor, it is discarded. Each of FIGS. 5 to 8 indicates a snapshot of vectors at an interval of half a second. It takes one to three seconds to draw a circle. In FIG. 6, region 1 has disappeared, and the motion detector circuit 15 judges that region 1 is not the hand motion. Region 2 is still a candidate for the hand image. For every frame, the motion detector circuit 15 checks whether each region satisfies the following two conditions:
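The grouping step above, in which neighboring vectors with almost the same direction become one region, can be sketched as a flood fill over the macro-block grid. The angle tolerance and magnitude threshold are assumed values, not taken from the patent:

```python
import math

def group_regions(vectors, angle_tol=math.pi / 6, min_mag=1.0):
    """Group neighboring macro-block motion vectors with almost the same
    direction into labeled regions. `vectors` is a 2-D grid of (vx, vy)
    per macro block. Vectors below `min_mag` are ignored, as the small
    vectors below a threshold are in the text. Returns a grid of region
    labels (0 = no region) and the region count."""
    rows, cols = len(vectors), len(vectors[0])
    labels = [[0] * cols for _ in range(rows)]
    next_label = 0

    def significant(v):
        return math.hypot(v[0], v[1]) >= min_mag

    def similar(a, b):
        diff = abs(math.atan2(a[1], a[0]) - math.atan2(b[1], b[0])) % (2 * math.pi)
        return min(diff, 2 * math.pi - diff) <= angle_tol

    for r in range(rows):
        for c in range(cols):
            if labels[r][c] or not significant(vectors[r][c]):
                continue
            next_label += 1
            labels[r][c] = next_label
            stack = [(r, c)]
            while stack:                       # flood fill over 4-neighbors
                y, x = stack.pop()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not labels[ny][nx]
                            and significant(vectors[ny][nx])
                            and similar(vectors[y][x], vectors[ny][nx])):
                        labels[ny][nx] = next_label
                        stack.append((ny, nx))
    return labels, next_label
```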
  • [0032]
    (1) Did the vector make one rotation clockwise or counterclockwise?
  • [0033]
    (2) Did the region return to the start position where it used to be?
  • [0034]
    If a region meets these conditions, the motion detector circuit 15 judges it is the hand image.
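The two-condition test above can be sketched as follows, assuming each candidate region's per-frame vector direction and center position have been recorded while the circle was drawn. The angle-unwrapping approach and the tolerances are illustrative assumptions:

```python
import math

def is_circle_gesture(angles, positions, return_tol=1.5):
    """Check the two conditions for one candidate region:
    (1) its motion vector direction swept one full rotation, clockwise
        or counterclockwise, and
    (2) the region returned to its start position.

    `angles` holds the per-frame vector direction in radians and
    `positions` the per-frame region center; `return_tol` (in macro
    blocks) is an assumed tolerance."""
    if len(angles) < 2:
        return False
    # Accumulate the shortest signed step between successive directions,
    # so a full clockwise or counterclockwise sweep totals about 2*pi.
    total = 0.0
    for a, b in zip(angles, angles[1:]):
        total += (b - a + math.pi) % (2 * math.pi) - math.pi
    one_rotation = abs(total) >= 2 * math.pi * 0.9       # ~one full turn
    x0, y0 = positions[0]
    x1, y1 = positions[-1]
    returned = math.hypot(x1 - x0, y1 - y0) <= return_tol
    return one_rotation and returned
```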
  • [0035]
    In FIG. 8, region 2 is judged to be the hand image. Then the motion detector circuit 15 locks onto region 2 and follows its motion (FIG. 9). The motion detector circuit 15 lets CPU 16 know that the hand image has been locked and sends its motion information to CPU 16. Controlled by CPU 16, OSD 5 moves the cursor 24 on the CRT monitor 9 so that the cursor 24 follows the hand motion.
  • [0036]
    If the motion detector circuit 15 loses track of the hand 20, the motion detector circuit 15 informs the CPU 16 to cause the CRT 9 to display the message “Move your hand right”. The user 18 follows the message. Then the motion detector circuit 15 causes the CPU to control the CRT 9 to display another message “Move your hand upward.” The user 18 follows the message again. If the motion detector circuit 15 captures the image that moves right first and upward next, then the motion detector circuit 15 re-captures and locks on the hand image again.
  • [0037]
    The special hand motion is not limited to a circular move. Any other special gesture will do. To let the TV know the menu button icon 22 is chosen, the user can do another special gesture instead of holding the hand 20 still. For example, as a variation of the circular hand motion, the user 18 may move the hand 20 several times (for example, twice) in a diagonal direction, for example, lower left to upper right (FIG. 11). When the hand 20 goes up, the motion vectors point to the upper right (FIG. 12, region 3). When the hand 20 goes down, the motion vectors point to the lower left (FIG. 13, region 3). The motion vectors reverse direction each time the hand 20 reverses. Therefore, if there are motion vectors which point in a predetermined direction and reverse their direction a predetermined number of times (for example, three) within a certain time period, the system judges that the user 18 has done the predetermined motion and locks onto the hand motion.
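The reversal-counting judgment above can be sketched as follows. The threshold of three reversals follows the text's example; the dot-product test for "opposite direction" is an assumption:

```python
def detect_diagonal_shake(directions, required_reversals=3):
    """Detect the back-and-forth diagonal gesture by counting direction
    reversals. `directions` holds the dominant region motion vector per
    frame as (vx, vy). The dot-product reversal test is an illustrative
    choice, not specified in the patent."""
    reversals = 0
    prev = None
    for vx, vy in directions:
        if vx == 0 and vy == 0:
            continue                       # ignore frames with no motion
        if prev is not None:
            # A negative dot product means the vector flipped to the
            # opposite side, i.e. the hand reversed its stroke.
            if prev[0] * vx + prev[1] * vy < 0:
                reversals += 1
        prev = (vx, vy)
    return reversals >= required_reversals
```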
  • [0038]
    Compared with the circular motion shown in FIG. 2, this is an easier motion for the user 18 to make and also easier to detect for the system. A drawback is that such a motion is more likely to occur unintentionally than the circular motion and, thus, misdetection could occur more frequently. If the user 18 jiggles his or her leg, it could cause misdetection. It is a tradeoff.
  • [0039]
    The moving distance of the hand 20 in the camera view depends on the camera view angle and the distance between the camera 13 and the user 18. FIGS. 14 and 15 show a diagonal hand motion in the camera view. If the view angle is wide or the user 18 is far from the camera 13, the hand motion spans a relatively short distance in the camera image (FIG. 14). If the view angle is narrow or the user 18 is close to the camera 13, the hand motion spans a large distance in the camera image (FIG. 15). Assume that the cursor 24 sensitivity is fixed. In the former case, the cursor 24 moves little even if the user 18 makes a large motion of his or her hand 20. In the latter case, the cursor 24 is too sensitive and moves a relatively large distance in response to a small hand motion.
  • [0040]
    To solve this problem, this system has an auto cursor sensitivity adjustment function. When the predetermined motion is small in the camera view, the CPU 16 moves the cursor 24 a large distance. When the predetermined motion is large in the camera view, the CPU 16 moves the cursor 24 a small distance. For example, in FIG. 14, assume that the predetermined hand motion is 50 pixels long. In this case, the CPU 16 makes the cursor 24 move 4 pixels when the hand 20 moves 1 pixel, i.e. the cursor motion is automatically scaled to the length of the detected hand motion. In FIG. 15, the predetermined hand motion is 200 pixels long. The cursor 24 should move 1 pixel for every one pixel of hand motion. If the user 18 wants to move the cursor 24 from the left side to the right side of the display, the user need only move the hand 20 almost the same distance regardless of the camera view angle or the user's distance from the camera 13. This auto cursor sensitivity is implemented in the software of the CPU 16.
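The scaling above reduces to a single gain factor: cursor pixels moved per pixel of detected hand motion. A minimal sketch, assuming a 200-pixel reference span, which reproduces the 50-pixel and 200-pixel examples in the text; a real implementation would substitute the display's actual diagonal length:

```python
def cursor_gain(stroke_px, reference_span_px=200):
    """Return pixels of cursor motion per pixel of detected hand motion,
    so the calibration stroke always maps to the same on-screen span.
    The 200-pixel reference span is an assumed value chosen to match the
    worked example in the text."""
    return reference_span_px / stroke_px

def move_cursor(cursor, hand_delta, gain):
    """Apply the scaled hand displacement to the cursor position."""
    dx, dy = hand_delta
    return (cursor[0] + dx * gain, cursor[1] + dy * gain)
```

With a 50-pixel calibration stroke the gain is 4, and with a 200-pixel stroke it is 1, matching the two cases described above.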
  • [0041]
    Referring now to FIG. 16, when the user 18 makes a predetermined motion in the form of a diagonal hand movement, the motion detection system 15 locks onto the hand movement and moves the cursor 24 diagonally across the face of the TV screen. CPU 16 always calculates the ratio of the video frame diagonal distance to the distance of the hand stroke, and the cursor is controlled proportionally to that ratio. Assuming the user 18 keeps the length of his or her hand movement roughly constant, the CPU 16 is programmed to recognize this stroke as the largest hand motion that needs to be detected and scales the corresponding movement of the cursor 24 so that it just spans the entire diagonal of the TV screen. This scale between the length of hand movement and the length of corresponding cursor movement is thereafter maintained for other hand movements. If the recognized diagonal hand stroke was ten inches, after the hand image is locked, the user 18 has to move the hand 20 ten inches diagonally in order to move the cursor from the lower left corner to the upper right corner on the CRT monitor 9. If the recognized diagonal hand stroke is twenty inches, the user has to move the hand 20 twenty inches to move the cursor in the same way.
  • [0042]
    Instead of a cursor 24, a button may be highlighted like a digital satellite system graphical user interface (DSS GUI). When the hand 20 moves up, the upper button icon gets highlighted and so on. To choose the highlighted button, the user 18 holds the hand 20 on the button for some seconds. As used in this specification and claims, the term “cursor” is to be deemed to include any change in the TV display which tracks the movement of the user's detected motion, including such highlighting of button icons in correspondence to the motion of the user's hand.
  • [0043]
    Instead of motion vector detection, another image recognition scheme can be employed for this invention. For example, the motion detector circuit 15 may follow the tracks of skin color of the hand. If the track draws a circle, the motion detector circuit 15 judges that it is the hand image. Another way is to detect an outline of the hand with a pattern-matching scheme. The most important point of this invention is that a user does a special predetermined move so that the motion detector circuit 15 can easily distinguish it from other visually “noisy” moves.
  • [0044]
    This invention can be applied for not only digital TV, but also analog TV, PC video-phone, or any system that uses a camera and monitor display. Not only a CRT but also other kinds of displays (for example, an LCD, projection TV, etc.) can be used.
  • [0045]
    A video conference or telephone system uses an MPEG or H.261 video encoder. FIG. 10 shows a typical example of the encoder. The signal from camera 100 is sent via a subtraction node 101 to a DCT (Discrete Cosine Transform) block 102 for compression. In the case of a predictive frame, before DCT processing, the signal is subtracted from reconstructed intra frame data in the subtraction block 101. After DCT processing, the signal is quantized in a circuit block 103 and output as an encoded stream. The output signal is also de-quantized in a circuit block 104 and decompressed in an inverse-DCT circuit 105. The decompressed signal is passed through a summing block 106 and stored in a frame memory 107. In the case of a predictive frame, reconstructed intra frame data is added to the decompressed signal in the block 106.
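The encoder's data flow, subtract the reconstructed reference for predictive frames, transform and quantize, then locally decode back into the frame memory, can be sketched structurally as below. The DCT is replaced here by an identity transform purely to keep the sketch short, so only the data flow (blocks 101-107) mirrors the description, not the actual compression:

```python
def encode_frame(frame, frame_memory, quant_step=8, predictive=False):
    """Structural sketch of the encoder loop: subtraction node (101),
    transform + quantization (102, 103), local de-quantization and
    inverse transform (104, 105), summing node (106), and storage in the
    frame memory (107). `frame` and `frame_memory` are flat lists of
    sample values; the identity "transform" is an assumption made for
    brevity."""
    # Subtraction node 101: predictive frames code only the residual.
    residual = ([p - r for p, r in zip(frame, frame_memory)]
                if predictive else list(frame))
    # Blocks 102 + 103: transform (identity here) and quantize.
    quantized = [round(v / quant_step) for v in residual]   # encoded stream
    # Blocks 104 + 105: local decoder de-quantizes and inverse-transforms.
    reconstructed = [q * quant_step for q in quantized]
    if predictive:
        # Summing node 106: add the reference back before storing (107).
        reconstructed = [v + r for v, r in zip(reconstructed, frame_memory)]
    return quantized, reconstructed
```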
  • [0046]
    The motion detector circuit 108 is connected to the frame memory 107, compares the past frame with the current frame, and calculates a motion vector for each macro block. In this way motion vectors can be obtained. Therefore, with a small modification, i.e. checking whether the vectors of a given region satisfy the rotation and return conditions described above, the motion detector circuit 108 can detect a circular hand motion. The motion detector circuit 108 sends the hand motion data to CPU 16. The rest of the blocks (blocks 1 to 12) are the same as in the embodiment of FIG. 1. Blocks 13 to 15 can be replaced with this modified encoder. By sharing the motion detection block with the encoder, a circuit size reduction and a cost reduction will be accomplished.
  • [0047]
    As an extended feature, if the camera is motor-driven, the CPU 16 can control the pan, tilt, or zoom of the camera automatically so that the hand image is positioned at the best place (usually the center) in the camera view.
  • [0048]
    This system does not require color signals. Therefore, for a dark place, an infrared camera 13 may be used.
  • [0049]
    If the CPU 16 connects with a network interface, for example a 1394 interface, this system can send hand position data and control another device through the network. This system does not have to be built into a TV set.
  • [0050]
    Although the present invention has been shown and described with respect to preferred embodiments, various changes and modifications are deemed to lie within the spirit and scope of the invention as claimed. The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims which follow are intended to include any structure, material, or acts for performing the functions in combination with other claimed elements as specifically claimed.

Claims (32)

    What is claimed is:
  1. 1. A remote controller for an electronic appliance, comprising:
    a display screen for displaying icons representing possible operations of the electronic appliance;
    a motion detector for detecting a motion within a field of view of the motion detector, the motion detector detecting a first predetermined motion of a moving object within the field of view as an indication that a remote control operation is to be started and, thereafter, tracking the movement of the object and outputting a cursor control signal representative of the motion of the object;
    a control circuit, connected to the display screen, the electronic appliance, and the motion detector and supplied with the cursor control signal, for controlling the display screen to display a movable visual indicator which tracks the movement of the moving object and for controlling the electronic appliance to perform operations corresponding to the icons selected by the user using the visual indicator.
  2. 2. A remote controller according to
    claim 1
    , wherein the motion detector detects the selection of an icon by the user by detecting a second predetermined motion pattern of the object when the visual indicator is coincident on the display screen with a particular icon.
  3. 3. A remote controller according to
    claim 2
    , wherein the second predetermined motion pattern is a cessation of movement of the object for a predetermined period of time after the visual indicator is coincident on the display screen with the particular icon.
  4. 4. A remote controller according to
    claim 1
    , wherein the motion detector includes a video camera.
  5. 5. A remote controller according to
    claim 1
    , wherein the visual indicator is a cursor.
  6. 6. A remote controller according to
    claim 1
    , wherein the electronic appliance is a television set.
  7. 7. A remote controller according to
    claim 1
    , wherein the display screen is a cathode ray tube.
  8. 8. A remote controller according to
    claim 1
    , wherein the moving object is a user's hand and the first predetermined motion is a circular hand movement.
  9. 9. A remote controller according to
    claim 1
    , wherein the motion detector includes a video camera and calculates a motion vector of each macro block between two adjacent video frames in a video output signal from the video camera.
  10. 10. A remote controller according to
    claim 9
    , wherein each video frame includes a plurality of blocks each containing vectors representative of detected motion of the object, wherein neighbor vectors with almost the same direction are grouped as one region.
  11. A remote controller according to claim 10, wherein, for each frame, the motion detector, in determining whether to track an object, checks each region to determine whether that region satisfies the conditions (a) that the region's vector has made one full rotation, clockwise or counterclockwise, and (b) that the region has returned to its start position, and locks onto that region if both conditions are satisfied.
  12. A remote controller according to claim 1, wherein the motion detector includes a series connection of a video camera, a discrete cosine transform (DCT) circuit for DCT processing a video signal output by the camera, a quantizing circuit for quantizing the DCT-processed video signal, an inverse-DCT circuit for inverse DCT processing the quantized video signal, a frame memory supplied with the output of the inverse-DCT circuit, a subtraction node interposed between the camera and the DCT circuit for subtracting, in the case of a predictive frame, reconstructed intra-frame data output from the frame memory from the camera's video signal, a summing node interposed between the frame memory and the inverse-DCT circuit for adding, in the case of a predictive frame, reconstructed intra-frame data output from the frame memory to the output signal from the inverse-DCT circuit, and a motion detection circuit, connected to the frame memory, which compares a past frame in the frame memory with a current frame in the frame memory and calculates a motion vector of each macro block between the past and current video frames.
  13. A remote controller according to claim 12, wherein each video frame includes a plurality of blocks each containing vectors representative of detected motion of the object, and wherein neighboring vectors with almost the same direction are grouped as one region.
  14. A remote controller according to claim 13, wherein, for each frame, the motion detector, in determining whether to track an object, checks each region to determine whether that region satisfies the conditions (a) that the region's vector has made one full rotation, clockwise or counterclockwise, and (b) that the region has returned to its start position, and locks onto that region if both conditions are satisfied.
  15. A remote controller according to claim 1, further comprising an automatic cursor sensitivity control means for automatically adjusting a length of movement of the visual indicator on the display screen relative to a corresponding length of movement of the moving object detected by the motion detector.
  16. A remote controller according to claim 15, wherein the automatic cursor sensitivity control means adjusts the movement of the visual indicator so that the visual indicator moves a fixed, predetermined distance on the display screen in response to the length of the detected first predetermined motion.
  17. A remote controlling method for an electronic appliance, comprising the steps of:
    visually displaying on a display screen icons representing possible operations of the electronic appliance;
    detecting motion within a field of view, including detecting a predetermined motion of a moving object within the field of view as an indication that a remote control operation is to be started and, thereafter, tracking the movement of the object and outputting a cursor control signal representative of the motion of the object; and
    as a function of the control signal, controlling the display screen to display a movable visual indicator which tracks the movement of the moving object and controlling the electronic appliance to perform operations corresponding to the icons selected by the user using the visual indicator.
  18. A remote controlling method according to claim 17, wherein the step of detecting the selection of an icon by the user includes detecting a predetermined motion pattern of the object when the visual indicator is coincident on the display screen with a particular icon.
  19. A remote controlling method according to claim 17, wherein the step of detecting the selection of an icon by the user includes detecting a cessation of movement of the object for a predetermined period of time after the visual indicator is coincident on the display screen with a particular icon.
  20. A remote controlling method according to claim 17, wherein the visual indicator is a cursor.
  21. A remote controlling method according to claim 17, wherein the electronic appliance is a television set.
  22. A remote controlling method according to claim 17, wherein the step of displaying includes displaying on a cathode ray tube.
  23. A remote controlling method according to claim 17, wherein the predetermined motion is a circular hand movement.
  24. A remote controlling method according to claim 17, wherein the motion detecting step uses a video camera.
  25. A remote controlling method according to claim 24, wherein the motion detecting step includes calculating a motion vector of each macro block between two adjacent video frames in a video output signal from the video camera.
  26. A remote controlling method according to claim 25, wherein each video frame includes a plurality of blocks each containing vectors representative of detected motion of the object, and wherein neighboring vectors with almost the same direction are grouped as one region.
  27. A remote controlling method according to claim 26, further comprising the steps, for each frame, of determining whether to track an object by checking each region to determine whether that region satisfies the conditions (a) that the region's vector has made one full rotation, clockwise or counterclockwise, and (b) that the region has returned to its start position, and locking onto that region if both conditions are satisfied.
  28. A remote controlling method according to claim 24, wherein the motion detecting step takes place as part of a process of encoding the video output signal from the video camera according to the Moving Picture Experts Group (MPEG) standard and includes calculating a motion vector of each macro block between a past video frame and a current video frame in the video output signal from the video camera.
  29. A remote controlling method according to claim 28, wherein each video frame includes a plurality of blocks each containing vectors representative of a detected motion of the object, and wherein neighboring vectors with almost the same direction are grouped as one region.
  30. A remote controlling method according to claim 27, further comprising the steps, for each frame, of determining whether to track an object by checking each region to determine whether that region satisfies the conditions (a) that the region's vector has made one full rotation, clockwise or counterclockwise, and (b) that the region has returned to its start position, and locking onto that region if both conditions are satisfied.
  31. A remote controlling method according to claim 17, further comprising the step of automatically adjusting a length of movement of the visual indicator on the display screen relative to a corresponding length of the detected movement of the moving object.
  32. A remote controlling method according to claim 31, wherein the step of automatically adjusting the length of movement of the visual indicator on the display screen includes moving the visual indicator a fixed, predetermined distance on the display screen in response to the length of the detected first predetermined motion.
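The detection scheme running through claims 9-14 and 24-30 can be illustrated in code: estimate one motion vector per macro block between two frames, then watch a candidate region's vectors for one full clockwise or counterclockwise rotation ending back at the start position before locking on. The sketch below is illustrative only, not the patent's implementation: it assumes grayscale frames as 2-D lists, exhaustive SAD block matching in place of an MPEG encoder's motion estimator, and hypothetical parameters (`block`, `search`, `start_tolerance`) chosen for the example.

```python
import math

def block_motion_vectors(prev, curr, block=16, search=4):
    """Estimate one motion vector per macro block by exhaustive block
    matching (sum of absolute differences, SAD) between two frames.
    Frames are 2-D lists of grayscale pixel values; the returned vector
    (dx, dy) points from a block in the current frame to its best match
    in the previous frame, as in MPEG-style motion estimation."""
    h, w = len(curr), len(curr[0])
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            best, best_sad = (0, 0), None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    if not (0 <= by + dy and by + dy + block <= h
                            and 0 <= bx + dx and bx + dx + block <= w):
                        continue  # candidate window falls outside the frame
                    sad = sum(abs(curr[by + y][bx + x] - prev[by + dy + y][bx + dx + x])
                              for y in range(block) for x in range(block))
                    if best_sad is None or sad < best_sad:
                        best_sad, best = sad, (dx, dy)
            vectors[(bx, by)] = best
    return vectors

class RotationDetector:
    """Accumulates a candidate region's per-frame motion vectors and
    reports lock-on when (a) the vector direction has swept one full
    rotation, clockwise or counterclockwise, and (b) the region is back
    near its start position -- the two conditions of claim 11."""
    def __init__(self, start_tolerance=10.0):
        self.total_angle = 0.0      # signed, accumulated direction change
        self.prev_dir = None
        self.position = (0.0, 0.0)  # displacement from the start position
        self.start_tolerance = start_tolerance

    def feed(self, dx, dy):
        """Feed one frame's motion vector; return True on lock-on."""
        if dx == 0 and dy == 0:
            return False
        angle = math.atan2(dy, dx)
        if self.prev_dir is not None:
            delta = angle - self.prev_dir
            # wrap into (-pi, pi] so successive changes accumulate smoothly
            while delta > math.pi:
                delta -= 2 * math.pi
            while delta <= -math.pi:
                delta += 2 * math.pi
            self.total_angle += delta
        self.prev_dir = angle
        x, y = self.position
        self.position = (x + dx, y + dy)
        # condition (a): one full turn (small numerical slack), either sense
        if abs(self.total_angle) >= 2 * math.pi * 0.999:
            # condition (b): region returned near its start position
            if math.hypot(*self.position) <= self.start_tolerance:
                return True
        return False
```

Accumulating wrapped angle deltas rather than comparing raw `atan2` values avoids the discontinuity at ±180°, so a circle started in any direction is detected; straight-line hand motion never accumulates angle and never triggers a lock-on.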
US09170871 1998-10-13 1998-10-13 Remote control system Active US6501515B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09170871 US6501515B1 (en) 1998-10-13 1998-10-13 Remote control system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09170871 US6501515B1 (en) 1998-10-13 1998-10-13 Remote control system
US09193594 US6498628B2 (en) 1998-10-13 1998-11-17 Motion sensing interface
JP29153999A JP5048890B2 (en) 1998-10-13 1999-10-13 Motion detection interface
JP2011062985A JP5222376B2 (en) 1998-10-13 2011-03-22 Motion detection interface

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US09193594 Continuation-In-Part US6498628B2 (en) 1998-10-13 1998-11-17 Motion sensing interface

Publications (2)

Publication Number Publication Date
US20010042245A1 (en) 2001-11-15
US6501515B1 (en) 2002-12-31

Family

ID=22621622

Family Applications (2)

Application Number Title Priority Date Filing Date
US09170871 Active US6501515B1 (en) 1998-10-13 1998-10-13 Remote control system
US09193594 Active US6498628B2 (en) 1998-10-13 1998-11-17 Motion sensing interface

Family Applications After (1)

Application Number Title Priority Date Filing Date
US09193594 Active US6498628B2 (en) 1998-10-13 1998-11-17 Motion sensing interface

Country Status (2)

Country Link
US (2) US6501515B1 (en)
JP (1) JP5222376B2 (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020084909A1 (en) * 2000-12-29 2002-07-04 Stefanik John R. Remote control device with smart card capability
US20020085128A1 (en) * 2000-12-29 2002-07-04 Stefanik John R. Remote control device with event notifier
US20020084898A1 (en) * 2000-12-29 2002-07-04 Stefanik John R. Remote control device with illumination
US6750801B2 (en) 2000-12-29 2004-06-15 Bellsouth Intellectual Property Corporation Remote control device with directional mode indicator
US20050188416A1 (en) * 2004-02-09 2005-08-25 Canon Europa Nv Method and device for the distribution of an audiovisual signal in a communications network, corresponding validation method and device
US20080134102A1 (en) * 2006-12-05 2008-06-05 Sony Ericsson Mobile Communications Ab Method and system for detecting movement of an object
US20090177045A1 (en) * 2007-06-04 2009-07-09 Ford John P System and method for data aggregation and prioritization
US20100079671A1 (en) * 2008-09-30 2010-04-01 Echostar Technologies Llc Systems and methods for graphical control of picture-in-picture windows
WO2010038218A1 (en) * 2008-10-03 2010-04-08 Exva - Experts In Video Analisys, Lda Method and system of interaction between actors and surfaces through motion detection
CN101783865A (en) * 2010-02-26 2010-07-21 中山大学;广州中大电讯科技有限公司 Digital set-top box and intelligent mouse control method based on same
US20110013807A1 (en) * 2009-07-17 2011-01-20 Samsung Electronics Co., Ltd. Apparatus and method for recognizing subject motion using a camera
US20110069215A1 (en) * 2009-09-24 2011-03-24 Pantech Co., Ltd. Apparatus and method for controlling picture using image recognition
US20110239139A1 (en) * 2008-10-07 2011-09-29 Electronics And Telecommunications Research Institute Remote control apparatus using menu markup language
CN102469293A (en) * 2010-11-17 2012-05-23 中兴通讯股份有限公司 Realization method and device for acquiring user input information in video service
EP2475183A1 (en) * 2011-01-06 2012-07-11 Samsung Electronics Co., Ltd. Display apparatus controlled by motion and motion control method thereof
EP2635952A1 (en) * 2010-11-01 2013-09-11 Thomson Licensing Method and device for detecting gesture inputs
WO2013179566A1 (en) * 2012-05-29 2013-12-05 Sony Corporation Image processing apparatus and program
US20140053115A1 (en) * 2009-10-13 2014-02-20 Pointgrab Ltd. Computer vision gesture based control of a device
US20140078311A1 (en) * 2012-09-18 2014-03-20 Samsung Electronics Co., Ltd. Method for guiding controller to move to within recognizable range of multimedia apparatus, the multimedia apparatus, and target tracking apparatus thereof
US8704765B1 (en) 2011-04-07 2014-04-22 Google Inc. Methods and apparatus related to cursor device calibration
US8823647B2 (en) 2012-01-31 2014-09-02 Konami Digital Entertainment Co., Ltd. Movement control device, control method for a movement control device, and non-transitory information storage medium
CN104424649A (en) * 2013-08-21 2015-03-18 株式会社理光 Method and system for detecting moving object
EP2474881A3 (en) * 2011-01-06 2015-04-22 Samsung Electronics Co., Ltd. Display apparatus controlled by a motion, and motion control method thereof
US20150172531A1 (en) * 2013-12-12 2015-06-18 Canon Kabushiki Kaisha Image capturing apparatus, communication apparatus, and control method therefor
WO2016048262A1 (en) * 2014-09-22 2016-03-31 Hewlett-Packard Development Company, L.P. Cursor control using images
US20160187990A1 (en) * 2014-12-26 2016-06-30 Samsung Electronics Co., Ltd. Method and apparatus for processing gesture input
EP2306272A3 (en) * 2009-09-04 2016-10-19 Sony Corporation Information processing apparatus, method for controlling display and program for controlling display
US9513711B2 (en) 2011-01-06 2016-12-06 Samsung Electronics Co., Ltd. Electronic device controlled by a motion and controlling method thereof using different motions to activate voice versus motion recognition
EP2651117A3 (en) * 2012-04-13 2017-03-15 Samsung Electronics Co., Ltd Camera apparatus and control method thereof

Families Citing this family (306)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8576199B1 (en) 2000-02-22 2013-11-05 Apple Inc. Computer control systems
US20090273574A1 (en) 1995-06-29 2009-11-05 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8482535B2 (en) 1999-11-08 2013-07-09 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
US7466843B2 (en) * 2000-07-07 2008-12-16 Pryor Timothy R Multi-functional control and entertainment systems
US8228305B2 (en) 1995-06-29 2012-07-24 Apple Inc. Method for providing human input to a computer
US5623588A (en) 1992-12-14 1997-04-22 New York University Computer user interface with non-salience deemphasis
US20080158261A1 (en) 1992-12-14 2008-07-03 Eric Justin Gould Computer user interface for audio and/or video auto-summarization
US8381126B2 (en) 1992-12-14 2013-02-19 Monkeymedia, Inc. Computer user interface with non-salience deemphasis
US6947571B1 (en) * 1999-05-19 2005-09-20 Digimarc Corporation Cell phones with optical capabilities, and related applications
US8874244B2 (en) 1999-05-19 2014-10-28 Digimarc Corporation Methods and systems employing digital content
US7406214B2 (en) 1999-05-19 2008-07-29 Digimarc Corporation Methods and devices employing optical sensors and/or steganography
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US8202094B2 (en) * 1998-02-18 2012-06-19 Radmila Solutions, L.L.C. System and method for training users with audible answers to spoken questions
US7148909B2 (en) * 1998-05-27 2006-12-12 Canon Kabushiki Kaisha Image display system capable of displaying and scaling images on plurality of image sources and display control method therefor
JP2000163196A (en) * 1998-09-25 2000-06-16 Sanyo Electric Co Ltd Gesture recognizing device and instruction recognizing device having gesture recognizing function
US7212197B1 (en) * 1999-02-01 2007-05-01 California Institute Of Technology Three dimensional surface drawing controlled by hand motion
US20060061551A1 (en) * 1999-02-12 2006-03-23 Vega Vista, Inc. Motion detection and tracking system to control navigation and display of portable displays including on-chip gesture detection
US20060279542A1 (en) * 1999-02-12 2006-12-14 Vega Vista, Inc. Cellular phones and mobile devices with motion driven control
US6393158B1 (en) 1999-04-23 2002-05-21 Monkeymedia, Inc. Method and storage device for expanding and contracting continuous play media seamlessly
US6621980B1 (en) * 1999-04-23 2003-09-16 Monkeymedia, Inc. Method and apparatus for seamless expansion of media
US7760905B2 (en) * 1999-06-29 2010-07-20 Digimarc Corporation Wireless mobile phone with content processing
US8391851B2 (en) * 1999-11-03 2013-03-05 Digimarc Corporation Gestural techniques with wireless mobile phone devices
US6592223B1 (en) * 1999-10-07 2003-07-15 Panaseca, Inc. System and method for optimal viewing of computer monitors to minimize eyestrain
US7233312B2 (en) * 2000-07-31 2007-06-19 Panaseca, Inc. System and method for optimal viewing of computer monitors to minimize eyestrain
FR2799916A1 (en) * 1999-10-15 2001-04-20 Yves Jean Paul Guy Reza Control interface for television or video recorder comprises detectors sensing screen area to enable sensing of user inputs
US6901561B1 (en) * 1999-10-19 2005-05-31 International Business Machines Corporation Apparatus and method for using a target based computer vision system for user interaction
US6608648B1 (en) * 1999-10-21 2003-08-19 Hewlett-Packard Development Company, L.P. Digital camera cursor control by sensing finger position on lens cap
US20020032734A1 (en) 2000-07-26 2002-03-14 Rhoads Geoffrey B. Collateral data combined with user characteristics to select web site
US6944315B1 (en) * 2000-10-31 2005-09-13 Intel Corporation Method and apparatus for performing scale-invariant gesture recognition
US7095401B2 (en) * 2000-11-02 2006-08-22 Siemens Corporate Research, Inc. System and method for gesture interface
US20020174426A1 (en) * 2001-05-15 2002-11-21 Koninklijke Philips Electronics N.V Method and apparatus for activating a media player based on user behavior
JP4366886B2 (en) * 2001-05-24 2009-11-18 コニカミノルタビジネステクノロジーズ株式会社 Apparatus and method for image recognition
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US7348963B2 (en) * 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US8035612B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US8300042B2 (en) 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
JP3811025B2 (en) * 2001-07-03 2006-08-16 株式会社日立製作所 Network system
WO2003023701A3 (en) * 2001-09-07 2004-02-05 Me In Gmbh Operating device
JP4974319B2 (en) * 2001-09-10 2012-07-11 株式会社バンダイナムコゲームス Image generation system, program, and information storage medium
US7023499B2 (en) * 2001-09-21 2006-04-04 Williams Cassandra S Television receiver with motion sensor
KR100426174B1 (en) * 2001-10-29 2004-04-06 삼성전자주식회사 Method for controlling a camera using video compression algorithm
US20030095154A1 (en) * 2001-11-19 2003-05-22 Koninklijke Philips Electronics N.V. Method and apparatus for a gesture-based user interface
US8570378B2 (en) * 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US6564144B1 (en) * 2002-01-10 2003-05-13 Navigation Technologies Corporation Method and system using a hand-gesture responsive device for collecting data for a geographic database
US20030132913A1 (en) * 2002-01-11 2003-07-17 Anton Issinski Touchless computer input device to control display cursor mark position by using stereovision input from two video cameras
US20030179249A1 (en) * 2002-02-12 2003-09-25 Frank Sauer User interface for three-dimensional data sets
US7782297B2 (en) * 2002-07-27 2010-08-24 Sony Computer Entertainment America Inc. Method and apparatus for use in determining an activity level of a user in relation to a system
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US20070015559A1 (en) * 2002-07-27 2007-01-18 Sony Computer Entertainment America Inc. Method and apparatus for use in determining lack of user activity in relation to a system
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
DE10236937A1 (en) * 2002-08-12 2004-02-26 BSH Bosch und Siemens Hausgeräte GmbH Operating panel for household device, e.g. washing machine, with movement detector to activate indicator displays and lights on panel only when user is nearby to save power
US6654001B1 (en) * 2002-09-05 2003-11-25 Kye Systems Corp. Hand-movement-sensing input device
US7358963B2 (en) 2002-09-09 2008-04-15 Apple Inc. Mouse having an optically-based scrolling feature
US8245252B2 (en) * 2002-09-10 2012-08-14 Caption Tv, Inc. System, method, and computer program product for selective replacement of objectionable program content with less-objectionable content
US6996460B1 (en) * 2002-10-03 2006-02-07 Advanced Interfaces, Inc. Method and apparatus for providing virtual touch interaction in the drive-thru
US7030856B2 (en) * 2002-10-15 2006-04-18 Sony Corporation Method and system for controlling a display device
KR100575906B1 (en) * 2002-10-25 2006-05-02 각고호우징 게이오기주크 Hand pattern switching apparatus
JP2004173003A (en) * 2002-11-20 2004-06-17 Toshiba Corp Broadcast receiver, code signal output device and its control method
WO2004053823A1 (en) * 2002-12-09 2004-06-24 Adam Kaplan Method and apparatus for user interface
US7576727B2 (en) * 2002-12-13 2009-08-18 Matthew Bell Interactive directed light/sound system
WO2004055776A9 (en) 2002-12-13 2004-08-19 Matthew Bell Interactive directed light/sound system
US20040119682A1 (en) * 2002-12-18 2004-06-24 International Business Machines Corporation Self-correcting autonomic mouse
US20040196400A1 (en) * 2003-04-07 2004-10-07 Stavely Donald J. Digital camera user interface using hand gestures
EP1665015A2 (en) * 2003-04-14 2006-06-07 Philips Intellectual Property & Standards GmbH Electric apparatus and method of communication between an apparatus and a user
US20040252101A1 (en) * 2003-06-12 2004-12-16 International Business Machines Corporation Input device that detects user's proximity
US20050115816A1 (en) * 2003-07-23 2005-06-02 Neil Gelfond Accepting user control
US20050018172A1 (en) * 2003-07-23 2005-01-27 Neil Gelfond Accepting user control
JP3752246B2 (en) * 2003-08-11 2006-03-08 三菱ふそうトラック・バス株式会社 Hand pattern switch device
JP4306397B2 (en) * 2003-10-08 2009-07-29 株式会社日立製作所 Recognition processing system
WO2005041579A3 (en) 2003-10-24 2006-10-05 Matthew Bell Method and system for processing captured image information in an interactive video display system
CN102034197A (en) 2003-10-24 2011-04-27 瑞克楚斯系统公司 Method and system for managing an interactive video display system
US20050104850A1 (en) * 2003-11-17 2005-05-19 Chia-Chang Hu Cursor simulator and simulating method thereof for using a limb image to control a cursor
US6969964B2 (en) * 2004-01-26 2005-11-29 Hewlett-Packard Development Company, L.P. Control device and method of use
JP2005242694A (en) * 2004-02-26 2005-09-08 Keio Gijuku Hand pattern switching apparatus
JP3746060B2 (en) * 2004-07-20 2006-02-15 コナミ株式会社 Gaming device, a computer control method, and program
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US8560972B2 (en) * 2004-08-10 2013-10-15 Microsoft Corporation Surface UI for gesture-based interaction
CN101065766A (en) * 2004-09-03 2007-10-31 潘那西卡股份有限公司 Vision center kiosk
JP4419768B2 (en) * 2004-09-21 2010-02-24 日本ビクター株式会社 Control device for an electronic equipment
US20060152482A1 (en) 2005-01-07 2006-07-13 Chauncy Godwin Virtual interface and control device
US9910497B2 (en) * 2006-02-08 2018-03-06 Oblong Industries, Inc. Gestural control of autonomous and semi-autonomous systems
US9684380B2 (en) 2009-04-02 2017-06-20 Oblong Industries, Inc. Operating environment with gestural control and multiple client devices, displays, and users
US9823747B2 (en) 2006-02-08 2017-11-21 Oblong Industries, Inc. Spatial, multi-modal control device for use with spatial operating system
US8370383B2 (en) 2006-02-08 2013-02-05 Oblong Industries, Inc. Multi-process interactive systems and methods
US9317128B2 (en) 2009-04-02 2016-04-19 Oblong Industries, Inc. Remote devices used in a markerless installation of a spatial operating environment incorporating gestural control
US9740293B2 (en) 2009-04-02 2017-08-22 Oblong Industries, Inc. Operating environment with gestural control and multiple client devices, displays, and users
CN101536494B (en) 2005-02-08 2017-04-26 奥布隆工业有限公司 A system and method for gesture-based control system
KR101821418B1 (en) * 2009-05-04 2018-01-23 오블롱 인더스트리즈, 인크 Gesture-based control systems including the representation, manipulation, and exchange of data
US9075441B2 (en) * 2006-02-08 2015-07-07 Oblong Industries, Inc. Gesture based control using three-dimensional information extracted over an extended depth of field
US8537112B2 (en) * 2006-02-08 2013-09-17 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
US9740922B2 (en) 2008-04-24 2017-08-22 Oblong Industries, Inc. Adaptive tracking system for spatial input devices
US8531396B2 (en) 2006-02-08 2013-09-10 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
US20130076616A1 (en) * 2008-04-24 2013-03-28 Ambrus Csaszar Adaptive tracking system for spatial input devices
US9495013B2 (en) 2008-04-24 2016-11-15 Oblong Industries, Inc. Multi-modal gestural interface
US8723795B2 (en) 2008-04-24 2014-05-13 Oblong Industries, Inc. Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes
US8537111B2 (en) * 2006-02-08 2013-09-17 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
KR100687737B1 (en) * 2005-03-19 2007-02-27 한국전자통신연구원 Apparatus and method for a virtual mouse based on two-hands gesture
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US7415352B2 (en) * 2005-05-20 2008-08-19 Bose Corporation Displaying vehicle information
US7548230B2 (en) * 2005-05-27 2009-06-16 Sony Computer Entertainment Inc. Remote input device
US8427426B2 (en) * 2005-05-27 2013-04-23 Sony Computer Entertainment Inc. Remote input device
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
KR100724939B1 (en) * 2005-06-20 2007-06-04 삼성전자주식회사 Method for implementing user interface using camera module and mobile communication terminal therefor
WO2007003195A1 (en) * 2005-07-04 2007-01-11 Bang & Olufsen A/S A unit, an assembly and a method for controlling in a dynamic egocentric interactive space
US20070035411A1 (en) * 2005-08-10 2007-02-15 Nokia Corporation Service selection
US20070045257A1 (en) * 2005-08-30 2007-03-01 United Technologies Corporation Laser control system
US20070045250A1 (en) * 2005-08-30 2007-03-01 United Technologies Corporation Method for manually laser welding metallic parts
US7746985B2 (en) * 2005-09-14 2010-06-29 Sorenson Communications, Inc. Method, system and device for relay call transfer service
US7746984B2 (en) * 2005-09-14 2010-06-29 Sorenson Communications, Inc. Method and system for call initiation in a video relay service
US7742068B2 (en) * 2005-09-14 2010-06-22 Sorenson Communications, Inc. Method and system for auto configuration in a video phone system
US20070057912A1 (en) * 2005-09-14 2007-03-15 Romriell Joseph N Method and system for controlling an interface of a device through motion gestures
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
KR100800998B1 (en) * 2005-12-24 2008-02-11 삼성전자주식회사 Apparatus and method for home network device controlling
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
DE102006037156A1 (en) * 2006-03-22 2007-09-27 Volkswagen Ag Interactive control apparatus and method for operating the interactive control device
JP4692371B2 (en) * 2006-04-26 2011-06-01 オムロン株式会社 Image processing apparatus, image processing method, image processing program, and a recording medium recording an image processing program and a moving object detection system,
JP5028038B2 (en) * 2006-07-06 2012-09-19 クラリオン株式会社 Display method of the vehicle display device and the in-vehicle display device
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US8144121B2 (en) * 2006-10-11 2012-03-27 Victor Company Of Japan, Limited Method and apparatus for controlling electronic appliance
US20080098448A1 (en) * 2006-10-19 2008-04-24 Sony Computer Entertainment America Inc. Controller configured to track user's level of anxiety and other mental and physical attributes
US20080096654A1 (en) * 2006-10-20 2008-04-24 Sony Computer Entertainment America Inc. Game control using three-dimensional motions of controller
US7767318B2 (en) * 2006-11-21 2010-08-03 United Technologies Corporation Laser fillet welding
US8508472B1 (en) 2006-11-28 2013-08-13 James W. Wieder Wearable remote control with a single control button
JP4720738B2 (en) * 2006-12-20 2011-07-13 日本ビクター株式会社 Electronics
US8407725B2 (en) * 2007-04-24 2013-03-26 Oblong Industries, Inc. Proteins, pools, and slawx in processing environments
US7889175B2 (en) 2007-06-28 2011-02-15 Panasonic Corporation Touchpad-enabled remote controller and user interaction methods
CA2591808A1 (en) * 2007-07-11 2009-01-11 Hsien-Hsiang Chiu Intelligent object tracking and gestures sensing input device
US8031272B2 (en) * 2007-07-19 2011-10-04 International Business Machines Corporation System and method of adjusting viewing angle for display
JP5055156B2 (en) * 2007-08-24 2012-10-24 国立交通大学 Control apparatus and method
CN101378456B (en) * 2007-08-28 2010-06-02 鸿富锦精密工业(深圳)有限公司;鸿海精密工业股份有限公司 Apparatus for sensing electronic image and remote-control method thereof
US20090066648A1 (en) * 2007-09-07 2009-03-12 Apple Inc. Gui applications for use with 3d remote controller
CN101952818B (en) 2007-09-14 2016-05-25 智慧投资控股81有限责任公司 Processing gesture-based user interactions
JP4636064B2 (en) * 2007-09-18 2011-02-23 ソニー株式会社 Image processing apparatus and image processing method, and program
JP4569613B2 (en) * 2007-09-19 2010-10-27 ソニー株式会社 Image processing apparatus and image processing method, and program
US20090104993A1 (en) * 2007-10-17 2009-04-23 Zhou Ye Electronic game controller with motion-sensing capability
US20090109036A1 (en) * 2007-10-29 2009-04-30 The Boeing Company System and Method for Alternative Communication
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
JP4670860B2 (en) 2007-11-22 2011-04-13 ソニー株式会社 Recording and reproducing apparatus
US8780278B2 (en) * 2007-11-30 2014-07-15 Microsoft Corporation Motion-sensing remote control
KR101079598B1 (en) * 2007-12-18 2011-11-03 삼성전자주식회사 Display apparatus and control method thereof
US8115877B2 (en) * 2008-01-04 2012-02-14 International Business Machines Corporation System and method of adjusting viewing angle for display based on viewer positions and lighting conditions
US20120202569A1 (en) * 2009-01-13 2012-08-09 Primesense Ltd. Three-Dimensional User Interface for Game Applications
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US9772689B2 (en) 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US20110102570A1 (en) * 2008-04-14 2011-05-05 Saar Wilf Vision based pointing device emulation
US8031175B2 (en) * 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
US20090266623A1 (en) * 2008-04-28 2009-10-29 Chih-Wei Wang Remote sensing controller
US20090284469A1 (en) * 2008-05-16 2009-11-19 Tatung Company Video based apparatus and method for controlling the cursor
US8595218B2 (en) 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
WO2009155465A1 (en) * 2008-06-18 2009-12-23 Oblong Industries, Inc. Gesture-based control system for vehicle interfaces
KR101617562B1 (en) * 2008-07-01 2016-05-02 힐크레스트 래보래토리스, 인크. 3d pointer mapping
US8154428B2 (en) * 2008-07-15 2012-04-10 International Business Machines Corporation Gesture recognition control of electronic devices using a multi-touch device
US8204274B2 (en) * 2008-07-21 2012-06-19 Industrial Technology Research Institute Method and system for tracking positions of human extremities
US8305345B2 (en) * 2008-08-07 2012-11-06 Life Technologies Co., Ltd. Multimedia playing device
US8582957B2 (en) 2008-09-22 2013-11-12 EchoStar Technologies, L.L.C. Methods and apparatus for visually displaying recording timer information
US8572651B2 (en) 2008-09-22 2013-10-29 EchoStar Technologies, L.L.C. Methods and apparatus for presenting supplemental information in an electronic programming guide
US8937687B2 (en) 2008-09-30 2015-01-20 Echostar Technologies L.L.C. Systems and methods for graphical control of symbol-based features in a television receiver
US8473979B2 (en) 2008-09-30 2013-06-25 Echostar Technologies L.L.C. Systems and methods for graphical adjustment of an electronic program guide
US8763045B2 (en) 2008-09-30 2014-06-24 Echostar Technologies L.L.C. Systems and methods for providing customer service features via a graphical user interface in a television receiver
US8793735B2 (en) 2008-09-30 2014-07-29 EchoStar Technologies, L.L.C. Methods and apparatus for providing multiple channel recall on a television receiver
US8397262B2 (en) * 2008-09-30 2013-03-12 Echostar Technologies L.L.C. Systems and methods for graphical control of user interface features in a television receiver
US8133119B2 (en) * 2008-10-01 2012-03-13 Microsoft Corporation Adaptation for alternate gaming input devices
US9100614B2 (en) 2008-10-31 2015-08-04 Echostar Technologies L.L.C. Graphical interface navigation based on image element proximity
US8502787B2 (en) 2008-11-26 2013-08-06 Panasonic Corporation System and method for differentiating between intended and unintended user input on a touchpad
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
JP5619775B2 (en) 2009-01-30 2014-11-05 Thomson Licensing Method for controlling and requesting multimedia information from a display
US9652030B2 (en) * 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US8866821B2 (en) * 2009-01-30 2014-10-21 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
US8294767B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Body scan
JP5364925B2 (en) * 2009-02-27 2013-12-11 現代自動車株式会社 Input device for in-vehicle equipment
KR20100101389A (en) * 2009-03-09 2010-09-17 삼성전자주식회사 Display apparatus for providing a user menu, and method for providing ui applied thereto
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US8773355B2 (en) * 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US9256282B2 (en) 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US8253746B2 (en) * 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US8638985B2 (en) 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US9015638B2 (en) * 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US9498718B2 (en) * 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US8181123B2 (en) 2009-05-01 2012-05-15 Microsoft Corporation Managing virtual port associations to users in a gesture-based computing environment
US20100277470A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Systems And Methods For Applying Model Tracking To Motion Capture
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US20100295782A1 (en) 2009-05-21 2010-11-25 Yehuda Binder System and method for control based on face or hand gesture detection
US8803889B2 (en) 2009-05-29 2014-08-12 Microsoft Corporation Systems and methods for applying animations or motions to a character
US8379101B2 (en) 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US8542252B2 (en) 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US9182814B2 (en) * 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US8145594B2 (en) * 2009-05-29 2012-03-27 Microsoft Corporation Localized gesture aggregation
US20100306685A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation User movement feedback via on-screen avatars
US8176442B2 (en) * 2009-05-29 2012-05-08 Microsoft Corporation Living cursor control mechanics
US8418085B2 (en) * 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US20100302138A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Methods and systems for defining or modifying a visual representation
US20100306716A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Extending standard gestures
US8856691B2 (en) * 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US9383823B2 (en) * 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US7914344B2 (en) * 2009-06-03 2011-03-29 Microsoft Corporation Dual-barrel, connector jack and plug assemblies
US8390680B2 (en) * 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9159151B2 (en) * 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US20110025689A1 (en) * 2009-07-29 2011-02-03 Microsoft Corporation Auto-Generating A Visual Representation
DE102009037316A1 (en) * 2009-08-14 2011-02-17 Karl Storz Gmbh & Co. Kg Control and method for operating an operating lamp
US9141193B2 (en) * 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
US8878779B2 (en) * 2009-09-21 2014-11-04 Extreme Reality Ltd. Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
GB2483168B (en) * 2009-10-13 2013-06-12 Pointgrab Ltd Computer vision gesture based control of a device
US20110109617A1 (en) * 2009-11-12 2011-05-12 Microsoft Corporation Visualizing Depth
US20110164032A1 (en) * 2010-01-07 2011-07-07 Prime Sense Ltd. Three-Dimensional User Interface
JP5529568B2 (en) * 2010-02-05 2014-06-25 キヤノン株式会社 Image processing apparatus, imaging apparatus, control method, and program
US8522308B2 (en) * 2010-02-11 2013-08-27 Verizon Patent And Licensing Inc. Systems and methods for providing a spatial-input-based multi-user shared display experience
US9535493B2 (en) * 2010-04-13 2017-01-03 Nokia Technologies Oy Apparatus, method, computer program and user interface
KR101121746B1 (en) 2010-04-19 2012-03-22 한국과학기술원 Method and apparatus for hand-gesture based user interaction technique for 3-dimensional user interface
US8593402B2 (en) 2010-04-30 2013-11-26 Verizon Patent And Licensing Inc. Spatial-input-based cursor projection systems and methods
US8368819B2 (en) * 2010-04-30 2013-02-05 Hon Hai Precision Industry Co., Ltd. Remote control system and method of television control
US9310887B2 (en) 2010-05-06 2016-04-12 James W. Wieder Handheld and wearable remote-controllers
US8457353B2 (en) * 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US20110310010A1 (en) * 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
CN102959616B (en) 2010-07-20 2015-06-10 苹果公司 Interactive reality augmentation for natural interaction
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9167289B2 (en) 2010-09-02 2015-10-20 Verizon Patent And Licensing Inc. Perspective display systems and methods
US9213890B2 (en) * 2010-09-17 2015-12-15 Sony Corporation Gesture recognition system for TV control
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8957856B2 (en) 2010-10-21 2015-02-17 Verizon Patent And Licensing Inc. Systems, methods, and apparatuses for spatial input associated with a display
KR20120046973A (en) * 2010-11-03 2012-05-11 삼성전자주식회사 Method and apparatus for generating motion information
US20120139827A1 (en) * 2010-12-02 2012-06-07 Li Kevin A Method and apparatus for interacting with projected displays using shadows
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US20130154913A1 (en) * 2010-12-16 2013-06-20 Siemens Corporation Systems and methods for a gaze and gesture interface
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US20120206348A1 (en) * 2011-02-10 2012-08-16 Kim Sangki Display device and method of controlling the same
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
KR101151962B1 (en) * 2011-02-16 2012-06-01 김석중 Virtual touch apparatus and method without pointer on the screen
KR101381928B1 (en) * 2011-02-18 2014-04-07 주식회사 브이터치 virtual touch apparatus and method without pointer on the screen
DE102011011802A1 (en) 2011-02-19 2012-08-23 Volkswagen Ag Method and apparatus for providing a user interface, in particular in a vehicle
US8928589B2 (en) * 2011-04-20 2015-01-06 Qualcomm Incorporated Virtual keyboards and methods of providing the same
KR20120119440A (en) * 2011-04-21 2012-10-31 삼성전자주식회사 Method for recognizing user's gesture in a electronic device
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US20120288251A1 (en) * 2011-05-13 2012-11-15 Cyberlink Corp. Systems and methods for utilizing object detection to adaptively adjust controls
KR20120130466A (en) * 2011-05-23 2012-12-03 삼성전자주식회사 Device and method for controlling data of external device in wireless terminal
US8769409B2 (en) 2011-05-27 2014-07-01 Cyberlink Corp. Systems and methods for improving object detection
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
KR101789683B1 (en) * 2011-06-13 2017-11-20 삼성전자주식회사 Display apparatus and Method for controlling display apparatus and remote controller
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
KR101235432B1 (en) * 2011-07-11 2013-02-22 김석중 Remote control apparatus and method using virtual touch of electronic device modeled in three dimensions
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
KR20130078490A (en) * 2011-12-30 2013-07-10 삼성전자주식회사 Electronic apparatus and method for controlling electronic apparatus thereof
US9070019B2 (en) 2012-01-17 2015-06-30 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US8693731B2 (en) 2012-01-17 2014-04-08 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US8638989B2 (en) 2012-01-17 2014-01-28 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US9336456B2 (en) 2012-01-25 2016-05-10 Bruno Delean Systems, methods and computer program products for identifying objects in video data
US20130194180A1 (en) * 2012-01-27 2013-08-01 Lg Electronics Inc. Device and method of controlling the same
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
JP5884584B2 (en) * 2012-03-19 2016-03-15 富士通株式会社 Information processing apparatus, menu selection program, and menu selection method
US20130249793A1 (en) * 2012-03-22 2013-09-26 Ingeonix Corporation Touch free user input recognition
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
US8938124B2 (en) 2012-05-10 2015-01-20 Pointgrab Ltd. Computer vision based tracking of a hand
US9195310B2 (en) 2012-07-09 2015-11-24 Samsung Electronics Co., Ltd. Camera cursor system
US9280201B2 (en) * 2012-07-09 2016-03-08 Mstar Semiconductor, Inc. Electronic device and digital display device
CN103702058B (en) * 2012-09-27 2015-09-16 珠海扬智电子科技有限公司 Macroblock state identification method for de-interlacing operation and image processing apparatus
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
KR101416378B1 (en) * 2012-11-27 2014-07-09 현대자동차 주식회사 Display apparatus capable of moving an image and method thereof
US9465461B2 (en) 2013-01-08 2016-10-11 Leap Motion, Inc. Object detection and tracking with audio and optical signals
US20140201684A1 (en) 2013-01-15 2014-07-17 Leap Motion, Inc. Dynamic user interactions for display control and manipulation of display objects
CN105027032A (en) * 2013-01-22 2015-11-04 科智库公司 Scalable input from tracked object
US9524028B2 (en) 2013-03-08 2016-12-20 Fastvdo Llc Visual language for human computer interfaces
US9704350B1 (en) 2013-03-14 2017-07-11 Harmonix Music Systems, Inc. Musical combat game
WO2014200589A3 (en) 2013-03-15 2015-03-19 Leap Motion, Inc. Determining positional information for an object in space
KR20140113137A (en) * 2013-03-15 2014-09-24 삼성전자주식회사 Display apparatus and control method thereof
US9654763B2 (en) * 2013-04-02 2017-05-16 Htc Corporation Controlling method of detecting image-capturing gesture
US9525906B2 (en) * 2013-04-08 2016-12-20 Hon Hai Precision Industry Co., Ltd. Display device and method of controlling the display device
US9749541B2 (en) * 2013-04-16 2017-08-29 Tout Inc. Method and apparatus for displaying and recording images using multiple image capturing devices integrated into a single mobile device
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
US9747696B2 (en) 2013-05-17 2017-08-29 Leap Motion, Inc. Systems and methods for providing normalized parameters of motions of objects in three-dimensional space
US9829984B2 (en) 2013-05-23 2017-11-28 Fastvdo Llc Motion-assisted visual language for human computer interfaces
FR3006477B1 (en) * 2013-05-29 2016-09-30 Blinksight Device and method for detecting manipulation of at least one object
JP5941896B2 (en) * 2013-11-26 2016-06-29 京セラドキュメントソリューションズ株式会社 Operation display device
US9857971B2 (en) * 2013-12-02 2018-01-02 Industrial Technology Research Institute System and method for receiving user input and program storage medium thereof
US9613262B2 (en) 2014-01-15 2017-04-04 Leap Motion, Inc. Object detection and tracking for providing a virtual device experience
US20160026257A1 (en) * 2014-07-23 2016-01-28 Orcam Technologies Ltd. Wearable unit for selectively withholding actions based on recognized gestures
US9696813B2 (en) * 2015-05-27 2017-07-04 Hsien-Hsiang Chiu Gesture interface robot
CN104941203A (en) * 2015-06-03 2015-09-30 赵旭 Toy based on gesture track recognition and recognition and control method
US20180048859A1 (en) * 2016-08-15 2018-02-15 Purple Communications, Inc. Gesture-based control and usage of video relay service communications

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4988981B1 (en) * 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
JP2873338B2 (en) * 1991-09-17 1999-03-24 富士通株式会社 Moving object recognition apparatus
JP3727954B2 (en) * 1993-11-10 2005-12-21 キヤノン株式会社 Imaging device
JP3419050B2 (en) * 1993-11-19 2003-06-23 株式会社日立製作所 Input device
US5473364A (en) * 1994-06-03 1995-12-05 David Sarnoff Research Center, Inc. Video technique for indicating moving objects from a movable platform
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
WO1996034332A1 (en) * 1995-04-28 1996-10-31 Matsushita Electric Industrial Co., Ltd. Interface device
JPH09128141A (en) * 1995-11-07 1997-05-16 Sony Corp Controller and control method
JPH1091795A (en) * 1996-09-12 1998-04-10 Toshiba Corp Device and method for detecting moving object

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7653926B2 (en) 2000-12-29 2010-01-26 At&T Intellectual Property I, L.P. Remote control device with event notifier
US20020085128A1 (en) * 2000-12-29 2002-07-04 Stefanik John R. Remote control device with event notifier
US20020084898A1 (en) * 2000-12-29 2002-07-04 Stefanik John R. Remote control device with illumination
US6750801B2 (en) 2000-12-29 2004-06-15 Bellsouth Intellectual Property Corporation Remote control device with directional mode indicator
US6903655B2 (en) * 2000-12-29 2005-06-07 Bellsouth Intellectual Property Corp. Remote control device with illumination
US8441389B2 (en) 2000-12-29 2013-05-14 At&T Intellectual Property I, L.P. Remote control device with directional mode indicator
US6946970B2 (en) 2000-12-29 2005-09-20 Bellsouth Intellectual Property Corp. Remote control device with smart card capability
US20050206549A1 (en) * 2000-12-29 2005-09-22 Stefanik John R Remote control device with directional mode indicator
US7167122B2 (en) 2000-12-29 2007-01-23 Bellsouth Intellectual Property Corporation Remote control device with directional mode indicator
US20020084909A1 (en) * 2000-12-29 2002-07-04 Stefanik John R. Remote control device with smart card capability
US8069351B2 (en) 2000-12-29 2011-11-29 At&T Intellectual Property I, L.P. Remote control device
US9767657B2 (en) 2000-12-29 2017-09-19 At&T Intellectual Property I, L.P. Remote control device with directional mode indicator
US20050188416A1 (en) * 2004-02-09 2005-08-25 Canon Europa Nv Method and device for the distribution of an audiovisual signal in a communications network, corresponding validation method and device
WO2008068557A3 (en) * 2006-12-05 2008-07-31 Sony Ericsson Mobile Comm Ab Method and system for detecting movement of an object
US20080134102A1 (en) * 2006-12-05 2008-06-05 Sony Ericsson Mobile Communications Ab Method and system for detecting movement of an object
WO2008068557A2 (en) * 2006-12-05 2008-06-12 Sony Ericsson Mobile Communications Ab Method and system for detecting movement of an object
US8489544B2 (en) * 2007-06-04 2013-07-16 John P. Ford System and method for prioritization and display of aggregated data
US20090177045A1 (en) * 2007-06-04 2009-07-09 Ford John P System and method for data aggregation and prioritization
US20100079671A1 (en) * 2008-09-30 2010-04-01 Echostar Technologies Llc Systems and methods for graphical control of picture-in-picture windows
US9357262B2 (en) * 2008-09-30 2016-05-31 Echostar Technologies L.L.C. Systems and methods for graphical control of picture-in-picture windows
WO2010038218A1 (en) * 2008-10-03 2010-04-08 Exva - Experts In Video Analisys, Lda Method and system of interaction between actors and surfaces through motion detection
US20110239139A1 (en) * 2008-10-07 2011-09-29 Electronics And Telecommunications Research Institute Remote control apparatus using menu markup language
US9400563B2 (en) 2009-07-17 2016-07-26 Samsung Electronics Co., Ltd Apparatus and method for recognizing subject motion using a camera
EP2280377A1 (en) * 2009-07-17 2011-02-02 Samsung Electronics Co., Ltd Apparatus and method for recognizing subject motion using a camera
US20110013807A1 (en) * 2009-07-17 2011-01-20 Samsung Electronics Co., Ltd. Apparatus and method for recognizing subject motion using a camera
EP2306272A3 (en) * 2009-09-04 2016-10-19 Sony Corporation Information processing apparatus, method for controlling display and program for controlling display
CN102033696A (en) * 2009-09-24 2011-04-27 株式会社泛泰 Apparatus and method for controlling picture using image recognition
US20110069215A1 (en) * 2009-09-24 2011-03-24 Pantech Co., Ltd. Apparatus and method for controlling picture using image recognition
US8587710B2 (en) 2009-09-24 2013-11-19 Pantech Co., Ltd. Apparatus and method for controlling picture using image recognition
US20140053115A1 (en) * 2009-10-13 2014-02-20 Pointgrab Ltd. Computer vision gesture based control of a device
CN101783865A (en) * 2010-02-26 2010-07-21 中山大学;广州中大电讯科技有限公司 Digital set-top box and intelligent mouse control method based on same
EP2635952A4 (en) * 2010-11-01 2014-09-17 Thomson Licensing Method and device for detecting gesture inputs
US9189071B2 (en) 2010-11-01 2015-11-17 Thomson Licensing Method and device for detecting gesture inputs
EP2635952A1 (en) * 2010-11-01 2013-09-11 Thomson Licensing Method and device for detecting gesture inputs
CN102469293A (en) * 2010-11-17 2012-05-23 中兴通讯股份有限公司 Realization method and device for acquiring user input information in video service
US9398243B2 (en) 2011-01-06 2016-07-19 Samsung Electronics Co., Ltd. Display apparatus controlled by motion and motion control method thereof
US9513711B2 (en) 2011-01-06 2016-12-06 Samsung Electronics Co., Ltd. Electronic device controlled by a motion and controlling method thereof using different motions to activate voice versus motion recognition
EP2474881A3 (en) * 2011-01-06 2015-04-22 Samsung Electronics Co., Ltd. Display apparatus controlled by a motion, and motion control method thereof
EP2475183A1 (en) * 2011-01-06 2012-07-11 Samsung Electronics Co., Ltd. Display apparatus controlled by motion and motion control method thereof
CN102681658A (en) * 2011-01-06 2012-09-19 三星电子株式会社 Display apparatus controlled by motion and motion control method thereof
US8704765B1 (en) 2011-04-07 2014-04-22 Google Inc. Methods and apparatus related to cursor device calibration
US8730162B1 (en) 2011-04-07 2014-05-20 Google Inc. Methods and apparatus related to cursor device calibration
US8823647B2 (en) 2012-01-31 2014-09-02 Konami Digital Entertainment Co., Ltd. Movement control device, control method for a movement control device, and non-transitory information storage medium
US9654685B2 (en) 2012-04-13 2017-05-16 Samsung Electronics Co., Ltd Camera apparatus and control method thereof
EP2651117A3 (en) * 2012-04-13 2017-03-15 Samsung Electronics Co., Ltd Camera apparatus and control method thereof
WO2013179566A1 (en) * 2012-05-29 2013-12-05 Sony Corporation Image processing apparatus and program
US9704028B2 (en) 2012-05-29 2017-07-11 Sony Corporation Image processing apparatus and program
US9507999B2 (en) 2012-05-29 2016-11-29 Sony Corporation Image processing apparatus and program
US20140078311A1 (en) * 2012-09-18 2014-03-20 Samsung Electronics Co., Ltd. Method for guiding controller to move to within recognizable range of multimedia apparatus, the multimedia apparatus, and target tracking apparatus thereof
US9838573B2 (en) * 2012-09-18 2017-12-05 Samsung Electronics Co., Ltd Method for guiding controller to move to within recognizable range of multimedia apparatus, the multimedia apparatus, and target tracking apparatus thereof
CN104424649A (en) * 2013-08-21 2015-03-18 株式会社理光 Method and system for detecting moving object
US9584713B2 (en) * 2013-12-12 2017-02-28 Canon Kabushiki Kaisha Image capturing apparatus capable of specifying an object in image data based on object detection, motion detection and/or object recognition, communication apparatus communicating with image capturing apparatus, and control method therefor
US20150172531A1 (en) * 2013-12-12 2015-06-18 Canon Kabushiki Kaisha Image capturing apparatus, communication apparatus, and control method therefor
WO2016048262A1 (en) * 2014-09-22 2016-03-31 Hewlett-Packard Development Company, L.P. Cursor control using images
US9857878B2 (en) * 2014-12-26 2018-01-02 Samsung Electronics Co., Ltd. Method and apparatus for processing gesture input based on elliptical arc and rotation direction that corresponds to gesture input
US20160187990A1 (en) * 2014-12-26 2016-06-30 Samsung Electronics Co., Ltd. Method and apparatus for processing gesture input

Also Published As

Publication number Publication date Type
US6498628B2 (en) 2002-12-24 grant
JP2011170866A (en) 2011-09-01 application
US20020057383A1 (en) 2002-05-16 application
JP5222376B2 (en) 2013-06-26 grant
US6501515B1 (en) 2002-12-31 grant

Similar Documents

Publication Publication Date Title
US5355163A (en) Video camera that automatically maintains size and location of an image within a frame
US6061055A (en) Method of tracking objects with an imaging device
US6262763B1 (en) Actual size image display
US6297846B1 (en) Display control system for videoconference terminals
US7423669B2 (en) Monitoring system and setting method for the same
US20080266326A1 (en) Automatic image reorientation
US6049363A (en) Object detection method and system for scene change analysis in TV and IR data
US6160899A (en) Method of application menu selection and activation using image cognition
US20040070675A1 (en) System and method of processing a digital image for intuitive viewing
US5561471A (en) Apparatus and method for controlling the display of a caption on a screen and for maximizing the area devoted to presentation of the received video signal
US20100013917A1 (en) Method and system for performing surveillance
US6768563B1 (en) Image input system
US6337709B1 (en) Image display device
US7206029B2 (en) Picture-in-picture repositioning and/or resizing based on video content analysis
US6542625B1 (en) Method of detecting a specific object in an image signal
US20070126884A1 (en) Personal settings, parental control, and energy saving control of television with digital video camera
US6738041B2 (en) Using video information to control cursor position
US7173666B1 (en) System and method for displaying a non-standard aspect ratio image on a standard aspect ratio monitor
US20090109178A1 (en) Non-contact selection device
US20080192125A1 (en) Panoramic photography method and apparatus
US7385626B2 (en) Method and system for performing surveillance
US20010053292A1 (en) Image extracting apparatus and image extracting method
US5091785A (en) Picture-in-picture circuitry using field rate synchronization
US6385772B1 (en) Monitoring system having wireless remote viewing and control
US20110187731A1 (en) Marker display control device, integrated circuit, and marker display control method

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY ELECTRONICS INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IWAMURA, RYUICHI;REEL/FRAME:009519/0324

Effective date: 19981009

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IWAMURA, RYUICHI;REEL/FRAME:009519/0324

Effective date: 19981009

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY ELECTRONICS INC.;REEL/FRAME:036330/0420

Effective date: 20150731