WO2009111329A2 - Enhanced gesture-based image manipulation - Google Patents

Enhanced gesture-based image manipulation

Info

Publication number
WO2009111329A2
Authority
WO
WIPO (PCT)
Prior art keywords
user
image object
gesture
image
recognizing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2009/035544
Other languages
English (en)
French (fr)
Other versions
WO2009111329A3 (en)
Inventor
Evan Hildreth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GESTURETEK Inc
Original Assignee
GESTURETEK Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GESTURETEK Inc filed Critical GESTURETEK Inc
Priority to JP2010549767A priority Critical patent/JP5855343B2/ja
Publication of WO2009111329A2 publication Critical patent/WO2009111329A2/en
Publication of WO2009111329A3 publication Critical patent/WO2009111329A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Definitions

  • the present disclosure generally relates to controls (or widgets).
  • An input device or pointing device is a hardware component that allows a computer user to input data into a computer.
  • a control (or widget) is an interface element that the computer user interacts with, such as by using an input device, to provide a single interaction point for the manipulation of data.
  • a control may be used, for example, to view or manipulate images.
  • an enhanced approach is provided for capturing a user's gesture in free space with a camera, recognizing the gesture, and using the gesture as a user input to manipulate a computer-generated image.
  • images such as photos may be interacted with through straightforward, intuitive, and natural motions of the user's body.
  • a process includes recognizing, from first and second images, a user's gesture, determining an interaction command corresponding to the recognized user's gesture, and manipulating, based on the determined interaction command, an image object displayed in a user interface.
  • Implementations may include one or more of the following features.
  • the interaction command may include a selection command
  • manipulating the image object may further include selecting the image object for further manipulation.
  • Recognizing the user's gesture may further include detecting an arm-extended, fingers-extended, palm-forward hand pose of the user in the first image, and detecting an arm-extended, fingers-curled, palm-down hand pose of the user in the second image.
  • the interaction command may include an image pan command
  • manipulating the image object may further include panning the image object relative to the user interface. Recognizing the user's gesture may further include detecting a first position of an arm of the user in the first image, detecting a second position of the arm of the user in the second image, and determining a magnitude and direction of a change between the first position and the second position.
  • manipulating the image object may further include determining a displacement position of the image object correlating to the determined magnitude and direction, and displaying the image object in the displacement position.
  • manipulating the image object may further include determining a scroll magnitude and direction correlating to the determined magnitude and direction, and scrolling the image object based on the determined scroll magnitude and direction.
  • the interaction command may include an image zoom command
  • manipulating the image object may further include determining a magnification factor correlating to the determined magnitude and direction, and applying the determined magnification factor to the image object
  • manipulating the image object may further include determining an adjustment magnitude and direction correlating to the determined magnitude and direction, and iteratively adjusting a magnification factor of the image object based on the determined adjustment magnitude and direction
  • the interaction command may further include a rotation command
  • manipulating the object may further include rotating the image object relative to the user interface. Recognizing the user's gesture may further include detecting a first orientation of a hand of the user in the first image, detecting a second orientation of the hand of the user in the second image, and determining an orientation change between the first orientation and the second orientation.
  • manipulating the image object may further include determining displacement orientation of the image object correlating to the determined magnitude and direction, and displaying the image object in the displacement orientation
  • manipulating the image object may further include determining an adjustment magnitude and direction correlating to the determined magnitude and direction, and iteratively adjusting an orientation of the image object based on the determined adjustment magnitude and direction
  • the image object may be manipulated if a magnitude of the user's gesture exceeds a predetermined threshold
  • Manipulating the image object may further include selecting the image object based on recognizing the first selection gesture, adjusting, using a single adjustment technique associated with the first and second interaction gestures, the image object based on recognizing the first and second interaction gestures, and filtering the repositioning gesture
  • the interaction command may include a preview image command
  • manipulating the image object may further include selecting, from a plurality of preview image objects, the image object.
  • a system includes a user interface configured to display an image, and a processor. The processor is configured to recognize, from first and second images, a user's gesture, to determine an interaction command corresponding to the recognized user's gesture, and to manipulate, based on the determined interaction command, an image object displayed in the user interface.
  • a computer program product is tangibly embodied in a machine-readable medium
  • the computer program product includes instructions that, when read by a machine, operate to cause data processing apparatus to recognize, from first and second images, a user's gesture, to determine an interaction command corresponding to the recognized user's gesture, and to manipulate, based on the determined interaction command, an image object displayed in a user interface
  • FIG 1 is a contextual diagram demonstrating image manipulation using recognized gestures
  • FIG 2 is a block diagram of an exemplary device
  • FIG 3 is a flowchart of an exemplary process
  • FIGS 4 to 13 illustrate exemplary gestures and concomitant user interfaces
  • FIG 14 illustrates thumbnail grids
  • FIG 15 illustrates an example of the exterior appearance of a computing device that further includes a processor and a user interface
  • FIG 16 is a block diagram illustrating the internal architecture of the computer shown in FIG 15
  • an enhanced approach is provided for capturing a user's gesture in free-space with a camera, recognizing the gesture, and using the gesture as a user input to manipulate a computer-generated image
  • images such as photos may be interacted with through straightforward, intuitive, and natural motions of the user's body
  • a camera such as a depth camera may be used to control a computer or hub based on the recognition of gestures or changes in gestures of a user. Unlike touch-screen systems that suffer from the deleterious, obscuring effect of fingerprints, gesture-based input allows photos, videos, or other images to be clearly displayed or otherwise output based on the user's natural body movements or poses. With this advantage in mind, gestures may be recognized that allow a user to view, pan (i.e., move), size, rotate, and perform other manipulations on image objects.
  • a depth camera, which may also be referred to as a time-of-flight camera, may include infrared emitters and a sensor. The depth camera may produce a pulse of infrared light and subsequently measure the time it takes for the light to travel to an object and back to the sensor. A distance may be calculated based on the travel time.
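The travel-time-to-distance relationship described in this bullet can be sketched as follows; the constant and function names are illustrative, not from the patent:

```python
# Hedged sketch of the time-of-flight calculation described above: the
# measured round-trip time of the infrared pulse covers the distance to
# the object twice, so the one-way distance is half of (speed of light x time).
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(travel_time_s: float) -> float:
    """Return the distance in meters to the object that reflected the pulse."""
    return SPEED_OF_LIGHT_M_PER_S * travel_time_s / 2.0
```

For example, a pulse that returns after roughly 13.3 nanoseconds indicates an object about two meters from the sensor.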
  • a gesture is intended to refer to a form of nonverbal communication made with part of a human body, and is contrasted with verbal communication such as speech
  • a gesture may be defined by a movement, change or transformation between a first position, pose, or expression and a second pose, position or expression
  • Common gestures used in everyday discourse include, for instance, an “air quote” gesture, a bowing gesture, a curtsey, a cheek-kiss, a finger or hand motion, a genuflection, a head bobble or movement, a high-five, a nod, a sad face, a raised fist, a salute, a thumbs-up motion, a pinching gesture, a hand or body twisting gesture, or a finger pointing gesture.
  • a gesture may be detected using a camera, such as by analyzing an image of a user, or using a tilt sensor, such as by detecting an angle at which a user is holding or tilting a device.
  • a body part may make a gesture (or “gesticulate”) by changing its position (e.g., a waving motion), or the body part may gesticulate without changing its position (e.g., by making a clenched fist gesture).
  • although the enhanced control uses, as examples, hand and arm gestures to effect the control of functionality via camera input, other types of gestures may also be used.
  • FIG 1 is a contextual diagram demonstrating image manipulation using recognized gestures
  • user 101 is sitting in front of a user interface 102 output on a display of a media hub 103, and a camera 104, viewing one or more image objects (e.g., digital photographs or other images) on the user interface 102.
  • the user's right arm 105, right hand 106, and torso 107 are within the field-of-view 109 of the camera 104.
  • Background images, such as the sofa that the user 101 is sitting on or the torso and head of the user 101 themselves, are sampled, filtered, or otherwise ignored by the gesture recognition process.
  • the camera 104 may ignore all candidate or potential control objects disposed further than a certain distance away from the camera, where the distance is predefined or dynamically determined. In one instance, that distance could lie between the user's outstretched fist and the user's torso.
  • a plane could be dynamically defined in front of the user's torso, such that all motion or gestures that occur behind that plane are filtered out or otherwise ignored.
  • the user 101 may move his hand in any direction, for instance along a plane parallel to the user interface 102
  • the user may gesticulate by moving his right arm 105 in an upward motion, as illustrated in FIG 1 B
  • a pose of the hand 106 in which the fingers are closed in a fist (i.e., curled fingers, as illustrated in FIG 1A) is detected.
  • the change in position of the hand 106 and thus the gesture performed by the upward motion is also detected, recognized or otherwise determined
  • an image object movement or interaction command is determined, and the selected image object 110 is moved to a higher location on the user interface 102 (as illustrated in FIG 1 B), in a movement consistent with the detected motion of the arm 105
  • the magnitude, displacement, or velocity of the movement of the image object 110 may correlate to the magnitude, displacement, or velocity of the user's gesture
  • a hand movement along a plane parallel to the user interface 102 (i.e., in an X-Y direction) may cause a selected image object to move in the user interface 102 in a corresponding direction, over a distance proportional to the movement distance of the hand.
  • the distance may have a 1:1 relationship with the distance moved by the hand, some other relationship, or the relationship may be variable or dynamically determined. For instance, and as perhaps determined by an anatomical model, small movements at the outside extent of the user's reach may map or otherwise correspond to larger manipulations of the image in the user interface than would larger hand movements that occur directly in front of the user. Put another way, acceleration, deceleration, or other operations may be applied to gestures to determine or affect the magnitude of a concomitant image manipulation.
  • the magnitude may also be a function of distance and speed
  • a magnitude-multiplier may adapt to a user's style over a period of time, based upon the distance and speed at which the user has performed previous gestures recorded over a period of time.
  • the magnitude-multiplier may adapt to a user's style while the gesture is being performed, based on the speed observed during the gesture
  • the magnitude-multiplier may be decreased if the user moves more quickly (for users whose style is to flail their arms wildly), or increased if the user moves more slowly (for users whose style is more deliberate).
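The adaptive magnitude-multiplier described in the last two bullets can be sketched as below; the function name, update rule, and rate parameter are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch of a magnitude-multiplier that adapts to a user's
# style: it shrinks for users who gesture faster than their baseline and
# grows for users who gesture more slowly, as described above.
def adapt_multiplier(multiplier: float, observed_speed: float,
                     baseline_speed: float, rate: float = 0.1) -> float:
    """Nudge the multiplier down when the user moves faster than the
    baseline, and up when the user moves more slowly."""
    if observed_speed > baseline_speed:
        return multiplier * (1.0 - rate)
    return multiplier * (1.0 + rate)
```

Applied after each gesture (or continuously during one), this damps the on-screen effect of wild arm movements while amplifying deliberate ones.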
  • Movement gestures may result in other image manipulation commands
  • movement gestures may be used as part of a magnification feature
  • a sub- region of an image object may be shown on the user interface 102
  • Movement gestures may move the sub-region within the image object
  • movement gestures may be used to view image objects in a directory or list of image objects
  • Image objects may be scaled to fit the size of the user interface 102 (i.e., one image object may be displayed on the user interface 102 at a time).
  • a movement gesture may “flip” to the next or previous image object (i.e., the next and previous image objects may not be displayed until they are “flipped” in).
  • the direction of the movement of the image object 110 may be the same as, orthogonal to, may mirror, or have any other relationship with the movement of the hand 106
  • an upward hand gesture may cause the image object 110 to move upward, as if the user 101 is yanking the image object vertically.
  • a gesture to the right may cause the image object 110 to move to the right in relation to the user (i.e., moving left on the user interface 102), or to move right on the user interface (i.e., moving left in relation to the user).
  • This mapping of directions may be preset, may be user selectable, or may be determined based on past use
  • an upward hand gesture may operate in the same way as the upward movement of a scroll bar, causing the image object 110 actually to move down
  • a gesture to the right may cause the image object 110 to operate as if a scroll bar is moved to the right (i.e., moving the image object to the left in relation to the user and to the right on the user interface 102), and vice versa.
  • FIG 2 is a block diagram of a device 200 used to implement image manipulation. Briefly, and among other things, the device 200 includes a user interface 201, a storage medium 202, a camera 204, a processor 205, and a tilt sensor 209.
  • the user interface 201 is a mechanism for allowing a user to interact with the device 200, or with applications invoked by the device 200
  • the user interface 201 may provide a mechanism for both input and output, allowing a user to manipulate the device or for the device to produce the effects of the user's manipulation
  • the device 200 may utilize any type of user interface 201, such as a graphical user interface (GUI), a voice user interface, or a tactile user interface.
  • the user interface 201 may be configured to render a visual display image
  • the user interface 201 may be a monitor, a television, a liquid crystal display (LCD), a plasma display device, a projector with a projector screen, an auto-stereoscopic display, a cathode ray tube (CRT) display, a digital light processing (DLP) display, or any other type of display device configured to render a display image.
  • the user interface 201 may include one or more display devices
  • the user interface 201 may be configured to display images associated with an application, such as display images generated by an application, including an object or representation such as an avatar
  • the storage medium 202 stores and records information or data, and may be an optical storage medium, magnetic storage medium, flash memory, or any other storage medium type. Among other things, the storage medium is encoded with an enhanced control application 207 that effects enhanced input using recognized gestures.
  • the camera 204 is a device used to capture images, either as still photographs or a sequence of moving images.
  • the camera 204 may use the light of the visible spectrum or with other portions of the electromagnetic spectrum, such as infrared.
  • the camera 204 may be a digital camera, a digital video camera, or any other type of device configured to capture images.
  • the camera 204 may include one or more cameras
  • the camera 204 may be configured to capture images of an object or user interacting with an application.
  • the camera 204 may be configured to capture images of a user or person physically gesticulating in free-space (e.g. the air surrounding the user), or otherwise interacting with an application within the field of view of the camera 204.
  • the camera 204 may be a stereo camera, a time-of-flight camera, or any other camera.
  • the camera 204 may be an image detector capable of sampling a background image in order to detect motions and, similarly, gestures of a user.
  • the camera 204 may produce a grayscale image, color image, or a distance image, such as a stereo camera or time-of-flight camera capable of generating a distance image.
  • a stereo camera may include two image sensors that acquire images at slightly different viewpoints, where a processor compares the images acquired from different viewpoints to calculate the distance of parts of the images.
  • a time-of-flight camera may include an emitter that generates a pulse of light, which may be infrared light, where the time the pulse of light travels from the emitter to an object and back to a sensor is measured to calculate the distance of parts of the images.
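The stereo principle mentioned above, comparing images acquired from slightly different viewpoints to calculate distance, reduces to the standard disparity relation. A minimal sketch, with assumed names and illustrative parameters:

```python
# Hedged sketch of the stereo-depth principle described above: the pixel
# offset (disparity) between matching points in the two viewpoints yields
# depth via depth = focal_length x baseline / disparity.
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in meters from disparity in pixels, focal length in pixels,
    and the baseline between the two image sensors in meters."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px
```

Nearer objects produce larger disparities, so depth falls off as the inverse of the measured pixel offset.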
  • the device 200 is electrically connected to, and in operable communication with, over a wireline or wireless pathway, the camera 204 and the user interface 201, and is configured to control the operation of the processor 205 to provide for the enhanced control.
  • the device 200 uses the processor 205 or other control circuitry to execute an application that provides for enhanced camera-based input.
  • although the camera 204 may be a separate unit (such as a webcam) that communicates with the device 200, in other implementations the camera 204 is built into the device 200, and communicates with other components of the device 200 (such as the processor 205) via an internal bus.
  • although the device 200 has been described as a personal computer (PC) or set top box, such a description is made merely for the sake of brevity, and other implementations or manifestations are also contemplated.
  • the device 200 may be implemented as a television, an ultra-mobile personal computer (UMPC), a mobile internet device (MID), a digital picture frame (DPF), a portable media player (PMP), a general-purpose computer (e.g., a desktop computer, a workstation, or a laptop computer), a server, a gaming device or console, or any other type of electronic device that includes a processor or other control circuitry configured to execute instructions, or any other apparatus that includes a user interface.
  • input occurs by using a camera to detect images of a user performing gestures
  • a mobile phone can be placed on a table and may be operable to generate images of a user using a face-forward camera
  • the gesture may be recognized or detected using the tilt sensor 209, such as by detecting a "tilt left” gesture to move a representation left and to pan an image left or rotate an image counter-clockwise, or by detecting a "tilt forward and right” gesture to move a representation up and to the right of a neutral position, to zoom in and pan an image to the right
  • the tilt sensor 209 may thus be any type of module operable to detect an angular position of the device 200, such as a gyroscope, accelerometer, or a camera-based optical flow tracker.
  • image-based input may be supplemented with or replaced by tilt-sensor input to perform functions or commands desired by a user
  • detection of a user's gesture may occur without using a camera, or without detecting the user within the images
  • the user is enabled to control the same interface or application in a straightforward manner
  • FIG 3 is a flowchart illustrating a computer-implemented process 300 that effects image manipulation using recognized gestures
  • the computer-implemented process 300 includes recognizing, from first and second images, a user's gesture, determining an interaction command corresponding to the recognized user's gesture, and manipulating, based on the determined interaction command, an image object displayed in a user interface.
  • a user's gesture is recognized from first and second images (S302)
  • the first and second images may be derived from individual image snapshots or from a sequence of images that make up a video sequence. Each image captures position information that allows an application to determine a pose or gesture of a user.
  • a gesture is intended to refer to a movement, position, pose, or posture that expresses an idea, opinion, emotion, communication, command, demonstration or expression
  • the user's gesture may be a single or multiple finger gesture, a single hand gesture, a single hand and arm gesture, a single hand and arm, and body gesture, a bimanual gesture, a head pose or posture, an eye position, a facial expression a body pose or posture, or any other expressive body state
  • the body part or parts used to perform relevant gestures are generally referred to as a “control object.”
  • the user's gesture in a single image or between two images may be expressive of an enabling or "engagement" gesture
  • the gesture of "drawing a circle in the air” or “swiping the hand off to one side” may be detected by a gesture analysis and detection process using the hand, arm, body, head or other object position information
  • the gesture may involve a two- or three-dimensional position displacement, such as when a swiping gesture is made
  • the gesture includes a transformation without a concomitant position displacement
  • the gesture of the user changes if all five fingers are retracted into a ball with the palm remaining forward, even if the overall position of the hand or arm remains static
  • Gestures may be detected using heuristic techniques, such as by determining whether the hand position information passes explicit sets of rules. For example, the gesture of “swiping the hand off to one side” may be identified if the following gesture detection rules are satisfied: (1) the change in horizontal position is greater than a predefined distance over a time span that is less than a predefined limit; (2) the horizontal position changes monotonically over that time span; (3) the change in vertical position is less than a predefined distance over that time span; and (4) the position at the end of the time span is nearer to (or on) a border of the hand detection region than the position at the start of the time span.
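The four swipe-detection rules above translate directly into code. The sketch below assumes a track of timestamped hand positions in a normalized detection region; all threshold values are illustrative, not taken from the patent:

```python
# Hedged sketch of the four "swipe" detection rules above. `track` is a
# list of (time_s, x, y) samples in a hand-detection region of width 1.0.
def is_swipe(track, min_dx=0.3, max_dt=0.5, max_dy=0.1, region_width=1.0):
    (t0, x0, y0), (t1, x1, y1) = track[0], track[-1]
    xs = [x for _, x, _ in track]
    monotonic = (all(b >= a for a, b in zip(xs, xs[1:])) or
                 all(b <= a for a, b in zip(xs, xs[1:])))

    def border_dist(x):  # distance to the nearest region border
        return min(x, region_width - x)

    return (abs(x1 - x0) > min_dx and (t1 - t0) < max_dt  # rule (1)
            and monotonic                                  # rule (2)
            and abs(y1 - y0) < max_dy                      # rule (3)
            and border_dist(x1) < border_dist(x0))         # rule (4)
```

A fast, mostly horizontal track ending near the region's edge satisfies all four rules; a slow drift of the same length fails rule (1).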
  • Some gestures utilize multiple rule sets that are executed and satisfied in an explicit order, where the satisfaction of a rule set causes a system to change to a state where a different rule set is applied. This system may be unable to detect subtle gestures, in which case Hidden Markov Models may be used, as these models allow for chains of specific motions to be detected, but also consider the overall probability that the motions sufficiently fit a gesture.
  • Criteria may be used to filter out irrelevant or unintentional candidate gestures
  • a plane may be defined at a predetermined distance in front of a camera, where gestures that are made or performed on the far side of the plane from the camera are ignored, while gestures or potential gestures that are performed between the camera and the plane are monitored, identified, recognized, filtered, and processed as appropriate
  • the plane may also be defined relative to another point, position or object, such as relative to the user's torso.
  • the enhanced approach described herein may use a background filtering model to remove background images or objects in motion that do not make up the control object
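The plane-based filtering described in the preceding bullets can be sketched as a simple depth cutoff; the function name and units are assumptions for illustration:

```python
# Hypothetical sketch of the depth-plane filter described above: candidate
# control points at or beyond a plane defined in front of the user's torso
# are ignored; only points between the camera and the plane are kept.
def filter_by_plane(points, plane_depth_m):
    """Keep only (x, y, depth_m) points closer to the camera than the plane."""
    return [p for p in points if p[2] < plane_depth_m]
```

With the plane placed just beyond the user's outstretched hand, torso and sofa motion falls behind the cutoff and never reaches the gesture recognizer.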
  • in addition to recognizing gestures or changes in gestures, other information may also be determined from the images.
  • a facial detection and recognition process may be performed on the images to detect the presence and identity of users within the image
  • Identity information may be used, for example, to determine or select available options, types of available interactions, or to determine which of many users within an image is to be designated as a controlling user if more than one user is attempting to engage the input functionality
  • the process for recognizing the user's gesture may further include recognizing a first displacement in a first direction, and recognizing a second displacement in a second direction, and aggregating these multiple displacements as a single gesture. Furthermore, the recognition of the user's gesture may determine a magnitude and direction of the user's gesture.
  • An engagement gesture activates or invokes functionality that monitors other images for gesture-based command inputs, and ignores random or background body motions
  • the engagement gesture is a transition from a first hand pose in which the hand is held in an upright position with the palm forward and with all fingers and thumb spread apart widely to a second hand pose in which the hand is held in a closed fist
  • FIGS 4A-4B illustrate an exemplary engagement gesture and a user interface that results from the engagement gesture
  • two images of the user 401 captured by the camera 402 capture the user's hand gesticulating from an open hand pose 405 with palm forward and fingers spread wide (as illustrated in FIG 4A) to a closed-fist hand pose 406 (as illustrated in FIG 4B)
  • the performance of this gesture by the user causes the image object 410 to be highlighted within the user interface to denote selection of the image object 410
  • the image object 410 is highlighted using a double border 408 that appears around the image object 410, designating the image object 410 as a selected image object.
  • the user, in effect, is virtually “grabbing” the image object 410 in free space to select it. Selected image objects may be manipulated by other recognized gestures, such as the movement gesture discussed above with respect to FIG 1.
  • finger pointing gestures can be recognized from one or more images
  • a “point left” gesture can be made with the tip of a user's finger and detected by analyzing an image of a finger. Fingerprint analysis or other approaches can be used to determine the direction of a pointing fingertip.
  • a gesture can be detected without using a camera, such as where the gesture is a verbal gesture or is detected using a tilt sensor or accelerometer
  • an interaction command corresponding to the recognized user's gesture is determined (S304)
  • Image interaction commands may be mapped to a user's gestures
  • the movement gesture discussed above with respect to FIG 1 may be mapped to an image movement command
  • Other examples include a hand rotation gesture mapped to an image rotation command, and hand movement along an axis that is perpendicular to the plane defined by the user interface (the "Z axis") mapped to an image sizing, zooming or magnification command
  • interaction commands may result in a manipulation direction mirroring a direction of the user's gesture
  • a user's right-to-left movement gesture may result in an image object being moved from left to right
  • an image object displayed in a user interface is manipulated (S306), thereby ending the process 300 (S307)
  • an image object may be moved to a location corresponding to a user's arm movement along a plane parallel to a display screen
  • an image object may be sized corresponding to a determined direction and magnitude of a user's arm movement
  • FIGS 5A-5B illustrate an exemplary zoom-in gesture, in which a user 501 gesticulates his hand backward towards his body from a first position 502 to a second position 503, thereby causing the selected image object 504 in the display 506 to be displayed in a larger size (i.e., as image object 508, as shown in FIG 5B).
  • a zoom distance may correspond to a hand's Z-position
  • a movement of the hand farther away from the camera 510 may be interpreted as a “zoom-in” command (i.e., the user is “pulling the image object closer”).
  • FIGS 6A-6B illustrate an exemplary “zoom-out” gesture, in which a user 601 gesticulates his hand forward away from his body from a first position 602 to a second position 603, thereby causing a selected image object 604 in the display 606 to be displayed in a smaller size (i.e., as image object 608, as shown in FIG 6B).
  • a movement of the hand closer to the camera 610 may be interpreted as a “zoom-out” command (i.e., the user is “pushing the image object away”).
  • the selected image object may continue to zoom in or out at a velocity proportional to the velocity of the hand movement at the time the hand was opened. The zoom velocity may gradually decrease over time after the hand is opened.
  • the velocity of the hand motions or other gestures are determined, modeled, and applied to the image objects as if the image objects had mass or momentum
  • a quick left wave gesture might move an image object to the left a further distance than a slow left wave gesture
  • the image objects may react as if the user interface were affected by frictional, drag, gravitational, or other forces, such that a “shove” gesture may result in the image object initially zooming out at a quick pace, then slowing as the time since the application of the virtual shove elapses.
  • the image object may continue to travel in the first direction until the "virtual" momentum assigned to the image object is overcome by the "virtual" momentum assigned to the gesture
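The "virtual momentum" behavior above can be sketched as a per-frame velocity that decays under a friction factor once the hand opens. This is a minimal illustrative model, not the patent's implementation; the function name, friction factor, and cutoff are assumptions.

```python
# Illustrative sketch: after the gesture is released, the image keeps
# zooming (or moving) at the release velocity, which decays each frame
# under a friction-like factor until it falls below a cutoff.

def coast(release_velocity, friction=0.9, cutoff=0.01):
    """Yield per-frame velocities after the gesture is released."""
    v = release_velocity
    while abs(v) > cutoff:
        yield v
        v *= friction  # exponential decay models drag/friction

velocities = list(coast(1.0))
# the velocity decreases monotonically and eventually stops
```

A larger friction factor makes the image "coast" longer after a quick shove, matching the described mass-and-momentum feel.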
  • FIGS 7A-7E illustrate an exemplary "repositioning" gesture
  • a user 701 has his hand pulled back into his body at a position 702. If the user desires to zoom a selected image object in closer, he may not, with his hand in the position 702, be able to move his hand any further towards his body. The user 701 may "release" his hand (thereby canceling or suspending the zoom command) by opening his hand and spreading his fingers and thumb wide, as shown in a position 704
  • the user 701 may then move his hand forward to a position 706, close his hand (as shown in position 708) to re-engage (i.e., reselect an image object), and finally, pull his hand backward towards his body to a position 710, thereby causing the selected image object to zoom to a larger size
  • a similar "repositioning" gesture may be used to repeat other commands
  • the user 701 may, when his hand is fully extended forward, open his hand, pull his hand backward, re-engage by closing his hand, and push his hand forward to resize an image object to a smaller size (i.e., zoom out farther)
  • Similar pose sequences may be used to repeat movement, rotation, and other gesture commands
  • Poses used for repositioning gestures may be filtered. That is, the poses used strictly to reposition may not result in the manipulation of an object
  • FIGS 8A-8B illustrate an exemplary interaction that occurs in a "velocity" mode
  • a hand movement by a user 801 in an X-Y direction along a plane parallel to a display 802 may cause a movement of an image object proportional to the hand movement distance
  • Such movement may be referred to as a "distance" model
  • a distance model may be effective if there is a short magnification range or a limited number of zoom states
  • a distance model may not be as effective, however, if the displayed image object(s) support a large magnification range, such as a map which supports many zoom levels
  • to support a large magnification range, a "velocity" model may be used
  • in a "velocity" model, a user gestures and then holds a pose. The "velocity" model allows a command to be repeated indefinitely without releasing the hand gesture (i.e., in contrast to the "distance" model, where a release of the hand gesture is required, for example, to zoom beyond a certain distance)
  • the user 801 gesticulates his hand upward from a first position 804 to a second position 806, thereby causing a selected image object to move from a position 808 to a position 810
  • the selected image object continues to move in the direction indicated by the gesture (i.e., upward in this case), to positions 812, 814, etc.
  • the hand's X-Y position may be sampled and saved as a reference position when the user 801 closes his hand
  • the selected image object may move at a velocity proportional to the X-Y distance between the user's hand and the reference position (i.e., the selected image object may move faster as the user moves his hand farther away from the reference position)
  • the selected image object may continue to zoom in/out at the current velocity if the user stops moving and maintains the engagement hand pose
  • the mapping of relative distance to velocity may include a "dead zone,” whereby the velocity may be zero if the relative distance is less than a dead zone distance, so that a user may stop the movement by returning the hand to near (but not necessarily exactly to) the reference position
  • the mapping of relative distance to velocity may be non-linear, such that a change in position near the reference position may result in a change of velocity of small magnitude, while a change in position further from the reference position may result in a change of velocity of larger magnitude
  • Nonlinear mapping may allow a user fine control of low velocities, and coarser control of high velocities
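The dead-zone and non-linear mapping described in the preceding bullets can be sketched as a single function. This is an illustrative sketch only: the dead-zone width, gain, and the choice of a quadratic curve are assumptions, not values from the patent.

```python
import math

def velocity_from_offset(offset, dead_zone=0.05, gain=4.0):
    """Map the hand's displacement from the reference position to an
    object velocity: zero inside the dead zone, then a quadratic ramp
    so that small offsets give fine control over low velocities and
    large offsets give coarser control over high velocities."""
    magnitude = abs(offset)
    if magnitude < dead_zone:
        return 0.0  # returning near the reference position stops motion
    # quadratic mapping beyond the dead zone (non-linear, as described)
    return math.copysign(gain * (magnitude - dead_zone) ** 2, offset)
```

With this shape, a change in hand position near the reference yields only a small velocity change, while the same change far from the reference yields a much larger one.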
  • the velocity may return to zero if the user returns his hand position to within the dead zone, if the user changes the hand pose to palm forward and fingers and thumb spread, if the hand goes outside the field of view of a camera 816, if the user retracts his hand fully towards his body and drops his arm to his side, or if another event occurs
  • the velocity may return to zero by gradually diminishing over a short period of time
  • a velocity model may be used with other gestures
  • a velocity model may be used for image zoom and for image rotation (see FIG 12 below)
  • Different gestures may use different models
  • image movement may use a distance model while image zooming may use a velocity model
  • a user may be able to switch between different models for a particular gesture
  • a change model gesture may be defined which may toggle an image zoom model between velocity and distance models, allowing a user to select the most effective model
  • FIGS 9A-9B illustrate an exemplary gesture combination, in which two or more independent image manipulations occur via a single gesture
  • a user moves his hand in both a Z-direction and in an X-Y direction
  • different approaches may be used to determine whether an image zoom command, image movement command, or both commands should be performed
  • either movement or zooming may be selected based upon whichever distance is larger: movement in the Z direction or movement in the X-Y plane
  • either movement or zooming may be selected based upon whichever distance, movement in the Z direction or movement in the X-Y plane, passes a threshold distance first after an engagement hand pose is detected (i.e., the type of command may be locked according to which gesture the user does first)
  • in a third approach, multiple commands are performed. For example, movement and zooming may occur at the same time if a user moves his hand in the Z direction in addition to moving his hand in the X-Y plane
  • a user 901 gesticulates his arm upward and also backward towards his body from a first position 902 to a second position 904, thereby causing a selected image object 906 to both zoom and move within the display 908, as illustrated by a larger image object 910 located more towards the top of the display 908 than the image object 906
  • a command may be classified as a major command based upon the criteria described above (either whichever distance is larger, or whichever passes a threshold distance first), and other commands classified as minor commands
  • the "dead zone" (i.e., the minimum distance the user's hand must move from a reference position before causing a non-zero displacement or velocity of the on-screen object) of minor commands may be enlarged in the presence of multiple commands, so that unintentional movement of the hand in some directions is ignored while performing a gesture in other directions
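The "whichever passes a threshold first" locking rule above can be sketched as a scan over displacement samples. The function name, sample shape, and threshold value are illustrative assumptions; in this sketch, if both axes pass the threshold in the same sample, the Z axis wins the tie.

```python
# Illustrative sketch: displacement samples (Z, X-Y) arrive in time order;
# the command type locks to the first axis whose magnitude exceeds the
# threshold, classifying it as the "major" command.

def lock_major_command(samples, threshold=0.1):
    """Return 'zoom' or 'move' for whichever axis passes the threshold
    first, or None if neither ever does."""
    for dz, dxy in samples:
        if abs(dz) >= threshold:
            return "zoom"   # Z displacement locked first
        if abs(dxy) >= threshold:
            return "move"   # X-Y displacement locked first
    return None
```

Once the major command is locked, the other axis would be treated as a minor command with an enlarged dead zone, as the bullet above describes.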
  • FIGS 10A-10B illustrate an exemplary image rotation gesture, in which a user gesticulates his hand by rotating an open hand from a first vertical position 1002 to a second, angled position 1004, thereby causing a selected image object 1006 to rotate in the display 1008 (as illustrated by the rotated image object 1010) by the same angle as the detected rotation angle of the hand
  • a dead zone may be defined so that minor movement of the hand does not unintentionally rotate a selected image object
  • a reference angle may be defined which may allow a user to rotate an image object 180 degrees without twisting his arm
  • a rotation start position may be defined such that a user starts rotating clockwise by first placing his hand at negative 90 degrees (counterclockwise) relative to a vertical position (i.e., with the fingers of the hand pointing leftward). From this start position, a user may rotate a full 180 degrees clockwise (i.e., with the hand ending in a position with the fingers pointing rightward).
  • FIGS 11A-11B illustrate an exemplary image rotation gesture with a "snap" mode
  • an image rotation may, for example, "snap” to a reference angle of 0 degrees clockwise, 90 degrees clockwise, 90 degrees counterclockwise, or 180 degrees, depending upon which reference angle is closest to the hand angle of rotation.
  • a user's hand gesticulates from a first vertical position 1102 to a second position 1104, where the second position 1104 is angled 50 degrees relative to the first position 1102.
  • the hand rotation causes a selected image object 1106 to rotate to a reference angle of 90 degrees clockwise (i.e., 90 degrees rather than 50, due to the snap mode) in the display 1108, as illustrated by the image object 1110.
  • the selected image object may rotate in the display 1108 as the hand is being rotated, and may snap to the nearest reference angle when the gesture is released
  • a basis angle may be defined, and reference angles may be defined as the basis angle, the basis angle plus 90 degrees clockwise, the basis angle plus 90 degrees counterclockwise, and the basis angle plus 180 degrees
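The snap rule above (basis angle plus multiples of 90 degrees) amounts to rounding the hand angle to the nearest reference angle. A minimal sketch, with an illustrative function name:

```python
# Illustrative sketch of "snap" mode: round a rotation to the nearest of
# basis, basis + 90 CW, basis + 90 CCW, or basis + 180 degrees.

def snap_angle(angle_deg, basis_deg=0.0):
    """Snap a rotation (degrees, clockwise positive) to the nearest
    reference angle: the basis angle plus a multiple of 90 degrees."""
    offset = (angle_deg - basis_deg) % 360.0
    snapped = (round(offset / 90.0) % 4) * 90.0
    return (basis_deg + snapped) % 360.0
```

This reproduces the FIG 11 example: a 50-degree hand rotation snaps to the 90-degree reference angle.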
  • FIGS. 12A-12B illustrate an exemplary image rotation with velocity mode.
  • an image manipulation command is repeated while a user holds a pose.
  • a user's hand gesticulates from a first vertical position 1202 to a second angled position 1204, thereby causing a selected image object to rotate clockwise in the display 1206 from a first position 1208 to a second position 1210.
  • the selected image object continues to rotate in a clockwise direction, possibly passing through positions 1212 and 1214
  • the selected image object may rotate in the display 1206 as the pose is held
  • the image rotation may stop when the user releases the pose
  • the velocity of rotation may be proportional to the hand angle of rotation (i.e., the selected image object may rotate faster as hand rotation angles increase)
  • the selected image object may "snap" to reference rotation angles (e.g., 0 degrees, 90 degrees clockwise, 90 degrees counterclockwise, or 180 degrees), depending on which reference angle the image object is nearest to when the user terminates the gesture
  • FIGS 13A-13B illustrate an exemplary show-next-image gesture
  • a "current" image object 1302 may be scaled to fit a large center part of a display 1304
  • One or more "previous” 1306 and “next” 1308 image objects may be shown on the left or right side of the current image object 1302, and may be rendered smaller than the current image object 1302
  • a user 1310 gesticulates his hand upward from a first position 1312 to a second position 1314, thereby causing the current image object 1302 to be replaced by a new "current" image object
  • an animation may be accomplished as follows: the current image object 1302 may move leftward, be displayed in a smaller size, and become a new "previous" image object 1316; the "next" image object 1308 may move leftward, be displayed in a larger size, and become a new "current" image object 1318; the "previous" image object 1306 may be removed from the display 1304; and a new "next" image object 1320 may replace the old "next" image object 1308. A downward gesture may cause the animation to occur in the opposite direction
  • FIGS 13A and 13B illustrate show-next-image and show-previous-image interactions
  • use of such gestures is purely exemplary and has been chosen, inter alia, for ease of illustration
  • left and right arm movements are used to input a show-previous-image or a show-next-image command, respectively
  • Left and right arm movements may be instinctual for a user to perform, since the previous image object and the next image object are displayed to the left and right of a current image, respectively
  • the "current", "next" and "previous" image objects may be displayed in the same size (e.g., in a "filmstrip")
  • a filmstrip may or may not include the current image object (i.e., a filmstrip may be shown below, above, or to the left or right side of the current image object)
  • a filmstrip may include a thumbnail sized representation of the current image object, and one or more previous and next image objects
  • the filmstrip may be animated to appear to scroll so that the current image object is always centered. If the current image object is not shown on the filmstrip, the boundary between the first next and first previous image objects may be centered
  • An up/down gesture may flip through lists of image objects (e.g., categories of image objects), and a left/right gesture may flip through image objects within a list
  • a left/right gesture may flip through image object lists
  • an up/down gesture may flip through image objects within a list
  • FIGS 14A-14C illustrate thumbnail states
  • a grid of thumbnails may be displayed, for example, in a rectangular grid (e.g., grid 1402)
  • the grid may also appear, for example, as a cylindrical shape or as a spherical shape (e.g., grid 1404)
  • An onscreen indication may indicate to the user which image object is the current image object
  • the current image object may be displayed using a highlighted effect, such as a border, shadow, or glow
  • a current image object 1406 is displayed with a darkened double border in the grid 1402
  • the current image object may be displayed as the same size, or at a slightly larger size than other image objects
  • Image objects surrounding the current image object may be displayed at a size depending on the distance from the current thumbnail (i.e., image objects located farther away from the current image object may appear smaller than image objects located closer to the current image object)
  • Image objects may be displayed in a size and shape to appear as a "bulge" (e.g., image object 1408 in grid 1410)
  • Image objects may appear to wrap around a cylinder or sphere (e.g., image object 1412 in grid 1404)
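The distance-dependent sizing behind the "bulge" effect can be sketched as a scale function over grid cells. The falloff rate, minimum scale, and use of Chebyshev distance are illustrative assumptions, not details from the patent.

```python
# Illustrative sketch: the current thumbnail is largest, and neighbors
# shrink linearly with their grid distance from it, down to a floor.

def thumbnail_scale(cell, current, falloff=0.15, min_scale=0.5):
    """Scale factor for the thumbnail at grid cell (col, row), based on
    its Chebyshev distance from the current image object's cell."""
    d = max(abs(cell[0] - current[0]), abs(cell[1] - current[1]))
    return max(min_scale, 1.0 - falloff * d)
```

Rendering each thumbnail at this scale around the current cell produces the bulge shown for image object 1408 in grid 1410.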
  • a movement gesture may cause a selection of a current image object within a grid
  • a grid may remain stationary, or a grid may pan so that the current image object is always in the center of the screen
  • the image objects may be laid out on a flat, cylindrical, or spherical surface in a spiral, so that a flat grid may wrap, or a cylinder or spherical grid may spin such that if the user pans left he will eventually reach the image object directly underneath the image object that he started on
  • the number of image objects on a cylinder or sphere may or may not match its circumference, and a user may spin a cylinder or sphere several revolutions to return to the starting thumbnail
  • a cylinder may be rendered to have twelve thumbnails around its circumference, where some of the thumbnails may be hidden on the back side. Thumbnails may be populated with image objects based on how many image objects are in an image object list. If there are fewer image objects in a list than visible thumbnails, an image object may appear on multiple thumbnails
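The slot-population rule above (a short list repeats across multiple thumbnails) is a simple modulo wrap. A sketch, with illustrative names and a default of twelve circumference slots:

```python
# Illustrative sketch: assign one image per circumference slot, wrapping
# around the image list so a short list repeats on multiple thumbnails.

def populate_cylinder(images, slots=12):
    """Return the image shown in each of the cylinder's thumbnail slots."""
    if not images:
        return []
    return [images[i % len(images)] for i in range(slots)]
```

With more images than slots, only the first `slots` images are visible at once until the user spins the cylinder.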
  • a thumbnail grid may be referred to as a first thumbnail state. While viewing a thumbnail grid, the user may zoom out to reveal a list (e.g., a category or other list) thumbnail state
  • a list thumbnail state may be referred to as a second thumbnail state
  • Each thumbnail within a second thumbnail state may be displayed as flat, cylindrical, or spherical structures of the first thumbnail state
  • a transition between the first and second thumbnail states may be displayed as an animation similar in style to "zooming in/out of a molecule." That is, a second thumbnail state may be described as being similar to viewing the chemical structure of a molecule, and a first thumbnail state may be described as being similar to zooming into a molecule to view its protons and electrons.
  • a second thumbnail state may be described as being similar to viewing a collection of stars in a solar system while a first thumbnail state may be described as being similar to zooming into a star to view its planets.
  • a zoom gesture may optionally "snap" to a state. For example, if the user terminates a zoom gesture when an image object is close to fitting to the screen, the system may animate the image object movement to reach the fit-to-screen state.
  • a two handed gesture may be used to manipulate an image object.
  • a user may point with two hands, such as pointing to the corners of an image object.
  • To rotate an image object the user may move his hands as if to trace out points on opposite sides of a circular path.
  • the angle between the x-y components of the hand positions is measured. It does not matter which sides or corners the user initially points to, as the rotation will be relative to the initial hand positions.
  • a reference angle is recorded after a short delay (e.g., 500 milliseconds), to allow the user time to raise the second hand into position.
  • After detecting both hands, or after a "hover" is detected (i.e., after a short period of time (e.g., 500 milliseconds) has elapsed with both hand positions moving less than a threshold distance), the image object is rotated by the current angle between the hand positions relative to the reference angle. The image object may snap to the nearest 90 degrees when both hands are no longer detected.
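The two-handed rotation measurement above can be sketched as an angle difference: the image rotation is the angle of the line between the hands relative to the reference angle recorded after the short delay. Function names are illustrative assumptions.

```python
import math

def hand_angle(a, b):
    """Angle in degrees of the line through hand positions a and b
    (each an (x, y) tuple), measured from the positive x-axis."""
    return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))

def two_hand_rotation(ref_a, ref_b, a, b):
    """Image rotation = current hand angle minus the reference angle.
    Which sides or corners the user initially points to does not matter,
    since only the change relative to the reference is used."""
    return hand_angle(a, b) - hand_angle(ref_a, ref_b)
```

Tracing opposite points of a circular path with both hands changes only this relative angle, which is then applied to the image object.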
  • the user may move both hands closer or further apart.
  • the distance between the x-y components of the hand positions is measured.
  • a reference distance is recorded after a short delay (e.g., 500 milliseconds), to allow the user time to raise the second hand into position.
  • the distance relative to the reference distance may be mapped to a scale, or to a velocity, and the image object may be zoomed accordingly.
  • the user may move both hands in the same direction, keeping the distance and angle between the hands constant
  • the average of the X-Y components of the hand positions is measured
  • a reference position is recorded after a short delay (e.g., 500 milliseconds) to allow the user time to raise the second hand into position
  • the average position relative to the reference position may be mapped to a position within an image object or to a velocity, and the image object may be moved accordingly
  • the enabling gesture may result in the control object (i.e., hand) of the user's body being disposed in an awkward position
  • the user's hand may be near the boundary of or outside of a camera's field of view
  • a process may occur to orient or align the user's control object with a target position (in free space) that eases or improves future gesture recognitions
  • the target position in free space may be predefined, such as a center position of the camera's field of view, or the target position may be dynamically determined, for instance in a well-lit position, or an area of high contrast or without a complex background, or in a region of the field of view away from other moving objects
  • One approach for aligning the user's position for improved gesture recognition is to display the representation and a target image on the user interface, and to prompt or guide the user to move the representation, through motion of their body, so that the representation aligns with the target image
  • the representation of the user may initially be displayed outside a central region of the user interface, and a target image may be displayed in the central region
  • a realignment gesture may be recognized, and the representation may be moved in relation to the target image based on the realignment gesture. If the moved representation aligns with the target image, the representation will be displayed in the central region. Realignment may assure that the user's hand remains in the camera's field of view, or may also assure that the user has enough reach of the arm to perform the gestures in one or more directions
  • FIG 15 illustrates an example of the exterior appearance of a computing device 1501 that further includes a processor and a user interface
  • a device includes a user interface and a processor
  • the user interface is configured to display one or more image objects
  • the processor is configured to recognize, from first and second images, a user's gesture, to determine an interaction command corresponding to the recognized user's gesture, and to manipulate, based on the determined interaction command, an image object displayed in a user interface
  • the hardware environment of the computing device 1501 includes a display monitor 1508 for displaying text and images to interface with a user, a keyboard 1509 for entering text data and user commands into the computing device 1501, a mouse 1510 for pointing, selecting and manipulating objects displayed on the display monitor 1508, a fixed disk drive 1511, a removable disk drive 1512, a tape drive 1514, a hardcopy output device, a computer network connection, and a digital input device 1517
  • the display monitor 1508 displays the graphics, images, and text that make up the user interface for the software applications used by the computing device 1501, as well as the operating system programs necessary to operate the computing device 1501
  • a user uses the keyboard 1509 to enter commands and data to operate and control the computer operating system programs as well as the application programs
  • the mouse 1510 may be any type of pointing device, and may be a joystick, a trackball, a touch-pad, or other pointing device
  • Software used to display a user interface and enable a user to enter or select text, numbers, or select from a menu of options is stored locally on computer readable memory media, such as the fixed disk drive 1511
  • the fixed disk drive 1511 itself may include a number of physical drive units, such as a redundant array of independent disks (“RAID”), or may be a disk drive farm or a disk array that is physically located in a separate computing unit.
  • Such computer readable memory media allow the computing device 1501 to access computer-executable process steps, application programs and the like, stored on removable and non-removable memory media.
  • the computer network connection may be a modem connection, a local-area network (“LAN”) connection including the Ethernet, or a broadband wide-area network (“WAN”) connection such as a digital subscriber line (“DSL”), cable high-speed internet connection, a broadband over power line connection, dial-up connection, T- 1 line, T-3 line, fiber optic connection, or satellite connection.
  • the network 1306 may be a LAN network, a corporate or government WAN network, the Internet, or other network.
  • the computer network connection may be a wireline or wireless connector.
  • Example wireless connectors include, for example, an INFRARED DATA ASSOCIATION ® (“IrDA ® ”) wireless connector, an optical wireless connector, an INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS ® (“IEEE ® ”) Standard 802.11 wireless connector, a BLUETOOTH ® wireless connector, an orthogonal frequency division multiplexing (“OFDM”) ultra wide band (“UWB”) wireless connector, a time-modulated ultra wide band (“TM-UWB”) wireless connector, or other wireless connector.
  • Example wired connectors include, for example, a IEEE ® -1394 FIREWIRE ® connector, a Universal Serial Bus (“USB”) connector, a serial port connector, a parallel port connector, or other wireline connector.
  • the removable disk drive 1512 is a removable storage device that is used to off-load data from the computing device 1501 or upload data onto the computing device 1501.
  • the removable disk drive 1512 may be a floppy disk drive, an IOMEGA ® ZIP ® drive, a compact disk-read only memory (“CD-ROM”) drive, a CD-Recordable drive (“CD-R”), a CD-Rewritable drive (“CD-RW”), flash memory, a USB flash drive, thumb drive, pen drive, key drive, a High-Density Digital Versatile Disc (“HD-DVD”) optical disc drive, a Blu-Ray optical disc drive, a Holographic Digital Data Storage (“HDDS”) optical disc drive, or any one of the various recordable or rewritable digital versatile disc (“DVD”) drives such as the DVD-Recordable (“DVD-R” or “DVD+R”), DVD-Rewritable (“DVD-RW” or “DVD+RW”), or DVD-RAM.
  • the tape drive 1514 is a tape storage device that is used to off-load data from the computing device 1501 or to upload data onto the computing device 1501.
  • the tape drive 1514 may be a quarter-inch cartridge (“QIC"), 4 mm digital audio tape (“DAT”), 8 mm digital linear tape (“DLT”) drive, or other type of tape.
  • the computing device 1501 is described above as a desktop PC, in further implementations the computing device 1501 may be a laptop, a workstation, a midrange computer, a mainframe, an embedded system, telephone, a handheld or tablet computer, a PDA, a gaming device or console, a digital picture frame, a teleconferencing device, or other type of computer.
  • FIG. 16 is a block diagram illustrating the internal architecture of a computer shown in FIG. 15.
  • the computing environment includes a computer central processing unit (“CPU") 1601 , where the computer instructions that make up an operating system or an application are processed; a display interface 1602 which provides a communication interface and processing functions for rendering graphics, images, and texts on the display monitor 1508; a keyboard interface 1604 which provides a communication interface to the keyboard 1509; a pointing device interface 1605 which provides a communication interface to the mouse 1510 or an equivalent pointing device; a digital input interface 1606 which provides a communication interface to the digital input device 1517; a hardcopy output device interface which provides a communication interface to the hardcopy output device; a random access memory (“RAM”) 1610 where computer instructions and data are stored in a volatile memory device for processing by the computer CPU 1601 ; a read-only memory (“ROM”) 1611 where invariant low-level systems code or data for basic system functions such as basic input and output
  • a computer program product is tangibly embodied or recorded in a machine-readable medium such as storage 1620
  • the computer program product includes instructions that, when read by a machine, operate to cause data processing apparatus to recognize, from first and second images, a user's gesture, to determine an interaction command corresponding to the recognized user's gesture, and to manipulate, based on the determined interaction command, an image object displayed in a user interface
  • the RAM 1610 interfaces with the computer bus 1627 so as to provide quick RAM storage to the computer CPU 1601 during the execution of software programs such as the operating system, application programs, and device drivers. More specifically, the computer CPU 1601 loads computer-executable process steps from the fixed disk drive 1511 or other memory media into a field of the RAM 1610 in order to execute software programs. Data is stored in the RAM 1610, where the data is accessed by the computer CPU 1601 during execution
  • the computing device 1501 stores computer-executable code for an operating system 1621, and application programs 1622 such as word processing, spreadsheet, presentation, gaming, or other applications. It is possible to implement the functions according to the present disclosure as a dynamic link library ("DLL"), or as a plug-in to other application programs such as an Internet web browser such as the MICROSOFT ® Internet Explorer web browser
  • the computer CPU 1601 is one of a number of high-performance computer processors, including an INTEL ® or AMD ® processor, a POWERPC ® processor, a MIPS ® reduced instruction set computer ("RISC") processor, a SPARC ® processor, an ACORN ® RISC Machine (“ARM ® ”) architecture processor, a HP ALPHASERVER ® processor or a proprietary computer processor for a mainframe
  • the computer CPU 1601 is more than one processing unit, including a multiple CPU configuration found in high-performance workstations and servers, or a multiple scalable processing unit found in mainframes
  • the operating system 1621 may be MICROSOFT ® WINDOWS NT ® /WINDOWS ® 2000/WINDOWS ® XP Workstation, WINDOWS NT ® /WINDOWS ® 2000/WINDOWS ® XP Server, a variety of UNIX ® -flavored operating systems, including AIX ® for IBM ® workstations and servers, SUNOS ® for SUN® workstations and servers, LINUX ® for INTEL ® CPU-based workstations and servers, HP UX WORKLOAD MANAGER ® for HP ® workstations and servers, IRIX ® for SGI ® workstations and servers, VAX/VMS for Digital Equipment Corporation computers, OPENVMS ® for HP ALPHASERVER ® -based computers, MAC OS ® X for POWERPC ® based workstations and servers, SYMBIAN OS ® , WINDOWS MOBILE ® or WINDOWS CE ®
  • While FIGS 15 and 16 illustrate one possible implementation of a computing device that executes program code, or program or process steps, configured to provide for an enhanced control that allows a user to intuitively and easily enter text, numbers, or select from a plurality of items, other types of computers or implementations may also be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)
  • Controls And Circuits For Display Device (AREA)
PCT/US2009/035544 2008-03-04 2009-02-27 Enhanced gesture-based image manipulation Ceased WO2009111329A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010549767A JP5855343B2 (ja) 2008-03-04 2009-02-27 改良されたジェスチャに基づく画像操作

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/041,927 US9772689B2 (en) 2008-03-04 2008-03-04 Enhanced gesture-based image manipulation
US12/041,927 2008-03-04

Publications (2)

Publication Number Publication Date
WO2009111329A2 true WO2009111329A2 (en) 2009-09-11
WO2009111329A3 WO2009111329A3 (en) 2010-01-07

Family

ID=41054915

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/035544 Ceased WO2009111329A2 (en) 2008-03-04 2009-02-27 Enhanced gesture-based image manipulation

Country Status (3)

Country Link
US (1) US9772689B2 (en)
JP (2) JP5855343B2 (en)
WO (1) WO2009111329A2 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011118477A (ja) * 2009-11-30 2011-06-16 Sony Corp 情報処理装置、情報処理方法及びそのプログラム
JP2012018620A (ja) * 2010-07-09 2012-01-26 Canon Inc 情報処理装置およびその制御方法
JP2012022458A (ja) * 2010-07-13 2012-02-02 Canon Inc 情報処理装置およびその制御方法
JP2012103982A (ja) * 2010-11-11 2012-05-31 Sony Corp 情報処理装置、立体視表示方法及びプログラム
JP2012103981A (ja) * 2010-11-11 2012-05-31 Sony Corp 情報処理装置、立体視表示方法及びプログラム
JP2013533541A (ja) * 2010-06-10 2013-08-22 マイクロソフト コーポレーション 文字の選択
JP2013232200A (ja) * 2010-09-22 2013-11-14 Nikon Corp 画像表示装置
CN103502912A (zh) * 2011-05-09 2014-01-08 皇家飞利浦有限公司 转动屏幕上的物体
JP2014503084A (ja) * 2010-07-27 2014-02-06 テルコーディア テクノロジーズ インコーポレイテッド 3次元形状のファセット上の関連メディアセグメントの対話型の投影および再生
JP2014510344A (ja) * 2011-03-01 2014-04-24 クゥアルコム・インコーポレイテッド コンテンツを表示するためのシステムおよび方法
JP2015038777A (ja) * 2014-11-12 2015-02-26 セイコーエプソン株式会社 位置検出システム、表示システム及び情報処理システム
US9264693B2 (en) 2011-12-26 2016-02-16 Semiconductor Energy Laboratory Co., Ltd. Motion recognition device
US9465437B2 (en) 2012-02-23 2016-10-11 Intel Corporation Method and apparatus for controlling screen by tracking head of user through camera module, and computer-readable recording medium therefor
CN109324726A (zh) * 2014-06-16 2019-02-12 华为技术有限公司 图标移动方法、装置和电子设备
US11431887B2 (en) 2018-07-24 2022-08-30 Sony Semiconductor Solutions Corporation Information processing device and method for detection of a sound image object

Families Citing this family (383)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7760187B2 (en) 2004-07-30 2010-07-20 Apple Inc. Visual expander
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US9182937B2 (en) 2010-10-01 2015-11-10 Z124 Desktop reveal by moving a logical display stack with gestures
US8943434B2 (en) 2010-10-01 2015-01-27 Z124 Method and apparatus for showing stored window display
US8396565B2 (en) 2003-09-15 2013-03-12 Medtronic, Inc. Automatic therapy adjustments
EP1914636A4 (en) * 2005-07-27 2009-12-23 Mikhail Vasilyevich Belyaev CLIENT SERVER INFORMATION SYSTEM AND METHOD FOR PRESENTING A GRAPHIC USER INTERFACE
US8225231B2 (en) 2005-08-30 2012-07-17 Microsoft Corporation Aggregation of PC settings
US7957809B2 (en) 2005-12-02 2011-06-07 Medtronic, Inc. Closed-loop therapy adjustment
US8890813B2 (en) * 2009-04-02 2014-11-18 Oblong Industries, Inc. Cross-user hand tracking and shape recognition user interface
EP2010999A4 (en) * 2006-04-21 2012-11-21 Google Inc SYSTEM FOR ORGANIZING AND VISUALIZING DISPLAY OBJECTS
US8972902B2 (en) * 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
US7856605B2 (en) * 2006-10-26 2010-12-21 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US9311528B2 (en) * 2007-01-03 2016-04-12 Apple Inc. Gesture learning
KR100913962B1 (ko) * 2007-05-14 2009-08-26 Samsung Electronics Co., Ltd. Character input method and apparatus for a mobile communication terminal
KR101141087B1 (ko) 2007-09-14 2012-07-12 Intellectual Ventures Holding 67 LLC Processing of gesture-based user interactions
JP4569613B2 (ja) * 2007-09-19 2010-10-27 Sony Corp Image processing apparatus, image processing method, and program
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
KR101417808B1 (ko) * 2007-12-06 2014-07-09 Samsung Electronics Co., Ltd. Digital photographing apparatus, control method thereof, and recording medium storing a program for executing the control method
US8413075B2 (en) * 2008-01-04 2013-04-02 Apple Inc. Gesture movies
US9035876B2 (en) * 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8933876B2 (en) * 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US8650507B2 (en) 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US20090254855A1 (en) * 2008-04-08 2009-10-08 Sony Ericsson Mobile Communications, Ab Communication terminals with superimposed user interface
CN101626631A (zh) * 2008-07-08 2010-01-13 Shenzhen Futaihong Precision Industry Co., Ltd. System and method for invoking functions using images
US8751011B2 (en) 2008-07-11 2014-06-10 Medtronic, Inc. Defining therapy parameter values for posture states
US9050471B2 (en) 2008-07-11 2015-06-09 Medtronic, Inc. Posture state display on medical device user interface
US8332041B2 (en) 2008-07-11 2012-12-11 Medtronic, Inc. Patient interaction with posture-responsive therapy
US8504150B2 (en) 2008-07-11 2013-08-06 Medtronic, Inc. Associating therapy adjustments with posture states using a stability timer
US8401666B2 (en) 2008-07-11 2013-03-19 Medtronic, Inc. Modification profiles for posture-responsive therapy
US8231556B2 (en) 2008-07-11 2012-07-31 Medtronic, Inc. Obtaining baseline patient information
US8708934B2 (en) 2008-07-11 2014-04-29 Medtronic, Inc. Reorientation of patient posture states for posture-responsive therapy
US8688225B2 (en) 2008-07-11 2014-04-01 Medtronic, Inc. Posture state detection using selectable system control parameters
US8249718B2 (en) 2008-07-11 2012-08-21 Medtronic, Inc. Programming posture state-responsive therapy with nominal therapy parameters
JP5161690B2 (ja) * 2008-07-31 2013-03-13 Canon Inc Information processing apparatus and control method therefor
US8280517B2 (en) 2008-09-19 2012-10-02 Medtronic, Inc. Automatic validation techniques for validating operation of medical devices
KR20100041006A (ko) 2008-10-13 2010-04-22 LG Electronics Inc. User interface control method using three-dimensional multi-touch
US9690442B2 (en) * 2008-10-17 2017-06-27 Adobe Systems Incorporated Generating customized effects for image presentation
US8086275B2 (en) 2008-10-23 2011-12-27 Microsoft Corporation Alternative inputs of a mobile communications device
US8411046B2 (en) 2008-10-23 2013-04-02 Microsoft Corporation Column organization of content
KR101513019B1 (ko) * 2008-10-27 2015-04-17 LG Electronics Inc. Mobile terminal and operating method thereof
US8516397B2 (en) * 2008-10-27 2013-08-20 Verizon Patent And Licensing Inc. Proximity interface apparatuses, systems, and methods
US8423916B2 (en) * 2008-11-20 2013-04-16 Canon Kabushiki Kaisha Information processing apparatus, processing method thereof, and computer-readable storage medium
JP4816713B2 (ja) * 2008-11-25 2011-11-16 Sony Corp Information processing device, information processing method, and information processing program
US8205168B1 (en) * 2008-12-01 2012-06-19 Adobe Systems Incorporated Methods and systems for page navigation of dynamically laid-out systems
US20100138797A1 (en) * 2008-12-01 2010-06-03 Sony Ericsson Mobile Communications Ab Portable electronic device with split vision content sharing control and method
KR20100077851A (ko) * 2008-12-29 2010-07-08 LG Electronics Inc. DTV and content display method using the same
JP5168161B2 (ja) * 2009-01-16 2013-03-21 Brother Industries, Ltd. Head-mounted display
US8588465B2 (en) 2009-01-30 2013-11-19 Microsoft Corporation Visual target tracking
US8577085B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US20100199231A1 (en) 2009-01-30 2010-08-05 Microsoft Corporation Predictive determination
US8682028B2 (en) 2009-01-30 2014-03-25 Microsoft Corporation Visual target tracking
US8577084B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US9652030B2 (en) 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US8565477B2 (en) * 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US8267781B2 (en) * 2009-01-30 2012-09-18 Microsoft Corporation Visual target tracking
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
US8565476B2 (en) * 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US20100218100A1 (en) * 2009-02-25 2010-08-26 HNTB Holdings, Ltd. Presentation system
US9024976B2 (en) 2009-03-05 2015-05-05 The Invention Science Fund I, Llc Postural information system and method
US20100228153A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100228494A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining subject advisory information based on prior determined subject advisory information
KR20100101389A (ko) * 2009-03-09 2010-09-17 Samsung Electronics Co., Ltd. Display apparatus providing a user menu, and UI providing method applied thereto
US8756534B2 (en) 2009-03-16 2014-06-17 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8849570B2 (en) * 2009-03-19 2014-09-30 Microsoft Corporation Projected way-finding
US8121640B2 (en) 2009-03-19 2012-02-21 Microsoft Corporation Dual module portable devices
US20100241999A1 (en) * 2009-03-19 2010-09-23 Microsoft Corporation Canvas Manipulation Using 3D Spatial Gestures
US8175653B2 (en) 2009-03-30 2012-05-08 Microsoft Corporation Chromeless user interface
US8238876B2 (en) 2009-03-30 2012-08-07 Microsoft Corporation Notifications
US8810574B2 (en) * 2009-04-02 2014-08-19 Mellmo Inc. Displaying pie charts in a limited display area
JP5256109B2 (ja) 2009-04-23 2013-08-07 Hitachi, Ltd. Display device
US9026223B2 (en) 2009-04-30 2015-05-05 Medtronic, Inc. Therapy system including multiple posture sensors
US9327070B2 (en) 2009-04-30 2016-05-03 Medtronic, Inc. Medical device therapy based on posture and timing
US8175720B2 (en) 2009-04-30 2012-05-08 Medtronic, Inc. Posture-responsive therapy control based on patient input
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US20100289912A1 (en) * 2009-05-14 2010-11-18 Sony Ericsson Mobile Communications Ab Camera arrangement with image modification
US9417700B2 (en) 2009-05-21 2016-08-16 Edge3 Technologies Gesture recognition systems and related methods
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US9400559B2 (en) * 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US8856691B2 (en) * 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8466934B2 (en) * 2009-06-29 2013-06-18 Min Liang Tan Touchscreen interface
US20110029904A1 (en) * 2009-07-30 2011-02-03 Adam Miles Smith Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function
KR100984817B1 (ko) * 2009-08-19 2010-10-01 Company 100 Inc. User interface method using the touchscreen of a mobile communication terminal
EP2472374B1 (en) * 2009-08-24 2019-03-20 Samsung Electronics Co., Ltd. Method for providing a UI using motions
US8429565B2 (en) * 2009-08-25 2013-04-23 Google Inc. Direct manipulation gestures
KR101038323B1 (ko) * 2009-09-24 2011-06-01 Pantech Co., Ltd. Screen frame control apparatus using an image recognition technique
KR101599288B1 (ko) * 2009-09-24 2016-03-04 Samsung Electronics Co., Ltd. Display apparatus and image display method thereof
KR101631451B1 (ko) * 2009-11-16 2016-06-20 LG Electronics Inc. Image display apparatus and operating method thereof
US20110119216A1 (en) * 2009-11-16 2011-05-19 Microsoft Corporation Natural input trainer for gestural instruction
US8988507B2 (en) * 2009-11-19 2015-03-24 Sony Corporation User interface for autofocus
US20110199387A1 (en) * 2009-11-24 2011-08-18 John David Newton Activating Features on an Imaging Device Based on Manipulations
EP2507683A1 (en) * 2009-12-04 2012-10-10 Next Holdings Limited Methods and systems for position detection using an interactive volume
US8358281B2 (en) * 2009-12-15 2013-01-22 Apple Inc. Device, method, and graphical user interface for management and manipulation of user interface elements
KR20110076458A (ko) * 2009-12-29 2011-07-06 LG Electronics Inc. Display apparatus and control method thereof
US8579834B2 (en) 2010-01-08 2013-11-12 Medtronic, Inc. Display of detected patient posture state
US8758274B2 (en) 2010-01-08 2014-06-24 Medtronic, Inc. Automated adjustment of posture state definitions for a medical device
US9956418B2 (en) 2010-01-08 2018-05-01 Medtronic, Inc. Graphical manipulation of posture zones for posture-responsive therapy
US9357949B2 (en) 2010-01-08 2016-06-07 Medtronic, Inc. User interface that displays medical therapy and posture data
US8633890B2 (en) 2010-02-16 2014-01-21 Microsoft Corporation Gesture detection based on joint skipping
EP2539797B1 (en) 2010-02-25 2019-04-03 Hewlett Packard Development Company, L.P. Representative image
EP2548133A4 (en) * 2010-03-18 2016-03-16 Hewlett Packard Development Co INTERACTION WITH A DEVICE
US20120274550A1 (en) * 2010-03-24 2012-11-01 Robert Campbell Gesture mapping for display device
US20110246946A1 (en) * 2010-03-31 2011-10-06 Douglas Weber Apparatus and Method for Interacting with Embedded Objects in Mail Application
CN102859546B (zh) * 2010-03-31 2016-11-02 Rakuten, Inc. Information processing device and information processing method
US9566441B2 (en) 2010-04-30 2017-02-14 Medtronic, Inc. Detecting posture sensor signal shift or drift in medical devices
JP5485470B2 (ja) * 2010-04-30 2014-05-07 Thomson Licensing Method and apparatus for recognizing push and pull gestures in a 3D system
US8396252B2 (en) * 2010-05-20 2013-03-12 Edge 3 Technologies Systems and related methods for three dimensional gesture recognition in vehicles
WO2011146070A1 (en) * 2010-05-21 2011-11-24 Hewlett-Packard Development Company, L.P. System and method for reporting data in a computer vision system
JP2011248768A (ja) * 2010-05-28 2011-12-08 Sony Corp Information processing device, information processing system, and program
US20110310010A1 (en) * 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
US20110317871A1 (en) * 2010-06-29 2011-12-29 Microsoft Corporation Skeletal joint recognition and tracking system
KR101167784B1 (ko) 2010-07-09 2012-07-25 Research & Business Foundation Sungkyunkwan University Pointer recognition method and control command recognition method based on finger movement on the rear of a terminal
WO2012011044A1 (en) 2010-07-20 2012-01-26 Primesense Ltd. Interactive reality augmentation for natural interaction
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
JP5625599B2 (ja) * 2010-08-04 2014-11-19 Sony Corp Information processing device, information processing method, and program
CN102346490B (zh) * 2010-08-05 2014-02-19 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Makeup mirror adjustment system and method, and makeup mirror with the adjustment system
WO2012030958A1 (en) * 2010-08-31 2012-03-08 Activate Systems, Inc. Methods and apparatus for improved motion capture
US8666144B2 (en) 2010-09-02 2014-03-04 Edge 3 Technologies, Inc. Method and apparatus for determining disparity of texture
US8582866B2 (en) 2011-02-10 2013-11-12 Edge 3 Technologies, Inc. Method and apparatus for disparity computation in stereo images
US8655093B2 (en) 2010-09-02 2014-02-18 Edge 3 Technologies, Inc. Method and apparatus for performing segmentation of an image
WO2012030872A1 (en) 2010-09-02 2012-03-08 Edge3 Technologies Inc. Method and apparatus for confusion learning
JP2012060236A (ja) * 2010-09-06 2012-03-22 Sony Corp Image processing device, image processing method, and computer program
US20120069055A1 (en) * 2010-09-22 2012-03-22 Nikon Corporation Image display apparatus
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US9046992B2 (en) 2010-10-01 2015-06-02 Z124 Gesture controls for multi-screen user interface
US20120113223A1 (en) * 2010-11-05 2012-05-10 Microsoft Corporation User Interaction in Augmented Reality
EP2453386B1 (en) * 2010-11-11 2019-03-06 LG Electronics Inc. Multimedia device, multiple image sensors having different types and method for controlling the same
KR20120051212A (ko) * 2010-11-12 2012-05-22 LG Electronics Inc. User gesture recognition method for a multimedia device, and multimedia device therefor
JP5300825B2 (ja) * 2010-11-17 2013-09-25 Sharp Corp Instruction receiving device, instruction receiving method, computer program, and recording medium
US9545188B2 (en) 2010-12-02 2017-01-17 Ultradent Products, Inc. System and method of viewing and tracking stereoscopic video images
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US9171264B2 (en) 2010-12-15 2015-10-27 Microsoft Technology Licensing, Llc Parallel processing machine learning decision tree training
KR101813028B1 (ko) * 2010-12-17 2017-12-28 LG Electronics Inc. Mobile terminal and display control method thereof
US20120159383A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Customization of an immersive environment
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US9575561B2 (en) 2010-12-23 2017-02-21 Intel Corporation Method, apparatus and system for interacting with content on web browsers
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9423951B2 (en) 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
EP3527121B1 (en) 2011-02-09 2023-08-23 Apple Inc. Gesture detection in a 3d mapping environment
US8970589B2 (en) 2011-02-10 2015-03-03 Edge 3 Technologies, Inc. Near-touch interaction with a stereo camera grid structured tessellations
US20140218300A1 (en) * 2011-03-04 2014-08-07 Nikon Corporation Projection device
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US9383917B2 (en) 2011-03-28 2016-07-05 Microsoft Technology Licensing, Llc Predictive tiling
US8717318B2 (en) * 2011-03-29 2014-05-06 Intel Corporation Continued virtual links between gestures and user interface elements
WO2012141350A1 (en) * 2011-04-12 2012-10-18 Lg Electronics Inc. Electronic device and method for displaying stereoscopic image
US9189825B2 (en) 2011-04-12 2015-11-17 Lg Electronics Inc. Electronic device and method for displaying stereoscopic image
WO2012141352A1 (en) * 2011-04-13 2012-10-18 Lg Electronics Inc. Gesture recognition agnostic to device orientation
US8928589B2 (en) * 2011-04-20 2015-01-06 Qualcomm Incorporated Virtual keyboards and methods of providing the same
US8654219B2 (en) * 2011-04-26 2014-02-18 Lg Electronics Inc. Method and apparatus for restoring dead pixel using light intensity map in a time-of-flight camera
JP6106921B2 (ja) 2011-04-26 2017-04-05 Ricoh Co., Ltd. Imaging device, imaging method, and imaging program
KR101514170B1 (ko) * 2011-04-27 2015-04-21 NEC Solution Innovators, Ltd. Input device, input method, and recording medium
US10671841B2 (en) * 2011-05-02 2020-06-02 Microsoft Technology Licensing, Llc Attribute state classification
WO2012157793A1 (en) * 2011-05-17 2012-11-22 Lg Electronics Inc. Gesture recognition method and apparatus
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US8769409B2 (en) * 2011-05-27 2014-07-01 Cyberlink Corp. Systems and methods for improving object detection
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US20120304132A1 (en) 2011-05-27 2012-11-29 Chaitanya Dev Sareen Switching back to a previously-interacted-with application
US8661339B2 (en) 2011-05-31 2014-02-25 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US20120311503A1 (en) * 2011-06-06 2012-12-06 Microsoft Corporation Gesture to trigger application-pertinent information
US11048333B2 (en) 2011-06-23 2021-06-29 Intel Corporation System and method for close-range movement tracking
JP6074170B2 (ja) * 2011-06-23 2017-02-01 Intel Corp System and method for close-range motion tracking
JP2013016018A (ja) * 2011-07-04 2013-01-24 Canon Inc Display control device, control method, and program
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
WO2013015462A1 (ko) * 2011-07-22 2013-01-31 LG Electronics Inc. Electronic device operating according to a user gesture, and operation control method of the electronic device
US8687023B2 (en) 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
ES2958183T3 (es) 2024-02-05 Samsung Electronics Co Ltd Method for controlling electronic devices based on voice and motion recognition, and electronic device applying the same
JP5701714B2 (ja) * 2011-08-05 2015-04-15 Toshiba Corp Gesture recognition device, gesture recognition method, and gesture recognition program
KR101262700B1 (ko) 2011-08-05 2013-05-08 Samsung Electronics Co., Ltd. Method for controlling an electronic device using voice recognition and motion recognition, and electronic device applying the same
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
TWI575494B (zh) * 2011-08-19 2017-03-21 Semiconductor Energy Laboratory Co., Ltd. Method for driving a semiconductor device
JP5921835B2 (ja) * 2011-08-23 2016-05-24 Hitachi Maxell, Ltd. Input device
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) * 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US9557909B2 (en) 2011-09-09 2017-01-31 Microsoft Technology Licensing, Llc Semantic zoom linguistic helpers
US8922575B2 (en) 2011-09-09 2014-12-30 Microsoft Corporation Tile cache
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US8933952B2 (en) 2011-09-10 2015-01-13 Microsoft Corporation Pre-rendering new content for an application-selectable user interface
US9244802B2 (en) 2011-09-10 2016-01-26 Microsoft Technology Licensing, Llc Resource user interface
EP2756427A4 (en) * 2011-09-12 2015-07-29 Intel Corp METHOD AND APPARATUS FOR ANNOTATION AND / OR RECOMMENDATION OF VIDEO CONTENT
MX2014003131A (es) * 2011-09-16 2014-08-27 Landmark Graphics Corp Methods and systems for gesture-based control of petrotechnical applications
US9129400B1 (en) * 2011-09-23 2015-09-08 Amazon Technologies, Inc. Movement prediction for image capture
US20130080976A1 (en) * 2011-09-28 2013-03-28 Microsoft Corporation Motion controlled list scrolling
TWI437467B (zh) 2011-10-11 2014-05-11 Ind Tech Res Inst 顯示控制裝置及顯示控制方法
US9672609B1 (en) 2011-11-11 2017-06-06 Edge 3 Technologies, Inc. Method and apparatus for improved depth-map estimation
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
KR101870902B1 (ko) * 2011-12-12 2018-06-26 Samsung Electronics Co., Ltd. Image processing apparatus and image processing method
US9223472B2 (en) 2011-12-22 2015-12-29 Microsoft Technology Licensing, Llc Closing applications
WO2013095678A1 (en) * 2011-12-23 2013-06-27 Intel Corporation Mechanism to provide feedback regarding computing system command gestures
US10015557B2 (en) 2011-12-31 2018-07-03 Intel Corporation Content-based control system
KR20130081580A (ko) * 2012-01-09 2013-07-17 Samsung Electronics Co., Ltd. Display device and control method thereof
US9625993B2 (en) * 2012-01-11 2017-04-18 Biosense Webster (Israel) Ltd. Touch free operation of devices by use of depth sensors
US9931154B2 (en) 2012-01-11 2018-04-03 Biosense Webster (Israel), Ltd. Touch free operation of ablator workstation by use of depth sensors
JP2013150086A (ja) * 2012-01-18 2013-08-01 Sony Corp Behavior information recognition system, information processing device, and behavior information recognition method
JP5509227B2 (ja) 2012-01-31 2014-06-04 Konami Digital Entertainment Co., Ltd. Movement control device, control method for a movement control device, and program
US20130198690A1 (en) * 2012-02-01 2013-08-01 Microsoft Corporation Visual indication of graphical user interface relationship
US20130211843A1 (en) * 2012-02-13 2013-08-15 Qualcomm Incorporated Engagement-dependent gesture recognition
US9128605B2 (en) 2012-02-16 2015-09-08 Microsoft Technology Licensing, Llc Thumbnail-image selection of applications
JPWO2013121807A1 (ja) * 2012-02-17 2015-05-11 Sony Corp Information processing device, information processing method, and computer program
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US20130229345A1 (en) * 2012-03-01 2013-09-05 Laura E. Day Manual Manipulation of Onscreen Objects
US10503373B2 (en) * 2012-03-14 2019-12-10 Sony Interactive Entertainment LLC Visual feedback for highlight-driven gesture user interfaces
EP2650754A3 (en) 2012-03-15 2014-09-24 Omron Corporation Gesture recognition apparatus, electronic device, gesture recognition method, control program, and recording medium
CN104246682B (zh) 2012-03-26 2017-08-25 Apple Inc. Enhanced virtual touchpad and touchscreen
US9907959B2 (en) 2012-04-12 2018-03-06 Medtronic, Inc. Velocity detection for posture-responsive therapy
US9239624B2 (en) 2012-04-13 2016-01-19 Nokia Technologies Oy Free hand gesture control of automotive user interface
US9448635B2 (en) 2012-04-16 2016-09-20 Qualcomm Incorporated Rapid gesture re-engagement
JP5778231B2 (ja) * 2012-04-17 2015-09-16 Sharp Corp Menu display device, menu display method, menu display program, television receiver including the menu display device, and recording medium
TWI454966B (zh) * 2012-04-24 2014-10-01 Wistron Corp Gesture control method and gesture control device
US9737719B2 (en) 2012-04-26 2017-08-22 Medtronic, Inc. Adjustment of therapy based on acceleration
US20130293454A1 (en) * 2012-05-04 2013-11-07 Samsung Electronics Co. Ltd. Terminal and method for controlling the same based on spatial interaction
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
JP5924407B2 (ja) * 2012-05-22 2016-05-25 Sony Corp Image processing device, image processing method, and program
US9747306B2 (en) * 2012-05-25 2017-08-29 Atheer, Inc. Method and apparatus for identifying input features for later recognition
JP6351579B2 (ja) 2012-06-01 2018-07-04 Ultradent Products Inc Stereoscopic video imaging
US9690465B2 (en) 2012-06-01 2017-06-27 Microsoft Technology Licensing, Llc Control of remote applications using companion device
US8965696B2 (en) 2012-06-05 2015-02-24 Apple Inc. Providing navigation instructions while operating navigation application in background
US9482296B2 (en) 2012-06-05 2016-11-01 Apple Inc. Rendering road signs during navigation
US9230556B2 (en) 2012-06-05 2016-01-05 Apple Inc. Voice instructions during navigation
US9047691B2 (en) * 2012-06-05 2015-06-02 Apple Inc. Route display and review
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US8983778B2 (en) 2012-06-05 2015-03-17 Apple Inc. Generation of intersection information by a mapping service
US9159153B2 (en) 2012-06-05 2015-10-13 Apple Inc. Method, system and apparatus for providing visual feedback of a map view change
US10176633B2 (en) 2012-06-05 2019-01-08 Apple Inc. Integrated mapping and navigation application
US9418672B2 (en) 2012-06-05 2016-08-16 Apple Inc. Navigation application with adaptive instruction text
US9311750B2 (en) * 2012-06-05 2016-04-12 Apple Inc. Rotation operations in a mapping application
US9997069B2 (en) 2012-06-05 2018-06-12 Apple Inc. Context-aware voice guidance
US9043722B1 (en) * 2012-06-19 2015-05-26 Surfwax, Inc. User interfaces for displaying relationships between cells in a grid
TWI463371B (zh) * 2012-06-20 2014-12-01 Pixart Imaging Inc Gesture detection device and method for determining continuous gestures based on speed
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US8934675B2 (en) 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
CN103529928B (zh) * 2012-07-06 2017-03-01 Pixart Imaging Inc Gesture detection device and method for determining continuous gestures based on speed
US9575565B2 (en) * 2012-07-13 2017-02-21 Juice Design Co., Ltd. Element selection device, element selection method, and program
KR20140014548A (ko) * 2012-07-24 2014-02-06 Samsung Electronics Co., Ltd. Electronic device, control method thereof, and computer-readable recording medium
US9870642B2 (en) * 2012-08-10 2018-01-16 Here Global B.V. Method and apparatus for layout for augmented reality view
JP5665140B2 (ja) * 2012-08-17 2015-02-04 NEC Solution Innovators, Ltd. Input device, input method, and program
JP5798532B2 (ja) * 2012-08-23 2015-10-21 NTT Docomo, Inc. User interface device, user interface method, and program
JP6025473B2 (ja) * 2012-09-14 2016-11-16 Canon Inc Information processing device, information processing method, and program
JP5383885B1 (ja) * 2012-10-04 2014-01-08 NTT Docomo, Inc. Information processing device
EP2905676A4 (en) * 2012-10-05 2016-06-15 Nec Solution Innovators Ltd USER INTERFACE DEVICE AND USER INTERFACE PROCEDURE
CN103777857A (zh) * 2012-10-24 2014-05-07 Tencent Technology (Shenzhen) Co., Ltd. Method and apparatus for rotating a video image
JP2014085965A (ja) * 2012-10-25 2014-05-12 Nec Personal Computers Ltd Information processing device, information processing method, and program
JP6341924B2 (ja) * 2012-11-09 2018-06-13 Thomson Licensing Handheld display zoom function
KR20140063272A (ko) * 2012-11-16 2014-05-27 LG Electronics Inc. Image display apparatus and operating method thereof
WO2014081104A1 (en) * 2012-11-21 2014-05-30 Lg Electronics Inc. Multimedia device for having touch sensor and method for controlling the same
TWI502519B (zh) * 2012-11-21 2015-10-01 Wistron Corp Gesture recognition module and gesture recognition method
US9152234B2 (en) * 2012-12-02 2015-10-06 Apple Inc. Detecting user intent to remove a pluggable peripheral device
WO2014088621A1 (en) * 2012-12-03 2014-06-12 Google, Inc. System and method for detecting gestures
TWI454971B (zh) * 2012-12-11 2014-10-01 Pixart Imaging Inc Electronic device control method and electronic device using the same
CN103902026A (zh) * 2012-12-25 2014-07-02 Hongfujin Precision Industry (Wuhan) Co., Ltd. Automatic display screen adjustment system and method
CN103914126A (zh) * 2012-12-31 2014-07-09 Tencent Technology (Shenzhen) Co., Ltd. Multimedia player control method and apparatus
CN103926999B (zh) * 2013-01-16 2017-03-01 Ricoh Co., Ltd. Palm opening and closing gesture recognition method and device, and human-computer interaction method and device
CN103970455B (zh) * 2013-01-28 2018-02-27 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
KR20140099111A (ko) * 2013-02-01 2014-08-11 Samsung Electronics Co., Ltd. Method for controlling operation of a camera device, and the camera device
US9720504B2 (en) * 2013-02-05 2017-08-01 Qualcomm Incorporated Methods for system engagement via 3D object detection
US9159116B2 (en) 2013-02-13 2015-10-13 Google Inc. Adaptive screen interfaces based on viewing distance
WO2014131197A1 (en) * 2013-03-01 2014-09-04 Microsoft Corporation Object creation using body gestures
JP5372273B2 (ja) * 2013-03-07 2013-12-18 Fujitsu Ten Ltd Display device
US10721448B2 (en) 2013-03-15 2020-07-21 Edge 3 Technologies, Inc. Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
US10152135B2 (en) * 2013-03-15 2018-12-11 Intel Corporation User interface responsive to operator position and gestures
US20140282275A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a zooming gesture
US9450952B2 (en) 2013-05-29 2016-09-20 Microsoft Technology Licensing, Llc Live tiles without application-code execution
US9298266B2 (en) * 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US10394410B2 (en) 2013-05-09 2019-08-27 Amazon Technologies, Inc. Mobile device interfaces
DE102013209701A1 (de) * 2013-05-24 2014-11-27 Johnson Controls GmbH Display of graphic elements depending on a movement of an object
WO2014194314A1 (en) * 2013-05-31 2014-12-04 Freedom Scientific, Inc. Vector-based customizable pointing indicia
WO2015001546A1 (en) * 2013-07-01 2015-01-08 Inuitive Ltd. Rotating display content responsive to a rotational gesture of a body part
US10533850B2 (en) 2013-07-12 2020-01-14 Magic Leap, Inc. Method and system for inserting recognized object data into a virtual world
KR102033077B1 (ko) 2013-08-07 2019-10-16 Nike Innovate C.V. Wrist-worn athletic device with gesture recognition and power management
TW201510771A (zh) 2013-09-05 2015-03-16 Utechzone Co Ltd Pointing position detection apparatus and method thereof, program, and computer-readable recording medium
US9766855B2 (en) * 2013-09-10 2017-09-19 Avigilon Corporation Method and apparatus for controlling surveillance system with gesture and/or audio commands
CN104463782B (zh) * 2013-09-16 2018-06-01 Lenovo (Beijing) Co., Ltd. Image processing method, device, and electronic apparatus
JP6749837B2 (ja) * 2013-10-01 2020-09-02 Quantum Interface LLC Methods and systems using motion sensors having active sensing areas
US10152136B2 (en) * 2013-10-16 2018-12-11 Leap Motion, Inc. Velocity field interaction for free space gesture interface and control
US10168873B1 (en) 2013-10-29 2019-01-01 Leap Motion, Inc. Virtual interactions for machine control
US9996797B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Interactions with virtual objects for machine control
US20150123890A1 (en) * 2013-11-04 2015-05-07 Microsoft Corporation Two hand natural user input
KR102173123B1 (ko) * 2013-11-22 2020-11-02 Samsung Electronics Co., Ltd. Method and apparatus for recognizing a specific object in an image in an electronic device
US10126822B2 (en) 2013-12-16 2018-11-13 Leap Motion, Inc. User-defined virtual interaction space and manipulation of virtual configuration
KR20150073378A (ko) * 2013-12-23 2015-07-01 Samsung Electronics Co., Ltd. Apparatus and method for displaying a user interface (UI) of a virtual input device based on motion recognition
JP6222830B2 (ja) * 2013-12-27 2017-11-01 Maxell Holdings, Ltd. Image projection apparatus
US9971491B2 (en) * 2014-01-09 2018-05-15 Microsoft Technology Licensing, Llc Gesture library for natural user input
KR102152236B1 (ko) * 2014-01-29 2020-09-04 LG Innotek Co., Ltd. Camera module
US9619105B1 (en) * 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
JP5888793B2 (ja) * 2014-03-28 2016-03-22 Colopl Inc. Object control program and object control method
CN105359094A (zh) 2014-04-04 2016-02-24 Microsoft Technology Licensing LLC Scalable application representation
CN105359055A (zh) 2014-04-10 2016-02-24 Microsoft Technology Licensing LLC Slider cover for a computing device
CN105378582B (zh) 2014-04-10 2019-07-23 Microsoft Technology Licensing LLC Collapsible shell cover for a computing device
JP6303772B2 (ja) * 2014-04-25 2018-04-04 Fujitsu Ltd Input control device, control method, and control program
US10579207B2 (en) * 2014-05-14 2020-03-03 Purdue Research Foundation Manipulating virtual environment using non-instrumented physical object
US9766702B2 (en) 2014-06-19 2017-09-19 Apple Inc. User detection by a computing device
US20150378440A1 (en) * 2014-06-27 2015-12-31 Microsoft Technology Licensing, Llc Dynamically Directing Interpretation of Input Data Based on Contextual Information
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US9696551B2 (en) * 2014-08-13 2017-07-04 Beijing Lenovo Software Ltd. Information processing method and electronic device
CA2993876C (en) * 2014-08-15 2023-03-07 The University Of British Columbia Methods and systems for performing medical procedures and for accessing and/or manipulating medically relevant information
US10642365B2 (en) 2014-09-09 2020-05-05 Microsoft Technology Licensing, Llc Parametric inertia and APIs
WO2016065568A1 (en) 2014-10-30 2016-05-06 Microsoft Technology Licensing, Llc Multi-configuration input device
KR101636460B1 (ko) * 2014-11-05 2016-07-05 Samsung Electronics Co., Ltd. Electronic device and control method thereof
JP2016110250A (ja) * 2014-12-03 2016-06-20 Nihon Unisys, Ltd. Gesture recognition system, gesture recognition method, and computer program
KR20160076857A (ko) * 2014-12-23 2016-07-01 LG Electronics Inc. Mobile terminal and content control method thereof
US9696795B2 (en) 2015-02-13 2017-07-04 Leap Motion, Inc. Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments
US10429923B1 (en) 2015-02-13 2019-10-01 Ultrahaptics IP Two Limited Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
CN106168774A (zh) * 2015-05-20 2016-11-30 Xi'an ZTE New Software Co., Ltd. Information processing method and electronic device
US9529454B1 (en) * 2015-06-19 2016-12-27 Microsoft Technology Licensing, Llc Three-dimensional user input
CN106293444B (zh) 2015-06-25 2020-07-03 Xiaomi Technology Co., Ltd. Mobile terminal, display control method, and device
US10281976B2 (en) 2015-07-07 2019-05-07 Seiko Epson Corporation Display device, control method for display device, and computer program
US10397484B2 (en) * 2015-08-14 2019-08-27 Qualcomm Incorporated Camera zoom based on sensor data
JP2017041002A (ja) 2015-08-18 2017-02-23 Canon Inc. Display control device, display control method, and display control program
US10101803B2 (en) 2015-08-26 2018-10-16 Google Llc Dynamic switching and merging of head, gesture and touch input in virtual reality
US9910641B2 (en) 2015-10-14 2018-03-06 Microsoft Technology Licensing, Llc Generation of application behaviors
US10133441B2 (en) 2015-11-30 2018-11-20 Kabushiki Kaisha Toshiba Electronic apparatus and method
WO2017104525A1 (ja) * 2015-12-17 2017-06-22 Konica Minolta, Inc. Input device, electronic apparatus, and head-mounted display
US10429935B2 (en) 2016-02-08 2019-10-01 Comcast Cable Communications, Llc Tremor correction for gesture recognition
DE102016206529A1 (de) 2016-04-19 2017-10-19 Robert Bosch Gmbh Assembly workstation with position-determining device
US9972119B2 (en) 2016-08-11 2018-05-15 Microsoft Technology Licensing, Llc Virtual object hand-off and manipulation
US10126827B2 (en) * 2016-08-11 2018-11-13 Chi Fai Ho Apparatus and method to navigate media content using repetitive 3D gestures
JP6817602B2 (ja) * 2016-10-07 2021-01-20 Panasonic IP Management Co., Ltd. Surveillance video analysis system, surveillance video analysis method, and surveillance video analysis program
CN106681503A (zh) * 2016-12-19 2017-05-17 HKC Corporation Limited Display control method, terminal, and display device
US20190369807A1 (en) 2017-02-13 2019-12-05 Sony Corporation Information processing device, information processing method, and program
CA3052869A1 (en) 2017-02-17 2018-08-23 Nz Technologies Inc. Methods and systems for touchless control of surgical environment
US10620779B2 (en) 2017-04-24 2020-04-14 Microsoft Technology Licensing, Llc Navigating a holographic image
JP2019086916A (ja) * 2017-11-02 2019-06-06 Olympus Corporation Work support device, work support method, and work support program
US10937240B2 (en) 2018-01-04 2021-03-02 Intel Corporation Augmented reality bindings of physical objects and virtual objects
JP2019139332A (ja) 2018-02-06 2019-08-22 Fujitsu Limited Information processing device, information processing method, and information processing program
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
JP2019207574A (ja) * 2018-05-29 2019-12-05 Fuji Xerox Co., Ltd. Information processing device, information processing system, and program
US11416077B2 (en) * 2018-07-19 2022-08-16 Infineon Technologies Ag Gesture detection system and method using a radar sensor
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
US10770035B2 (en) 2018-08-22 2020-09-08 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
US10698603B2 (en) * 2018-08-24 2020-06-30 Google Llc Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
US10921854B2 (en) 2018-09-06 2021-02-16 Apple Inc. Electronic device with sensing strip
US10788880B2 (en) 2018-10-22 2020-09-29 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
US10761611B2 (en) 2018-11-13 2020-09-01 Google Llc Radar-image shaper for radar-based applications
US11320911B2 (en) * 2019-01-11 2022-05-03 Microsoft Technology Licensing, Llc Hand motion and orientation-aware buttons and grabbable objects in mixed reality
WO2021051200A1 (en) * 2019-09-17 2021-03-25 Huawei Technologies Co., Ltd. User interface control based on elbow-anchored arm gestures
US11175730B2 (en) 2019-12-06 2021-11-16 Facebook Technologies, Llc Posture-based virtual space configurations
KR102145523B1 (ko) * 2019-12-12 2020-08-18 Samsung Electronics Co., Ltd. Method for controlling operation of a camera device, and the camera device
KR102346294B1 (ko) * 2020-03-03 2022-01-04 VTouch Co., Ltd. Method, system, and non-transitory computer-readable recording medium for estimating a user's gesture from a two-dimensional image
DE102020203164A1 (de) 2020-03-12 2021-09-16 Robert Bosch Gesellschaft mit beschränkter Haftung Workstation with a work plan for data-processing tools
CN112639689A (zh) * 2020-04-30 2021-04-09 Huawei Technologies Co., Ltd. Air-gesture-based control method, apparatus, and system
DE112021003415T5 (de) 2020-06-26 2023-07-27 Apple Inc. Devices, methods, and graphical user interfaces for content applications
US11256336B2 (en) * 2020-06-29 2022-02-22 Facebook Technologies, Llc Integration of artificial reality interaction modes
KR102238635B1 (ko) * 2020-08-11 2021-04-09 Samsung Electronics Co., Ltd. Method for controlling operation of a camera device, and the camera device
CN111984117A (zh) * 2020-08-12 2020-11-24 Shenzhen Skyworth-RGB Electronics Co., Ltd. Panoramic map control method, apparatus, device, and storage medium
AU2021349382B2 (en) 2020-09-25 2023-06-29 Apple Inc. Methods for adjusting and/or controlling immersion associated with user interfaces
KR102596341B1 (ko) 2020-09-25 2023-11-01 Apple Inc. Methods for manipulating objects in an environment
CN116209974A (zh) * 2020-09-25 2023-06-02 Apple Inc. Methods for navigating user interfaces
CN116670627A (zh) 2020-12-31 2023-08-29 Apple Inc. Methods for grouping user interfaces in an environment
KR102760282B1 (ko) * 2021-01-14 2025-02-03 Electronics and Telecommunications Research Institute Hand-gesture recognition device and hand-gesture recognition method
US20220244791A1 (en) * 2021-01-24 2022-08-04 Chian Chiu Li Systems And Methods for Gesture Input
US11995230B2 (en) 2021-02-11 2024-05-28 Apple Inc. Methods for presenting and sharing content in an environment
US12453859B2 (en) 2021-02-24 2025-10-28 Medtronic, Inc. Posture state definition calibration
US11960790B2 (en) * 2021-05-27 2024-04-16 Microsoft Technology Licensing, Llc Spatial attention model enhanced voice engagement system
CN115480643A (zh) * 2021-06-17 2022-12-16 Shenzhen Realis Multimedia Technology Co., Ltd. Interaction method and device for a UE4-based holographic sandbox
CN113791685A (zh) * 2021-08-16 2021-12-14 Qingdao Haier Technology Co., Ltd. Method and device for moving a component, electronic device, and storage medium
US12026317B2 (en) 2021-09-16 2024-07-02 Apple Inc. Electronic devices with air input sensors
EP4388501A1 (en) 2021-09-23 2024-06-26 Apple Inc. Devices, methods, and graphical user interfaces for content applications
WO2023049670A1 (en) 2021-09-25 2023-03-30 Apple Inc. Devices, methods, and graphical user interfaces for presenting virtual objects in virtual environments
US12456271B1 (en) 2021-11-19 2025-10-28 Apple Inc. System and method of three-dimensional object cleanup and text annotation
EP4466593A1 (en) 2022-01-19 2024-11-27 Apple Inc. Methods for displaying and repositioning objects in an environment
US12272005B2 (en) 2022-02-28 2025-04-08 Apple Inc. System and method of three-dimensional immersive applications in multi-user communication sessions
US12321666B2 (en) 2022-04-04 2025-06-03 Apple Inc. Methods for quick message response and dictation in a three-dimensional environment
CN114840086B (zh) * 2022-05-10 2024-07-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, electronic device, and computer storage medium
US12394167B1 (en) 2022-06-30 2025-08-19 Apple Inc. Window resizing and virtual object rearrangement in 3D environments
US12112011B2 (en) 2022-09-16 2024-10-08 Apple Inc. System and method of application-based three-dimensional refinement in multi-user communication sessions
US12468396B2 (en) * 2023-09-07 2025-11-11 Snap Inc. Virtual manipulation of augmented and virtual reality objects
WO2025094516A1 (ja) * 2023-11-02 2025-05-08 Sony Semiconductor Solutions Corporation Information processing device, information processing method, and program

Family Cites Families (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5594469A (en) 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
JPH08332170A (ja) 1995-06-08 1996-12-17 Matsushita Electric Ind Co Ltd Videoscope
US6144366A (en) 1996-10-18 2000-11-07 Kabushiki Kaisha Toshiba Method and apparatus for generating information input using reflected light image of target object
KR19990011180A (ko) 1997-07-22 1999-02-18 Koo Ja-Hong Menu selection method using image recognition
EP0905644A3 (en) * 1997-09-26 2004-02-25 Matsushita Electric Industrial Co., Ltd. Hand gesture recognizing device
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
JPH11338120A (ja) 1998-05-27 1999-12-10 Dainippon Screen Mfg Co Ltd Layout device
JP2000075991A (ja) 1998-08-28 2000-03-14 Aqueous Research:Kk Information input device
DE19839638C2 (de) 1998-08-31 2000-06-21 Siemens Ag System for enabling self-monitoring, by the moving person, of body movement sequences to be performed
JP5048890B2 (ja) 1998-10-13 2012-10-17 Sony Electronics Inc. Motion detection interface
US6501515B1 (en) 1998-10-13 2002-12-31 Sony Corporation Remote control system
US7267148B2 (en) * 1999-08-10 2007-09-11 Michelin Recherche Et Technique S.A. Measurement of adherence between a vehicle wheel and the roadway
EP1148411A3 (en) 2000-04-21 2005-09-14 Sony Corporation Information processing apparatus and method for recognising user gesture
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US7000200B1 (en) * 2000-09-15 2006-02-14 Intel Corporation Gesture recognition system recognizing gestures within a specified timing
US8487872B2 (en) 2000-11-14 2013-07-16 Blue Orb, Inc. Apparatus and method for generating data signals
US6827579B2 (en) 2000-11-16 2004-12-07 Rutgers, The State University Of Nj Method and apparatus for rehabilitation of neuromotor disorders
US7274800B2 (en) * 2001-07-18 2007-09-25 Intel Corporation Dynamic gesture recognition from stereo sequences
JP4304337B2 (ja) 2001-09-17 2009-07-29 National Institute of Advanced Industrial Science and Technology Interface device
US7340077B2 (en) * 2002-02-15 2008-03-04 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US7348963B2 (en) * 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US7665041B2 (en) * 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US7233316B2 (en) 2003-05-01 2007-06-19 Thomson Licensing Multimedia user interface
JP2005051472A (ja) 2003-07-28 2005-02-24 Nikon Corp Automatic shooting control device, automatic shooting program, and camera
US7301529B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Context dependent gesture response
US7180500B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited User definable gestures for motion controlled handheld devices
US20050239028A1 (en) 2004-04-03 2005-10-27 Wu Chang J R Stance guide and method of use
JP2005301693A (ja) 2004-04-12 2005-10-27 Japan Science & Technology Agency Video editing system
US7308112B2 (en) 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
JP4692159B2 (ja) 2004-08-31 2011-06-01 Panasonic Electric Works Co., Ltd. Gesture switch
JP2006130221A (ja) 2004-11-09 2006-05-25 Konica Minolta Medical & Graphic Inc Medical image transfer device, program, and storage medium
US20060164382A1 (en) 2005-01-25 2006-07-27 Technology Licensing Company, Inc. Image manipulation in response to a movement of a display
CN101536494B (zh) 2005-02-08 2017-04-26 Oblong Industries, Inc. System and method for a gesture-based control system
JP4389821B2 (ja) * 2005-03-22 2009-12-24 Sony Corporation Body movement detection device, content playback device, body movement detection method, and content playback method
US20070057912A1 (en) 2005-09-14 2007-03-15 Romriell Joseph N Method and system for controlling an interface of a device through motion gestures
US8589824B2 (en) 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
JP2008040576A (ja) 2006-08-02 2008-02-21 Sharp Corp Image processing system and video display device including the system
KR100783552B1 (ko) 2006-10-11 2007-12-07 Samsung Electronics Co., Ltd. Input control method and apparatus for a portable terminal
JP2008146243A (ja) 2006-12-07 2008-06-26 Toshiba Corp Information processing device, information processing method, and program
US20090109036A1 (en) 2007-10-29 2009-04-30 The Boeing Company System and Method for Alternative Communication
US20090164937A1 (en) * 2007-12-20 2009-06-25 Alden Alviar Scroll Apparatus and Method for Manipulating Data on an Electronic Device Display
US8253746B2 (en) 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
KR101623007B1 (ko) 2009-11-11 2016-05-20 LG Electronics Inc. Display apparatus and control method thereof
JP5485470B2 (ja) 2010-04-30 2014-05-07 Thomson Licensing Method and apparatus for recognizing push and pull gestures in a 3D system
US8457353B2 (en) 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface

Cited By (22)

Publication number Priority date Publication date Assignee Title
JP2011118477A (ja) * 2009-11-30 2011-06-16 Sony Corp Information processing apparatus, information processing method, and program
US10871833B2 (en) 2009-11-30 2020-12-22 Sony Corporation Information processing apparatus, method and computer-readable medium
US9536272B2 (en) 2009-11-30 2017-01-03 Sony Corporation Information processing apparatus, method and computer-readable medium
JP2013533541A (ja) * 2010-06-10 2013-08-22 Microsoft Corporation Character selection
JP2012018620A (ja) * 2010-07-09 2012-01-26 Canon Inc Information processing apparatus and control method therefor
JP2012022458A (ja) * 2010-07-13 2012-02-02 Canon Inc Information processing apparatus and control method therefor
JP2014503084A (ja) * 2010-07-27 2014-02-06 Telcordia Technologies Inc. Interactive projection and playback of relevant media segments on the facets of three-dimensional shapes
JP2013232200A (ja) * 2010-09-22 2013-11-14 Nikon Corp Image display device
US9456203B2 (en) 2010-11-11 2016-09-27 Sony Corporation Information processing apparatus, stereoscopic display method, and program
JP2012103982A (ja) * 2010-11-11 2012-05-31 Sony Corp Information processing apparatus, stereoscopic display method, and program
US10652515B2 (en) 2010-11-11 2020-05-12 Sony Corporation Information processing apparatus, stereoscopic display method, and program
US10349034B2 (en) 2010-11-11 2019-07-09 Sony Corporation Information processing apparatus, stereoscopic display method, and program
JP2012103981A (ja) * 2010-11-11 2012-05-31 Sony Corp Information processing apparatus, stereoscopic display method, and program
US9285883B2 (en) 2011-03-01 2016-03-15 Qualcomm Incorporated System and method to display content based on viewing orientation
JP2014510344A (ja) * 2011-03-01 2014-04-24 Qualcomm Incorporated System and method for displaying content
CN103502912B (zh) * 2011-05-09 2017-11-07 Koninklijke Philips N.V. Rotating an object on a screen
CN103502912A (zh) * 2011-05-09 2014-01-08 Koninklijke Philips N.V. Rotating an object on a screen
US9264693B2 (en) 2011-12-26 2016-02-16 Semiconductor Energy Laboratory Co., Ltd. Motion recognition device
US9465437B2 (en) 2012-02-23 2016-10-11 Intel Corporation Method and apparatus for controlling screen by tracking head of user through camera module, and computer-readable recording medium therefor
CN109324726A (zh) * 2014-06-16 2019-02-12 Huawei Technologies Co., Ltd. Icon movement method, device, and electronic device
JP2015038777A (ja) * 2014-11-12 2015-02-26 Seiko Epson Corporation Position detection system, display system, and information processing system
US11431887B2 (en) 2018-07-24 2022-08-30 Sony Semiconductor Solutions Corporation Information processing device and method for detection of a sound image object

Also Published As

Publication number Publication date
WO2009111329A3 (en) 2010-01-07
US20090228841A1 (en) 2009-09-10
JP2014099184A (ja) 2014-05-29
JP5855343B2 (ja) 2016-02-09
US9772689B2 (en) 2017-09-26
JP2011517357A (ja) 2011-06-02

Similar Documents

Publication Publication Date Title
US9772689B2 (en) Enhanced gesture-based image manipulation
US11954265B2 (en) Enhanced input using recognized gestures
US20240061511A1 (en) Dynamic, free-space user interactions for machine control
US10817130B2 (en) Dynamic user interactions for display control and measuring degree of completeness of user gestures
US8514251B2 (en) Enhanced character input using recognized gestures
US10268339B2 (en) Enhanced camera-based input
WO2014113454A1 (en) Dynamic, free-space user interactions for machine control
US20150355717A1 (en) Switching input rails without a release command in a natural user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09717219

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 2010549767

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09717219

Country of ref document: EP

Kind code of ref document: A2