EP3742263A1 - Terminal and method for controlling the same based on spatial interaction - Google Patents

Terminal and method for controlling the same based on spatial interaction

Info

Publication number
EP3742263A1
Authority
EP
European Patent Office
Prior art keywords
hand
terminal
user
gesture
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP20185022.9A
Other languages
German (de)
French (fr)
Inventor
Jinyoung Jeon
Jiyoung Kang
Boyoung Lee
Daesung Kim
Jinyong Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020120047678A external-priority patent/KR20130124061A/en
Priority claimed from KR1020120132222A external-priority patent/KR20130124138A/en
Priority claimed from KR1020120132223A external-priority patent/KR20130124139A/en
Priority claimed from KR1020120137740A external-priority patent/KR20130124143A/en
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of EP3742263A1 publication Critical patent/EP3742263A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72442User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for playing music files
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the present invention relates to a terminal and a method for controlling the same based on spatial interaction. More particularly, the present invention relates to a terminal and method for recognizing distinct spatial gestures and executing the input corresponding to a recognized spatial gesture interactively.
  • representative interaction-based device control technologies include touch- or motion-based device control and hand gesture-based device control.
  • an aspect of the present invention provides a terminal and method for controlling the same in such a way of recognizing distinct spatial gestures and executing the input corresponding to a recognized spatial gesture interactively.
  • a method for controlling a terminal includes detecting a spatial gesture, analyzing the spatial gesture, and executing an action depending on the analysis result.
  • a method for controlling a terminal includes detecting a user gesture moving an object in a certain direction within proximity of the terminal, determining at least one of the movement direction, movement speed, and movement distance of the user gesture, and performing a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture.
  • a terminal in accordance with another aspect of the present invention, includes a sensing unit which detects a user gesture moving an object in a certain direction within proximity of the terminal, a control unit which determines at least one of the movement direction, movement speed, and movement distance of the user gesture and performs a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture, and a display unit which displays an execution screen of the application under the control of the control unit.
  • a method for controlling a terminal inside a vehicle includes initiating a driving mode based on at least one of a movement state of the terminal and a movement state of the vehicle, detecting a user gesture within proximity of the terminal when the terminal is in the driving mode, and executing a function associated with the corresponding user gesture when the terminal is in the driving mode.
  • a terminal in accordance with another aspect of the present invention, includes a sensing unit which detects a user gesture moving in a certain direction within proximity of the terminal, a control unit which initiates a driving mode based on at least one of a movement state of the terminal and a movement state of the vehicle and which executes, when the user gesture is detected by the sensing unit while the terminal is in the driving mode, a function associated with the corresponding user gesture, and an output unit which outputs a function execution result under the control of the control unit.
  • a method for controlling a terminal includes entering a specific operation mode according to a state of the terminal, detecting a user gesture within proximity of the terminal when the terminal is in the specific operation mode, and displaying information associated with the operation mode in response to the user gesture.
  • a terminal in accordance with another aspect of the present invention, includes a sensing unit which detects a user gesture within proximity of the terminal, a display unit which displays information, and a control unit which controls entering a specific operation mode according to a condition of the terminal and displaying, when the user gesture is detected by the sensing unit while the terminal is in the specific operation mode, the information associated with the operation mode in response to the user gesture.
  • a method for controlling a terminal includes detecting a user gesture within proximity of the terminal, determining a type of the user gesture, and performing a control operation of the terminal according to the user gesture.
  • a terminal in accordance with another aspect of the present invention, includes a sensing unit which detects a user gesture within proximity of the terminal, a control unit which determines a type of the user gesture and performs a control operation of the terminal according to the user gesture, and a display unit which displays a result of the control operation of the control unit.
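The aspects above share a common flow: a sensing unit detects a spatial gesture, a control unit analyzes it and selects a control operation for the currently running application, and a display or output unit presents the result. A minimal, hypothetical sketch of that flow follows; the class and function names (SensingUnit, ControlUnit, music_player_handler) are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the detect -> analyze -> execute flow described above.
# All names and values are hypothetical, not taken from the patent.
from dataclasses import dataclass

@dataclass
class GestureEvent:
    direction: str      # e.g. "left", "right", "up", "down"
    speed: float        # relative movement speed
    distance: float     # movement distance in front of the screen

class SensingUnit:
    """Stands in for the camera / proximity sensor that detects a user gesture."""
    def detect(self) -> GestureEvent:
        return GestureEvent(direction="right", speed=1.4, distance=0.12)

class ControlUnit:
    """Determines the control operation for the currently running application."""
    def __init__(self, app_handlers):
        self.app_handlers = app_handlers   # maps app name -> handler callable

    def handle(self, app: str, event: GestureEvent):
        handler = self.app_handlers.get(app)
        return handler(event) if handler else None

def music_player_handler(event: GestureEvent) -> str:
    # e.g. a rightward sweep plays the next song in the playlist
    return "next_track" if event.direction == "right" else "previous_track"

control = ControlUnit({"music_player": music_player_handler})
print(control.handle("music_player", SensingUnit().detect()))  # -> next_track
```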
  • the present invention is applicable to all types of devices capable of recognizing spatial gestures, as well as electronic terminals including a smartphone, a portable terminal, a mobile terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a laptop computer, a tablet computer, a Personal Computer, a WiBro terminal, a digital camera, a portable game console, an MP3 player, a handheld e-book reader, a Global Positioning System (GPS) navigation device, a High Definition Television (HDTV), and the like.
  • FIG. 1 is a flowchart illustrating a method for controlling a terminal based on spatial gestures according to an exemplary embodiment of the present invention.
  • the terminal control method according to an exemplary embodiment of the present invention comprises the following steps.
  • the terminal detects a spatial gesture.
  • the spatial gesture can be made with an input means such as the user's hand (including fingers) or the like.
  • the terminal is capable of detecting the spatial gesture by means of (e.g., using) a camera, a proximity sensor, and/or the like.
  • the terminal is also capable of detecting multiple spatial gestures simultaneously or sequentially.
  • the terminal may be capable of detecting a touch gesture along with the spatial gesture.
  • the terminal may be capable of detecting the spatial gesture and the touch gesture simultaneously or sequentially.
  • the terminal may be capable of detecting the spatial gesture in a state in which the touch gesture is maintained.
  • the terminal analyzes the detected spatial gesture at step S120.
  • the terminal may be capable of detecting the spatial gesture by analyzing the picture taken by the camera and/or by means of (e.g., using) the proximity sensor.
  • the terminal is capable of analyzing a motion oriented in at least one direction or drawing a certain shape (which may be referred to as a sweep), a motion speed, a motion length, a shape of the input means (e.g., hand or finger shape), at least one coordinate associated with the input means' approach, an approaching speed, an approaching distance, and the like.
  • spatial gestures may include a tap (e.g., bending forward and then stretching the hand), a sweep (e.g., sweeping the hand up/down/left/right), a wave (e.g., a horizontal wave and/or a vertical wave), a circle (e.g., a circle drawn in a clockwise or a counter-clockwise direction), a grasp (e.g., a clench of the palm to form a fist), a pinch (e.g., a squeeze between a thumb and an index finger), an approach, a trace (e.g., tracking a hand motion in real time), a cover (e.g., covering the terminal with a hand), a push (e.g., a push of a hand toward the terminal), a two hand push, a two hand open, a two hand circle, a two hand put, a touch and sweep, and the like.
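As a rough illustration of the analysis step described above (direction, speed, and length derived from sensed coordinates), the following sketch computes those features from a short sequence of timestamped hand positions; the sample values and units are assumptions for illustration only.

```python
# Hypothetical sketch of extracting the motion features mentioned above
# (direction, speed, length) from a sequence of sensed hand coordinates.
import math

def analyze_motion(samples):
    """samples: list of (t, x, y) tuples from the camera / proximity sensor."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    speed = length / (t1 - t0) if t1 > t0 else 0.0
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return {"direction": direction, "speed": speed, "length": length}

print(analyze_motion([(0.0, 10, 50), (0.2, 80, 55), (0.4, 160, 60)]))
```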
  • FIGS. 2 to 25 are diagrams illustrating spatial gestures defined for use in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the spatial gesture may be a touchless tap.
  • the touchless tap may correspond to a user moving a hand closer to the screen.
  • the touchless tap may correspond to the user moving the hand closer to a specific desired portion of the screen.
  • the terminal may detect the touchless tap by determining that the user's hand is moved closer to the screen. For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless tap.
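A simple way to read the touchless-tap description above is as a decrease in the hand's distance from the screen between successive samples. The sketch below assumes such distance (z) readings are available from the proximity sensor or camera; the threshold is an illustrative assumption.

```python
# Hypothetical sketch: a touchless tap detected as the hand's distance to the
# screen (z) decreasing past a threshold between successive samples.
def is_touchless_tap(z_samples, approach_threshold=0.05):
    """z_samples: distances of the hand from the screen over time (metres)."""
    return len(z_samples) >= 2 and (z_samples[0] - z_samples[-1]) > approach_threshold

print(is_touchless_tap([0.20, 0.15, 0.08]))  # True: the hand moved closer to the screen
```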
  • the spatial gesture may be a touchless sweep.
  • the touchless sweep may be a sweep motion in at least one of the up, down, left, and right directions.
  • the touchless sweep may correspond to a user's hand being disposed such that a surface of the palm of the user's hand is substantially perpendicular to the screen surface and moved in at least one of the up, down, left, and right directions.
  • the terminal may detect the touchless sweep by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface) and determining that the user's hand is moved relative to (e.g., across) the screen.
  • the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless sweep.
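The touchless sweep, as described above, combines a hand-pose check (palm roughly perpendicular to the screen) with a dominant lateral change in the detected coordinates. A hypothetical sketch, with an assumed pose flag and travel threshold:

```python
# Hypothetical sketch: a touchless sweep detected from a palm held roughly
# perpendicular to the screen and moved laterally; the direction comes from
# the dominant change in the detected hand coordinates.
def classify_sweep(positions, palm_perpendicular, min_travel=40):
    """positions: list of (x, y) hand coordinates; palm_perpendicular: bool."""
    if not palm_perpendicular or len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return None                        # too little movement to count as a sweep
    if abs(dx) >= abs(dy):
        return "sweep_right" if dx > 0 else "sweep_left"
    return "sweep_down" if dy > 0 else "sweep_up"

print(classify_sweep([(10, 100), (60, 102), (150, 105)], palm_perpendicular=True))
```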
  • the spatial gesture may be a touchless horizontal wave.
  • the touchless horizontal wave may be a motion of the user's hand moving across the screen from one side of the terminal to an opposite side of the terminal.
  • the motion of the user's hand may be from the left side of the terminal screen to the right side of the screen.
  • the motion of the user's hand may be repeated in the touchless horizontal wave.
  • the horizontal wave may include the user's hand being outstretched such that the user's palm is facing (e.g., substantially parallel to) the screen.
  • the terminal may detect the touchless horizontal wave by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to (e.g., from the left side to the right side, from the right side to the left side, or the like) the screen. For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless horizontal wave.
  • the spatial gesture may be a touchless vertical wave.
  • the touchless vertical wave may be a motion of the user's hand moving across the screen from one side of the terminal to an opposite side of the terminal.
  • the motion of the user's hand may be from the top side of the terminal screen to the bottom side of the screen.
  • the motion of the user's hand may be repeated in the touchless vertical wave.
  • the vertical wave may include the user's hand being outstretched such that the user's palm is facing (e.g., substantially parallel to) the screen when the hand is between the motion to the top side of the terminal and the motion to the bottom side of the terminal.
  • the terminal may detect the touchless vertical wave by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand (e.g., the user's outstretched fingers) is moved relative to (e.g., from the top side to the bottom side, from the bottom side to the top side, or the like) the screen.
  • the terminal may compare coordinates of a detected user's hand (e.g., the user's outstretched fingers) and determine that a change in the coordinates of the detected user's hand corresponds to the touchless vertical wave.
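Both wave gestures above amount to repeated reversals of the hand's movement along one axis while the palm faces the screen. A minimal sketch of that reversal-counting reading, with illustrative sample values:

```python
# Hypothetical sketch: a touchless wave detected as repeated reversals of the
# hand's horizontal (or vertical) position while the palm faces the screen.
def is_wave(coords, min_reversals=2):
    """coords: 1-D positions of the hand along one axis over time."""
    deltas = [b - a for a, b in zip(coords, coords[1:]) if b != a]
    reversals = sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)
    return reversals >= min_reversals

print(is_wave([10, 60, 110, 70, 20, 65, 115]))  # True: back-and-forth motion
```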
  • the spatial gesture may be a touchless cover.
  • the touchless cover may correspond to the user's hand being moved so as to cover the screen.
  • the touchless cover may include the user's hand having the palm facing the screen so as to be substantially parallel to the screen and substantially covering the screen.
  • the user's fingers may be outstretched during the touchless cover.
  • the terminal may detect the touchless cover by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., so as to cover substantially the entire screen).
  • the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless cover.
  • the touchless cover may include the user moving the hand closer to the screen and so as to cover substantially the entire screen.
  • the touchless cover may include the user moving the hand substantially along an original plane so as to cover substantially the entire screen.
  • the spatial gesture may be a touchless palm push.
  • the touchless palm push may include the user's hand being moved closer to the screen.
  • the touchless palm push may include the user's fingers being outstretched and the palm facing the screen (e.g., such that the surface of the palm is substantially parallel to the screen).
  • the terminal may detect the touchless palm push by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that the user's palm is moved closer to the screen).
  • the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand (e.g., the user's palm) corresponds to the touchless palm push.
  • the spatial gesture may be a touchless circling in a clockwise direction.
  • the touchless circling may include moving a portion of the user's hand in a clockwise direction.
  • the touchless circling may include moving the palm in a clockwise direction relative to the screen.
  • the touchless circling may include moving at least one finger in a clockwise direction relative to the screen.
  • the terminal may detect the touchless circling by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that at least a portion of the user's hand is moved in a clockwise direction).
  • the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, and/or the like) corresponds to the touchless circling in a clockwise direction.
  • the spatial gesture may be a touchless circling in a counter-clockwise direction.
  • the touchless circling may include moving a portion of the user's hand in a counter-clockwise direction.
  • the touchless circling may include moving the palm in a counter-clockwise direction relative to the screen.
  • the touchless circling may include moving at least one finger in a counter-clockwise direction relative to the screen.
  • the terminal may detect the touchless circling by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that at least a portion of the user's hand is moved in a counter-clockwise direction).
  • the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, and/or the like) corresponds to the touchless circling in a counter-clockwise direction.
  • the spatial gesture may be a touchless grab.
  • the touchless grab may include moving a user's hand such that the user's fingers are moved from a substantially outstretched position to a position in which the fingers are brought closer to the user's thumb, or such that the fingers are brought towards a clenched fist.
  • the terminal may detect the touchless grab by determining that the user's hand is disposed in a predefined configuration (e.g., such that the user's fingers are substantially outstretched relative to the user's palm and/or thumb) and determining that the user's hand is moved relative to the screen (e.g., such that the user's fingers are brought towards at least one of the user's thumb and palm so as to form or begin to form, for example, a clenched fist).
  • the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, and/or the like) corresponds to the touchless grab.
  • the spatial gesture may be a touchless pinch.
  • the touchless pinch may include moving a user's hand from a position in which at least one finger is outstretched to a position in which at least one outstretched finger is substantially touching the thumb.
  • the touchless pinch may include moving an outstretched finger and outstretched thumb towards each other.
  • the terminal may detect the touchless pinch by determining that the user's hand is disposed in a predefined configuration (e.g., such that at least one of the user's fingers is substantially outstretched relative to the user's palm and/or thumb) and determining that the user's hand is moved relative to the screen (e.g., such that at least one outstretched finger is brought towards the user's thumb).
  • the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, the thumb, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, the thumb, and/or the like) corresponds to the touchless pinch.
  • the spatial gesture may be a touchless approach.
  • the touchless approach may include the user's hand being moved closer to the screen.
  • the touchless approach may include the user's fingers being outstretched and the palm facing the screen (e.g., such that the surface of the palm is substantially parallel to the screen).
  • the terminal may detect the touchless approach by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that the user's hand is moved closer to the screen).
  • the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand (e.g., the user's palm) corresponds to a touchless approach.
  • the terminal may distinguish between a touchless palm push and a touchless approach based on the distance, along an axis perpendicular to the screen, over which the hand moves toward the screen.
  • the touchless approach may include the hand being initially disposed at a distance farther from the screen than for the touchless palm push.
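The distinction above can be pictured as a simple threshold on the hand's starting distance from the screen: both gestures bring the palm closer, but the approach begins farther away. The sketch below uses assumed distances and an assumed threshold purely for illustration.

```python
# Hypothetical sketch of the distinction described above: both gestures move the
# palm toward the screen, but an approach starts farther away than a palm push.
def classify_approach_or_push(z_start, z_end, push_start_max=0.10):
    """z values are distances from the screen; the threshold is illustrative."""
    if z_end >= z_start:
        return None                        # the hand did not move closer to the screen
    return "palm_push" if z_start <= push_start_max else "approach"

print(classify_approach_or_push(0.30, 0.12))  # -> approach (started far away)
print(classify_approach_or_push(0.08, 0.03))  # -> palm_push (started close)
```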
  • the spatial gesture may be a touchless tracking.
  • the touchless tracking may include a user moving an outstretched finger across the screen.
  • the touchless tracking may include the user moving the outstretched finger along a path across the screen.
  • the terminal may detect the touchless tracking by determining that the user's hand is disposed in a predefined configuration (e.g., such that at least one of the user's fingers is substantially outstretched relative to the user's palm and/or thumb) and determining that the user's hand is moved relative to the screen (e.g., such that at least one outstretched finger is moved across the screen).
  • the terminal may compare coordinates of a detected user's hand (e.g., the outstretched finger) and determine that a change in the coordinates of the detected user's hand (e.g., the outstretched finger) corresponds to the touchless tracking. For example, for a touchless tracking gesture, the distance between the outstretched finger and the surface of the screen may remain substantially the same throughout the tracking.
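Touchless tracking, as described, follows the finger's x/y path while its distance from the screen stays roughly constant. A hypothetical sketch of that check, with assumed units and tolerance:

```python
# Hypothetical sketch: touchless tracking follows the outstretched finger's
# x/y path while its distance to the screen stays roughly constant.
def track_finger(samples, z_tolerance=0.02):
    """samples: list of (x, y, z) finger coordinates; returns the traced path."""
    zs = [z for _, _, z in samples]
    if max(zs) - min(zs) > z_tolerance:
        return None                        # distance changed too much; not a tracking gesture
    return [(x, y) for x, y, _ in samples]

print(track_finger([(10, 10, 0.05), (40, 30, 0.05), (90, 60, 0.06)]))
```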
  • the spatial gesture may be a touchless palm push of two hands.
  • the touchless palm push of two hands may include the two hands being moved closer to the screen.
  • the touchless palm push of two hands may include the fingers on two hands being outstretched and the hands' respective palms facing the screen (e.g., such that the surface of the palm is substantially parallel to the screen).
  • the terminal may detect the touchless palm push of two hands by determining that the two hands are respectively disposed in a predefined configuration (e.g., such that the surface of each palm is substantially parallel to the screen surface) and determining that both of the two hands are moved relative to the screen (e.g., such that the respective palms are moved closer to the screen).
  • the terminal may compare coordinates of the two detected hands and determine that a change in the coordinates of the two detected hands (e.g., the respective palms) corresponds to the touchless palm push of two hands.
  • the spatial gesture may be a touchless pinch of two hands.
  • the touchless pinch of two hands may include at least one of two hands being moved relative to the screen so as to move the two hands closer together.
  • the touchless pinch of two hands may include at least one finger or thumb on each of the two hands being outstretched and the outstretched at least one finger or thumb of each of the two hands being moved closer together.
  • the touchless pinch of two hands may include the outstretched at least one finger or thumb of each of the two hands being moved so as to be in touching relation.
  • the terminal may detect the touchless pinch of two hands by determining that the two hands are respectively disposed in a predefined configuration (e.g., such that each of the two hands has at least one finger or thumb outstretched) and determining that at least one of the two hands is moved relative to the screen towards the other hand (e.g., such that the outstretched at least one finger or thumb on each hand is moved towards each other, and particularly, such that the outstretched at least one finger or thumb on each hand is moved so as to be in touching relation).
  • the terminal may compare coordinates of the two detected hands and determine that a change in the coordinates of the detected at least one of the two hands (e.g., the at least one outstretched finger or thumb) corresponds to the touchless pinch of two hands.
  • the spatial gesture may be a touchless parallel rotate.
  • the touchless parallel rotate may include two hands disposed at a distance from one another and moved around an axis that is perpendicular to the screen.
  • the touchless parallel rotate may correspond to two hands being disposed such that a surface of the palm of the two hands is substantially perpendicular to the screen surface and disposed such that the palm of each hand is parallel to the other, and the hands moved around an axis that is perpendicular to the screen.
  • the terminal may detect the touchless parallel rotate by determining that each of the two hands is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface), determining that the palms of the two hands are substantially parallel to each other, and determining that the two hands are moved relative to the screen (e.g., around an axis that is perpendicular to (and extends from) the screen). For example, the terminal may compare coordinates of the two detected hands and determine that a change in the coordinates of the two detected hands corresponds to the touchless parallel rotate.
  • the spatial gesture may be a touchless clap.
  • the touchless clap may include two hands disposed at a distance from one another (e.g., such that the hands' respective palms are facing each other) and moved towards each other (e.g., along an axis that is substantially parallel to the screen).
  • the touchless clap may correspond to the two hands being disposed such that a surface of the respective palms of the two hands is substantially perpendicular to the screen surface and disposed such that the palm of each hand is parallel to the other (e.g., the palms are opposing or facing each other), and the hands are moved along an axis that is substantially parallel to the screen.
  • the terminal may detect the touchless clap by determining that each of the two hands is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface), determining that the palm of each of the two hands opposes or faces the other, and determining that the two hands are moved relative to the screen (e.g., along an axis that is substantially parallel to the screen). For example, the terminal may compare coordinates of the two detected hands and determine that a change in the relative coordinates of the two detected hands corresponds to the touchless clap.
  • the spatial gesture may be a touch with (e.g., in conjunction with) a touchless sweep.
  • the touch with the touchless sweep may include a user touching the screen with an input means (e.g., a stylus, a finger, a thumb, and/or the like) and a sweep motion in at least one of the up, down, left, and right directions.
  • the touch with the touchless sweep may correspond to a user touching the screen with an input means and a hand being disposed such that a surface of the palm of the hand is substantially perpendicular to the screen surface and moved in at least one of the up, down, left, and right directions.
  • the terminal may detect a touch (e.g., by a change in capacitance in the screen), and detect the touchless sweep by determining that the hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface) and determining that the hand is moved relative to (e.g., across at least a portion of) the screen.
  • the terminal may compare coordinates of a detected hand and determine that a change in the coordinates of the detected hand in conjunction with a touch event corresponds to the touch with touchless sweep.
  • the spatial gesture may be a finger hovering.
  • the finger hovering may correspond to a user holding a finger (or other input means) over substantially a same position relative to the screen for at least a predefined period of time.
  • the finger hovering may include the finger being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the input means and the screen surface.
  • the terminal may detect the finger hovering by determining that the input means is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected input means over a predefined period of time and determine that a lack of change in the coordinates of the detected input means corresponds to the finger hovering.
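The hovering description above reduces to a dwell check: the input means stays within a small coordinate range for at least a predefined period. A minimal sketch under that reading, with illustrative jitter and dwell thresholds:

```python
# Hypothetical sketch: finger hovering detected as a lack of coordinate change
# over at least a predefined period of time, with the finger held off the screen.
def is_hover(samples, dwell_time=0.5, max_jitter=5):
    """samples: list of (t, x, y) finger coordinates; units are illustrative."""
    if samples[-1][0] - samples[0][0] < dwell_time:
        return False                       # not held in place long enough
    xs = [x for _, x, _ in samples]
    ys = [y for _, _, y in samples]
    return (max(xs) - min(xs)) <= max_jitter and (max(ys) - min(ys)) <= max_jitter

print(is_hover([(0.0, 100, 200), (0.3, 102, 201), (0.7, 101, 199)]))  # True
```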
  • the spatial gesture may be a palm.
  • the palm may correspond to a user holding an outstretched hand (e.g., having fingers outstretched) over substantially a same position relative to the screen for at least a predefined period of time.
  • the palm may include the palm being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the palm and the screen surface.
  • the terminal may detect the palm by determining that the palm is held in substantially a constant position away from the screen for at least a predefined period of time.
  • the terminal may compare coordinates of a detected palm over a predefined period of time and determine that a lack of change in the coordinates of the detected palm corresponds to a palm.
  • the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in an outstretched configuration over the predefined period of time corresponds to the palm.
  • the spatial gesture may be a fist.
  • the fist may correspond to a user holding a clenched hand (e.g., the fingers and thumb clenched in a fist) over substantially a same position relative to the screen for at least a predefined period of time.
  • the fist may include the hand (e.g., the fist) being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the fist and the screen surface.
  • the terminal may detect the fist by determining that the hand is in a predetermined configuration (e.g., the hand forms a clenched fist), and determining that the fist is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected fist over a predefined period of time and determine that a lack of change in the coordinates of the detected fist corresponds to a fist. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in a fist configuration over the predefined period of time corresponds to the fist.
  • the spatial gesture may be a thumb-up.
  • the thumb-up may correspond to a user holding a hand having an outstretched thumb and fingers clenched towards the palm over substantially a same position relative to the screen for at least a predefined period of time.
  • the thumb-up may include the hand (e.g., the outstretched thumb) being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the hand and the screen surface.
  • the terminal may detect the thumb-up by determining that the hand is in a predetermined configuration (e.g., the hand forms a thumb-up sign), and determining that the thumb-up is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected thumb-up over a predefined period of time and determine that a lack of change in the coordinates of the detected thumb-up corresponds to the thumb-up. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the thumb-up configuration over the predefined period of time corresponds to the thumb-up.
  • the spatial gesture may be a pointing gesture.
  • the pointing gesture may correspond to a user holding a hand having an input means (e.g., an outstretched finger, thumb, or the like) over substantially a same position relative to the screen for at least a predefined period of time.
  • the pointing gesture may include the hand (e.g., the outstretched finger) being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the outstretched finger and the screen surface.
  • the terminal may detect the pointing gesture by determining that the hand is in a predetermined configuration (e.g., the hand forms a pointing gesture in which at least one finger is outstretched), and determining that the pointing gesture is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected pointing gesture over a predefined period of time and determine that a lack of change in the coordinates of the detected pointing gesture corresponds to a pointing gesture. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the pointing gesture configuration over the predefined period of time corresponds to the pointing gesture.
  • the spatial gesture may be an okay gesture.
  • the okay gesture may correspond to a user holding a hand having an index finger and thumb forming a circle and having the remaining fingers outstretched over substantially a same position relative to the screen for at least a predefined period of time.
  • the okay gesture may include the hand (e.g., the hand having an index finger and thumb forming a circle and having the remaining fingers outstretched) being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the hand and the screen surface.
  • the terminal may detect the okay gesture by determining that the hand is in a predetermined configuration (e.g., the hand having an index finger and thumb forming a circle and having the remaining fingers outstretched), and determining that the okay gesture is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected okay gesture over a predefined period of time and determine that a lack of change in the coordinates of the detected okay gesture corresponds to the okay gesture. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the okay gesture configuration over the predefined period of time corresponds to the okay gesture.
  • the spatial gesture may be a numbering gesture.
  • the numbering gesture may correspond to a user holding a hand having a predefined number of fingers outstretched over substantially a same position relative to the screen for at least a predefined period of time.
  • the numbering gesture may include the hand (e.g., the thumb and two fingers being outstretched) being held in a substantially constant position relative to the screen, at a distance from the screen such that no contact is made between the hand and the screen surface.
  • the terminal may detect the numbering gesture by determining that the hand is in a predetermined configuration (e.g., the thumb and two fingers being outstretched), and determining that the numbering gesture is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected numbering gesture over a predefined period of time and determine that a lack of change in the coordinates of the detected numbering gesture corresponds to the numbering gesture. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the numbering gesture configuration over the predefined period of time corresponds to the numbering gesture.
  • the numbering gesture may include a plurality of gestures each of which corresponds to a different combination of fingers and thumb being outstretched.
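The numbering gesture above can be pictured as counting which digits the detected hand holds outstretched. A hypothetical sketch, assuming per-digit flags are available from the hand-shape analysis:

```python
# Hypothetical sketch: a numbering gesture interpreted by counting how many
# fingers (and the thumb) the detected hand holds outstretched.
def numbering_gesture(outstretched):
    """outstretched: per-digit flags, e.g. {'thumb': True, 'index': True, ...}."""
    count = sum(1 for extended in outstretched.values() if extended)
    return count if count > 0 else None    # no digits outstretched -> not a numbering gesture

hand = {"thumb": True, "index": True, "middle": True, "ring": False, "little": False}
print(numbering_gesture(hand))  # -> 3
```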
  • the terminal determines a certain action corresponding to the analysis result at step S130.
  • the terminal is capable of determining the action corresponding to the analyzed spatial gesture.
  • the terminal is capable of determining the operation mode of the terminal.
  • the operation mode of the terminal may be determined according to the detected spatial gesture.
  • the operation mode of the terminal may include any of camera shooting mode, multimedia playback mode, text messaging mode, voice call mode, e-book-reading mode, navigation mode, calendar display mode, email messaging mode, video game play mode, and the like.
  • the terminal is also capable of determining a certain action corresponding to the spatial gesture in association with the operation mode.
  • for example, if the terminal is in the multimedia playback mode and the detected spatial gesture corresponds to a sweeping gesture in a certain direction, the terminal is capable of regarding this gesture as a command for playback of the next song, in the corresponding direction, on a playlist.
  • for example, if the terminal is in the e-book-reading mode and the detected spatial gesture corresponds to a sweeping gesture in a certain direction, the terminal is capable of regarding this gesture as a command for displaying the next page of the e-book in the corresponding direction.
  • a set of spatial interactions as shown in FIG. 26 can be used.
  • FIG. 26 is a diagram illustrating an exemplary spatial interaction set for use in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of defining the titles of the spatial gestures, describing detailed actions corresponding to the spatial gestures, and storing metaphors of the corresponding spatial gestures.
  • the metaphors indicate actions corresponding to the spatial gestures.
  • the terminal is capable of analyzing the detected spatial gesture and retrieving the action mapped to the spatial gesture from the interaction set based on the analysis result.
  • the defined spatial gestures and associated action mapped to the defined spatial gestures may be stored as a Look-Up Table (LUT).
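One way to picture the Look-Up Table (LUT) mentioned above is a mapping from (operation mode, gesture) pairs to actions. The entries below are illustrative assumptions consistent with the playback and e-book examples given earlier, not the patent's actual table.

```python
# Hypothetical sketch of the look-up table idea: (operation mode, gesture)
# pairs map to the action the terminal should perform. Entries are illustrative.
GESTURE_LUT = {
    ("multimedia_playback", "sweep_right"): "play_next_song",
    ("multimedia_playback", "sweep_left"):  "play_previous_song",
    ("ebook_reading",       "sweep_right"): "show_next_page",
    ("ebook_reading",       "sweep_left"):  "show_previous_page",
}

def lookup_action(mode, gesture):
    return GESTURE_LUT.get((mode, gesture))

print(lookup_action("ebook_reading", "sweep_right"))  # -> show_next_page
```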
  • the terminal performs the action at step S140. For example, based on detection and determination of a spatial gesture, the terminal may determine an operation or action corresponding to the spatial gesture and perform the operation or action corresponding to the spatial gesture.
  • the terminal is capable of performing the determined action.
  • FIG. 27 is a diagram illustrating exemplary screen displays presented in association with a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal detects an input image taken by the camera and analyzes the image.
  • the terminal is capable of analyzing at least one of direction, speed, and movement distance of an input means based on the input image.
  • the terminal is capable of detecting the movement of the user's hand to the left/right direction.
  • the terminal is capable of performing an action corresponding to the analysis result.
  • the terminal is capable of scrolling items on the screen of the display unit in the left/right direction.
  • the terminal is capable of scrolling the items at a speed in synchronization with the movement speed of the input means.
  • the terminal is also capable of turning one or more pages (or a chapter) of an e-book in synchronization with the movement speed and distance of the input means.
  • the terminal is also capable of turning the photos of an album application or playing the song listed next to the currently playing song in a playlist, in accordance with the spatial gesture.
  • FIG. 27 illustrates the turning of the page of a document or e-book based on the detection of a touchless tracking from one side of the screen to an opposing side of the screen, the detection of a touchless sweep from one side of the screen to the opposing side of the screen having a slow speed, and the detection of a touchless sweep from one side of the screen to the opposing side of the screen having a fast speed.
  • the terminal may turn a single page, a few pages, or several (e.g., many) pages.
  • the number of pages turned may correspond to how pronounced the gesture appears to the terminal.
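The FIG. 27 behaviour above suggests that the number of pages turned scales with how pronounced the gesture is (a traced finger turns one page, a slow sweep a few, a fast sweep many). A hedged sketch of that mapping follows; the thresholds and page counts are invented for illustration.

```python
# Hypothetical sketch of FIG. 27's behaviour: the number of pages turned grows
# with how pronounced the gesture is (tracking < slow sweep < fast sweep).
def pages_to_turn(gesture, speed=0.0, fast_threshold=1.0):
    if gesture == "tracking":
        return 1                           # tracing a finger turns a single page
    if gesture == "sweep":
        return 5 if speed < fast_threshold else 20
    return 0

print(pages_to_turn("sweep", speed=1.6))   # -> 20 (a fast sweep turns many pages)
```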
  • FIG. 28 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting a touch gesture and a spatial gesture simultaneously or sequentially.
  • the terminal is also capable of detecting a spatial gesture in the state of tracing the touch gesture.
  • the terminal is capable of detecting the spatial gesture made in a certain direction in the state that a touch gesture made at a certain position is maintained.
  • the terminal is capable of selecting an item at the position at which the touch gesture is made (e.g., detected) on the screen and moving the corresponding selected item in the left/right direction on the screen in synchronization with the movement direction of the spatial gesture.
  • the selected item may be an icon, a widget, an image arranged on the background, or the like.
  • the terminal is also capable of scrolling the items in the left/right direction, with the exception of the selected item, in synchronization with the movement direction of the spatial gesture.
  • the items can be any type of User Interface (UI) item displayed on the background screen, such as an icon or a widget.
  • the selected item corresponding to the touch gesture may remain in a static display position and other items displayed on the screen may be moved or changed according to the detected spatial gesture (e.g., a touchless sweep).
  • the terminal is also capable of placing a call or sending a text message or a file to a recipient selected by a touch gesture according to the spatial gesture.
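  • As a non-limiting illustration of the FIG. 28 behaviour in which a touched item remains static while the remaining items scroll with the touchless sweep, the following sketch may be considered; the list representation and function name are hypothetical.

```python
# Illustrative sketch only of the FIG. 28 behaviour: a maintained touch selects an
# item, and a touchless sweep scrolls the remaining items around it.

def apply_touch_and_sweep(items, touched_index, sweep_direction):
    """Keep the touched item in its static position and rotate every other
    item one slot in the sweep direction, returning the new ordering."""
    selected = items[touched_index]
    others = [item for i, item in enumerate(items) if i != touched_index]
    shift = 1 if sweep_direction == "left" else -1
    others = others[shift:] + others[:shift]
    others.insert(touched_index, selected)   # re-insert at the original position
    return others

icons = ["mail", "camera", "music", "clock"]
print(apply_touch_and_sweep(icons, touched_index=1, sweep_direction="left"))
# ['music', 'camera', 'clock', 'mail'] - 'camera' stays put while the rest scroll left
```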
  • FIG. 29 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting an input image taken by the camera and analyzing the input image.
  • the terminal is capable of analyzing the input image to detect the spatial gesture corresponding to the fist.
  • the terminal is capable of performing an action corresponding to the analysis result. For example, upon detection of the spatial gesture corresponding to the fist, the terminal may perform a corresponding predefined action.
  • the terminal is capable of capturing and storing the content displayed on the screen or the entire content including a part not displayed on the screen.
  • the terminal is also capable of storing the captured content in the form of an image.
  • the terminal is capable of processing the image to a format having a predetermined size, resolution, and aspect ratio in synchronization with the movement speed, the degree to which the fingers are unfolded before the fist is formed, the degree to which the fingers are folded after the fist is formed, and the like.
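  • A minimal sketch of the FIG. 29 behaviour, in which a detected fist triggers a screen capture whose output is scaled according to the analysis result, is given below; the capture routine and scaling rule are assumptions for illustration.

```python
# Hedged sketch: a detected fist (grab) gesture triggers a screen capture whose
# output size scales with how far the fingers were folded.
# capture_visible_content() and the scaling rule are assumed for illustration.

def capture_visible_content():
    # Placeholder standing in for the terminal's real capture routine.
    return {"width": 1080, "height": 1920}

def capture_on_fist(fold_level):
    """fold_level in [0.0, 1.0]: 1.0 = fully clenched fist.
    Returns an image descriptor scaled in proportion to the fold level."""
    image = capture_visible_content()
    scale = 0.5 + 0.5 * fold_level          # assumed mapping: tighter fist -> larger capture
    return {"width": int(image["width"] * scale),
            "height": int(image["height"] * scale),
            "format": "PNG"}

print(capture_on_fist(1.0))   # {'width': 1080, 'height': 1920, 'format': 'PNG'}
print(capture_on_fist(0.4))   # {'width': 756, 'height': 1344, 'format': 'PNG'}
```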
  • FIG. 30 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting an input image taken by the camera and analyzing the input image.
  • the terminal is capable of analyzing the input image to detect the spatial gesture oriented to a certain direction.
  • the terminal is capable of performing an action of navigating web browsers according to the spatial gesture.
  • the terminal is capable of performing an action of navigating the tabs.
  • the terminal is capable of performing an action of navigating the applications.
  • the terminal is also capable of displaying a screen on which the web browser is moved to the left/right/up/down side or the top/bottom side.
  • the terminal is also capable of performing an action of refreshing, expanding/contracting, copying, navigating, printing, or turning to the previous/next page of the web browser displayed on the screen, and the like.
  • FIG. 31 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting an input image taken by the camera and analyzing the shape of the input means.
  • the terminal is capable of determining the shape of the hand.
  • the terminal is capable of determining the number of unfolded fingers, the shape of the fingers, whether the palm or the back of the hand is shown, and the positions and sizes of the fingers.
  • the terminal is capable of performing an action according to the shape of the input means.
  • the terminal is capable of determining the shooting time of the camera based on the number of unfolded fingers.
  • the terminal is capable of controlling a camera function such that the camera captures an image after or during the time corresponding to the number of unfolded fingers.
  • the terminal is also capable of controlling the camera function such that the camera captures an image immediately according to the shape of the hand.
  • the terminal is also capable of controlling the camera to autofocus according to the shape of the hand.
  • the camera may be configured to perform autofocusing when the thumb up gesture is detected.
  • the terminal is also capable of applying a certain visual effect according to the detected spatial gesture.
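  • The mapping from the detected hand shape to the camera control described for FIG. 31 may be sketched as follows; the one-second-per-finger rule and the shape labels are illustrative assumptions, not the claimed implementation.

```python
# Illustrative sketch: the number of unfolded fingers selects a capture delay,
# and a thumb-up triggers autofocus. The hand-shape classifier is assumed to
# exist elsewhere; only the mapping logic is shown.
import time

def control_camera(hand_shape, unfolded_fingers=0):
    if hand_shape == "thumb_up":
        print("autofocusing")
        return
    if hand_shape == "open_hand":
        # Assumed rule: one second of delay per unfolded finger.
        delay = unfolded_fingers * 1.0
        print(f"capturing after {delay:.0f} s")
        time.sleep(delay)
        print("image captured")
    elif hand_shape == "fist":
        print("capturing immediately")

control_camera("open_hand", unfolded_fingers=3)   # waits about 3 s, then captures
control_camera("thumb_up")                        # autofocuses
```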
  • FIG. 32 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting the movement in a certain direction or the gesture of the input means of approaching the terminal.
  • the terminal is capable of detecting a finger approaching the terminal.
  • the terminal is capable of analyzing the movement direction and speed, coordinates of the approaching finger, and distance.
  • the terminal is capable of performing an action corresponding to the analysis.
  • the terminal is capable of displaying a water surface moving according to a certain movement direction and speed of the input means.
  • the terminal may also display a screen showing a water drop falling at a certain position.
  • the terminal is capable of displaying screen images giving various emotional effects.
  • the terminal is capable of displaying 2-Dimensional (2D) or 3-Dimensional (3D) screen images together with vibration and sound effects output in synchronization with the change of the screen image.
  • the terminal may display a screen showing a wave, a ripple, or a series of waves traveling across the screen.
  • the terminal may display a screen showing a ripple effect associated with an object or droplet of water being dropped into a liquid.
  • FIG. 33 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting a movement of an object approaching the terminal or the gesture of an input means close to the terminal and analyzing the movement or gesture, and performing an action corresponding to the analysis.
  • the terminal is capable of displaying simple information in association with the operation mode.
  • the terminal is capable of displaying weather information, a time, a battery charge state, and an alarm window in a screen lock mode, a power saving mode, or the like.
  • FIG. 34 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of detecting a movement and analyzing the direction of the movement.
  • the terminal is also capable of performing an action in association with the operation mode.
  • the terminal is capable of determining whether the terminal is in a driving mode based on data acquired from a velocity/acceleration sensor, a gyro sensor, and the like. If a movement or spatial gesture is detected when the terminal is in the driving mode, the direction of the movement or spatial gesture is determined and a corresponding action, such as, for example, receiving an inbound call, rejecting the call, or rejecting the call while transmitting a notification message informing the caller that the user is at the wheel, may be performed.
  • spatial gestures may be used as a hands-free means for controlling the terminal when the terminal is determined to be in a driving mode.
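  • A non-limiting sketch of the FIG. 34 behaviour, in which a driving mode inferred from the velocity/acceleration data enables hands-free call handling by spatial gesture, is given below; the speed threshold and gesture names are assumptions.

```python
# Hedged sketch: driving mode inferred from the velocity/acceleration sensor,
# then an incoming call handled according to the detected spatial gesture.
# DRIVING_SPEED_THRESHOLD and the gesture names are assumptions.

DRIVING_SPEED_THRESHOLD = 20.0   # km/h (assumed)

def is_driving(speed_kmh):
    return speed_kmh >= DRIVING_SPEED_THRESHOLD

def handle_incoming_call(speed_kmh, gesture):
    if not is_driving(speed_kmh):
        return "normal call UI"
    if gesture == "sweep_right":
        return "call accepted"
    if gesture == "sweep_left":
        return "call rejected"
    if gesture == "cover":
        return "call rejected, 'I am driving' message sent to caller"
    return "no action"

print(handle_incoming_call(60.0, "cover"))
# call rejected, 'I am driving' message sent to caller
```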
  • FIG. 35 is a diagram illustrating an exemplary situation of making a spatial gesture available in spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • the terminal is capable of controlling multimedia playback based on the spatial gesture.
  • the terminal is capable of detecting the image input through the camera and navigating forward/backward through the multimedia file list, playing/pausing, shuffling multimedia files, and the like, in accordance with the movement and/or shape of the input means.
  • FIG. 36 is a block diagram illustrating a configuration of a terminal according to an exemplary embodiment of the present invention.
  • the terminal 100 includes an input unit 110, a sensing unit 120, a control unit 130, a camera unit 140, a storage unit 150, and a display unit 160.
  • the input unit 110 generates an input signal in response to a user's manipulation for controlling the terminal 100.
  • the input unit 110 can be implemented with at least one of a keypad, a dome switch, a touch pad (resistive/capacitive), jog wheel, jog switch, and the like.
  • the sensing unit 120 includes at least one of a touch sensor 121, a proximity sensor 122, a velocity/acceleration sensor 123, a gyro sensor 124, and the like.
  • the touch sensor 121 is capable of detecting a touch input.
  • the touch sensor 121 may be implemented using at least one of a touch film, a touch sheet, a touch pad, and the like.
  • the touch sensor 121 may be integrated with the display unit 160 as a touchscreen.
  • the touch sensor 121 can be configured to convert a pressure or a change in capacitance at a certain position of the display unit 160 into an electrical input signal.
  • the touch sensor 121 may be capable of detecting the position, size, and pressure of a touch.
  • if a touch input is detected, the touch sensor 121 generates a corresponding touch signal and transmits the touch signal to the control unit 130.
  • the control unit 130 is capable of determining the position at which the touch input is made based on the touch signal.
  • the display unit 160 may be implemented as a touchscreen.
  • the proximity sensor 122 may be arranged inside of the terminal 100 around the display unit 160.
  • the proximity sensor 122 may be capable of detecting an object approaching a predetermined detection surface or the existence of an object around the terminal 100 by means of the electric field or infrared ray without physical contact.
  • the proximity sensor 122 may be implemented with at least one of a transparent photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective optical sensor, a microwave oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and the like. If the display unit 160 is implemented in the form of a capacitive touchscreen, the proximity sensor 122 may be configured to detect the approach of a pointer based on the change of the electric field.
  • the proximity sensor 122 may be capable of detecting a proximity touch and proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, and the like).
  • the information associated with the detected proximity touch gesture and the proximity touch pattern may be displayed on the screen of the display unit 160.
  • the velocity/acceleration sensor 123 may be capable of detecting the motion of the terminal 100.
  • the velocity/acceleration sensor 123 may include a device for converting a change of acceleration in a certain direction into an electrical signal.
  • the control unit 130 is capable of determining whether the terminal is in a driving mode based on the terminal's motion detected by the velocity/acceleration sensor 123.
  • the gyro sensor 124 may be capable of detecting the orientation of the terminal.
  • the gyro sensor 124 may measure the angular velocity of the terminal 100 to detect the rotation angle relative to a reference orientation.
  • the gyro sensor 124 may be capable of detecting or determining rotation angles about three orientation axes (e.g., azimuth, pitch, and roll).
  • the control unit 130 may be capable of determining whether the terminal 100 rests on a flat surface or is mounted on a docking station based on the detected orientation.
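  • The orientation-based determination described above may be illustrated by the following sketch; the angle thresholds and classification labels are assumptions made only for illustration.

```python
# Hedged sketch: deciding from gyro-derived pitch/roll whether the terminal is
# lying flat or docked at an angle. The angle thresholds are illustrative only.

def classify_orientation(pitch_deg, roll_deg):
    """Rough classification based on rotation angles relative to the reference
    orientation, as measured by the gyro sensor 124 (values assumed in degrees)."""
    if abs(pitch_deg) < 10 and abs(roll_deg) < 10:
        return "resting on a flat surface"
    if 30 <= abs(pitch_deg) <= 80:
        return "possibly mounted on a docking station"
    return "held by the user"

print(classify_orientation(2, -3))    # resting on a flat surface
print(classify_orientation(60, 1))    # possibly mounted on a docking station
```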
  • the control unit 130 controls the operations of the components of the terminal 100.
  • the control unit 130 is capable of detecting a spatial gesture to the terminal 100, analyzing the spatial gesture, determining an action mapped to the spatial gesture, and performing the determined action.
  • The detailed operations of the control unit 130 are as described above.
  • the camera unit 140 may process video frames, such as still and motion pictures acquired by the image sensor, in a video call mode, a still picture capturing mode, or a motion picture capturing mode.
  • the processed video frame can be displayed by the display unit 160.
  • the camera unit 140 is capable of transferring the processed video frame to the storage unit 150 and the control unit 130.
  • the control unit 130 analyzes the image from the camera unit 140 to detect a shape, form, and the like on the image.
  • the storage unit 150 may be capable of storing programs associated with the operation of the control unit 130 and input/output data (e.g., phonebook, messages, still and motion images, and the like) temporarily or semi-permanently.
  • the storage unit 150 may also be capable of storing data associated with various vibration patterns and sound effects output upon detection of a touch gesture on the touchscreen.
  • the storage unit 150 may also be capable of storing a spatial interaction set mapping the spatial gestures to the corresponding actions to be performed interactively.
  • the storage unit 150 can be implemented with at least one of flash memory, hard disk, micro multimedia card (e.g. SD and xD memory cards), Random Access Memory (RAM), Static RAM (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disc, optical disc, and the like.
  • the display unit 160 may be capable of displaying (outputting) the information processed by the terminal 100 in the form of a UI or Graphical UI (GUI).
  • the display unit 160 can be implemented with one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT LCD), an Organic Light Emitting Diode (OLED), a flexible display, and a 3-Dimensional (3D) display.
  • the display unit 160 can be implemented as a touchscreen with an interlayered touch sensor 121 and/or proximity sensor 122. In this case, the display unit 160 may be capable of operating as an input device as well as an output device.
  • the terminal and spatial interaction-based terminal control method are capable of controlling the terminal intuitively and conveniently with spatial gesture of the user.
  • the terminal and spatial interaction-based terminal control method according to exemplary embodiments of the present invention are advantageous in that the terminal can be controlled efficiently even without a direct touch to the terminal.
  • the terminal and method for controlling such a terminal according to exemplary embodiments of the present invention may be beneficial when a user is operating the terminal with a glove worn on the user's hand or when the terminal is stained with foreign materials.
  • Any such software may be stored in a non-transitory computer readable storage medium.
  • the computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present invention.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like.
  • the storage devices and storage media are exemplary embodiments of machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement exemplary embodiments of the present invention. Accordingly, exemplary embodiments provide a program comprising code for implementing an apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.

Abstract

A method for controlling an electronic device, the method comprising: acquiring an input image using a camera; analyzing a user gesture in the input image by identifying a user's hand disposed in a predefined configuration and determining that a shape of the hand is changed to a predetermined shape by a movement of at least a part of the hand; and executing a camera function corresponding to the user gesture. An electronic device implementing the method is also provided.

Description

    BACKGROUND OF THE INVENTION 1. Field of the Invention:
  • The present invention relates to a terminal and a method for controlling the same based on spatial interaction. More particularly, the present invention relates to a terminal and a method for recognizing distinct spatial gestures and executing the input corresponding to a recognized spatial gesture interactively.
  • 2. Description of the Related Art:
  • With the widespread use of smart electronic devices, research and development has focused on various interaction technologies for facilitating a user's control of the device.
  • According to the related art, some representative interaction-based device control technologies include touch or motion-based device control and hand gesture-based device control technologies.
  • Therefore, a need exists for a terminal and method for recognizing distinct spatial gestures and executing the input corresponding to a recognized spatial gesture interactively.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present invention.
  • SUMMARY OF THE INVENTION
  • Aspects of the present invention are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention provides a terminal and method for controlling the same in such a way of recognizing distinct spatial gestures and executing the input corresponding to a recognized spatial gesture interactively.
  • In accordance with an aspect of the present invention, a method for controlling a terminal is provided. The method includes detecting a spatial gesture, analyzing the spatial gesture, and executing an action depending on the analysis result.
  • In accordance with another aspect of the present invention, a method for controlling a terminal is provided. The method includes detecting a user gesture moving an object in a certain direction within proximity of the terminal, determining at least one of the movement direction, movement speed, and movement distance of the user gesture, and performing a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture.
  • In accordance with another aspect of the present invention, a terminal is provided. The terminal includes a sensing unit which detects a user gesture moving an object in a certain direction within proximity of the terminal, a control unit which determines at least one of the movement direction, movement speed, and movement distance of the user gesture and performs a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture, and a display unit which displays an execution screen of the application under the control of the control unit.
  • In accordance with another aspect of the present invention, a method for controlling a terminal inside a vehicle is provided. The method includes initiating a driving mode based on at least one of a movement state of the terminal and a movement state of the vehicle, detecting a user gesture within proximity of the terminal when the terminal is in the driving mode, and executing a function associated with the corresponding user gesture when the terminal is in the driving mode.
  • In accordance with another aspect of the present invention, a terminal is provided. The terminal includes a sensing unit which detects a user gesture moving in a certain direction within proximity of the terminal, a control unit which initiates a driving mode based on at least one of a movement state of the terminal and a movement state of the vehicle and which executes, when the user gesture is detected by the sensing unit while the terminal is in the driving mode, a function associated with the corresponding user gesture, and an output unit which outputs a function execution result under the control of the control unit.
  • In accordance with another aspect of the present invention, a method for controlling a terminal is provided. The method includes entering a specific operation mode according to a state of the terminal, detecting a user gesture within proximity of the terminal when the terminal is in the specific operation mode, and displaying information associated with the operation mode in response to the user gesture.
  • In accordance with another aspect of the present invention, a terminal is provided. The terminal includes a sensing unit which detects a user gesture within proximity of the terminal, a display unit which displays information, and a control unit which controls entering a specific operation mode according to a condition of the terminal and displaying, when the user gesture is detected by the sensing unit while the terminal is in the specific operation mode, the information associated with the operation mode in response to the user gesture.
  • In accordance with another aspect of the present invention, a method for controlling a terminal is provided. The method includes detecting a user gesture within proximity of the terminal, determining a type of the user gesture, and performing a control operation of the terminal according to the user gesture.
  • In accordance with another aspect of the present invention, a terminal is provided. The terminal includes a sensing unit which detects a user gesture within proximity of the terminal, a control unit which determines a type of the user gesture and performs a control operation of the terminal according to the user gesture, and a display unit which displays a result of the control operation of the control unit.
  • Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
    • FIG. 1 is a flowchart illustrating a method for controlling a terminal based on spatial gestures according to an exemplary embodiment of the present invention;
    • FIGS. 2 to 25 are diagrams illustrating spatial gestures defined for use in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 26 is a diagram illustrating an exemplary spatial interaction set for use in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 27 is a diagram illustrating exemplary screen displays presented in association with a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 28 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 29 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 30 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 31 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 32 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 33 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention;
    • FIG. 34 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to exemplary embodiment of the present invention;
    • FIG. 35 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention; and
    • FIG. 36 is a block diagram illustrating a configuration of a terminal according to an exemplary embodiment of the present invention.
  • Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
  • DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purpose only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to "a component surface" includes reference to one or more of such surfaces.
  • The present invention is applicable to all types of devices capable of recognizing spatial gestures, as well as to electronic terminals including a smartphone, a portable terminal, a mobile terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a laptop computer, a tablet computer, a Personal Computer (PC), a WiBro terminal, a digital camera, a portable game console, an MP3 player, a handheld e-book reader, a Global Positioning System (GPS) navigation device, a High Definition Television (HDTV), and the like.
  • The terminology used herein is for the purpose of describing particular exemplary embodiments only and is not intended to be limiting of the present invention. Unless otherwise defined, all terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention pertains, and should not be interpreted as having an excessively comprehensive meaning nor as having an excessively contracted meaning. If a technical term used herein is erroneous and fails to accurately express the technical idea of the present invention, it should be replaced with a technical term that allows a person skilled in the art to understand the invention properly. The general terms used herein should be interpreted according to the definitions in the dictionary or in context and should not be interpreted as having an excessively contracted meaning.
  • As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" "comprising," "includes" and/or "including" when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
  • Exemplary embodiments of the present invention are described hereinafter with reference to the accompanying drawings in detail.
  • FIG. 1 is a flowchart illustrating a method for controlling a terminal based on spatial gestures according to an exemplary embodiment of the present invention.
  • Referring to FIG. 1, the terminal control method according to an exemplary embodiment of the present invention comprises the following steps.
  • At step S110, the terminal detects a spatial gesture. As an example, the spatial gesture can be made with an input means such as user's hand (including fingers) or the like.
  • According to exemplary embodiments of the present invention, the terminal is capable of detecting the spatial gesture by means of (e.g., using) a camera, a proximity sensor, and/or the like. The terminal is also capable of detecting multiple spatial gestures simultaneously or sequentially.
  • According to exemplary embodiments of the present invention, the terminal may be capable of detecting a touch gesture along with the spatial gesture. As an example, the terminal may be capable of detecting the spatial gesture and the touch gesture simultaneously or sequentially. As another example, the terminal may be capable of detecting the spatial gesture in a state in which the touch gesture is maintained.
  • Thereafter, the terminal analyzes the detected spatial gesture at step S120.
  • According to exemplary embodiments of the present invention, the terminal may be capable of detecting the spatial gesture by analyzing the picture taken by the camera and/or by means of (e.g., using) the proximity sensor. The terminal is capable of analyzing a motion oriented to at least one direction or drawing a certain shape (which may be referred to as a sweep), a motion speed, a motion length, an input means' shape (e.g., hand or finger shape), at least one coordinate associated with the input means' approach, an approaching speed, an approaching distance, and the like.
  • If the user's hand corresponds to the input means, the user is capable of making spatial gestures recognizable by the terminal. For example, spatial gestures may include a tap (e.g., bending forward and then stretching the hand), a sweep (e.g., sweeping the hand up/down/left/right), a wave (e.g., a horizontal wave and/or a vertical wave), a circle (e.g., a circle drawn in a clockwise or a counter-clockwise direction), a grasp (e.g., a clench of the palm to form a fist), a pinch (e.g., a squeeze between a thumb and an index finger), an approach, a trace (e.g., tracking a hand motion in real time), a cover (e.g., covering the terminal with a hand), a push (e.g., a push of a hand toward the terminal), a two hand push, a two hand open, a two hand circle, a two hand put, a touch and sweep, and the like. The aforementioned gestures are depicted in FIGS. 2 to 25, respectively.
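  • As a non-limiting illustration of the analysis performed at step S120, the following sketch estimates the direction, length, and speed of a motion from successive hand coordinates; the coordinate format, frame rate, and function names are assumptions made only for illustration.

```python
# Minimal sketch: estimating direction, length and speed of a motion from
# successive hand positions (e.g., taken from consecutive camera frames).
import math

def analyze_motion(samples, frame_interval_s=1 / 30):
    """samples: list of (x, y) hand positions in consecutive frames."""
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    duration = (len(samples) - 1) * frame_interval_s
    speed = length / duration if duration > 0 else 0.0
    # Dominant axis decides the reported direction (image coordinates, y grows downward).
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return {"direction": direction, "length": length, "speed": speed}

print(analyze_motion([(10, 100), (60, 102), (140, 104), (230, 105)]))
# {'direction': 'right', 'length': ..., 'speed': ...}
```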
  • FIGS. 2 to 25 are diagrams illustrating spatial gestures defined for use in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 2, the spatial gesture may be a touchless tap. For example, the touchless tap may correspond to a user moving a hand closer to the screen. The touchless tap may correspond to the user moving the hand closer to a specific desired portion of the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless tap by determining that the user's hand is moved closer to the screen. For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless tap.
  • Referring to FIG. 3, the spatial gesture may be a touchless sweep. The touchless sweep may be a sweep motion in at least one of the up, down, left, and right directions. For example, the touchless sweep may correspond to a user's hand being disposed such that a surface of the palm of the user's hand is substantially perpendicular to the screen surface and moved in at least one of the up, down, left, and right directions. According to exemplary embodiments of the present invention, the terminal may detect the touchless sweep by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface) and determining that the user's hand is moved relative to (e.g., across) the screen. For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless sweep.
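  • A minimal sketch of such a coordinate comparison for detecting the touchless sweep is given below; the travel threshold and function name are illustrative assumptions rather than the claimed detection method.

```python
# Hedged sketch: the hand must be in the predefined configuration (palm roughly
# perpendicular to the screen) and its coordinates must change mainly along one axis.

SWEEP_MIN_TRAVEL = 80   # pixels of horizontal/vertical travel (assumed)

def detect_sweep(start_xy, end_xy, palm_perpendicular):
    if not palm_perpendicular:
        return None
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    if abs(dx) >= SWEEP_MIN_TRAVEL and abs(dx) > abs(dy):
        return "sweep right" if dx > 0 else "sweep left"
    if abs(dy) >= SWEEP_MIN_TRAVEL:
        return "sweep down" if dy > 0 else "sweep up"
    return None

print(detect_sweep((40, 200), (260, 210), palm_perpendicular=True))   # sweep right
```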
  • Referring to FIG. 4, the spatial gesture may be a touchless horizontal wave. The touchless horizontal wave may be a motion of the user's hand so as to variously move across the screen from one side of the terminal to an opposite side of the terminal. For example, the motion of the user's hand may be from a left side of the terminal screen to the right side of the screen. The motion of the user's hand may be repeated in the touchless horizontal wave. As an example, the horizontal wave may include the user's hand being outstretched such that the user's palm is facing (e.g., substantially parallel to) the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless horizontal wave by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to (e.g., from the left side to the right side, from the right side to the left side, or the like) the screen. For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless horizontal wave.
  • Referring to FIG. 5, the spatial gesture may be touchless vertical wave. The touchless vertical wave may be a motion of the user's hand so as to variously move across the screen from one side of the terminal to an opposite side of the terminal. For example, the motion of the user's hand may be moved from a top side of the terminal screen to the bottom side of the screen. The motion of the user's hand may be repeated in the touchless vertical wave. As an example, the vertical wave may include the user's hand being outstretched such that the user's palm is facing (e.g., substantially parallel to) the screen when the hand is between the motion to the top side of the terminal and the motion to the bottom side of the terminal. According to exemplary embodiments of the present invention, the terminal may detect the touchless vertical wave by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand (e.g., the user's outstretched fingers) is moved relative to (e.g., from the top side to the bottom side, from the bottom side to the top side, or the like) the screen. For example, the terminal may compare coordinates of a detected user's hand (e.g., the user's outstretched fingers) and determine that a change in the coordinates of the detected user's hand corresponds to the touchless vertical wave.
  • Referring to FIG. 6, the spatial gesture may be a touchless cover. For example, the touchless cover may correspond to the user's hand being moved so as to cover the screen. The touchless cover may include the user's hand having the palm facing the screen so as to be substantially parallel to the screen and substantially covering the screen. The user's fingers may be outstretched during the touchless cover. According to exemplary embodiments of the present invention, the terminal may detect the touchless cover by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., so as to cover substantially the entire screen). For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand corresponds to the touchless cover. For example, the touchless cover may include the user moving the hand closer to the screen and so as to cover substantially the entire screen. As another example, the touchless cover may include the user moving the hand substantially along an original plane so as to cover substantially the entire screen.
  • Referring to FIG. 7, the spatial gesture may be a touchless palm push. For example, the touchless palm push may include the user's hand being moved closer to the screen. The touchless palm push may include the user's fingers being outstretched and the palm facing the screen (e.g., such that the surface of the palm is substantially parallel to the screen).According to exemplary embodiments of the present invention, the terminal may detect the touchless palm push by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that the user's palm is moved closer to the screen). For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand (e.g., the user's palm) corresponds to the touchless palm push.
  • Referring to FIG. 8, the spatial gesture may be a touchless circling in a clockwise direction. The touchless circling may include moving a portion of the user's hand in a clockwise direction. For example, the touchless circling may include moving the palm in a clockwise direction relative to the screen. As another example, the touchless circling may include moving at least one finger in a clockwise direction relative to the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless circling by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that at least a portion of the user's hand is moved in a clockwise direction). For example, the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, and/or the like) corresponds to the touchless circling in a clockwise direction.
  • Referring to FIG. 9, the spatial gesture may be a touchless circling in a counter-clockwise direction. The touchless circling may include moving a portion of the user's hand in a counter-clockwise direction. For example, the touchless circling may include moving the palm in a counter-clockwise direction relative to the screen. As another example, the touchless circling may include moving at least one finger in a counter-clockwise direction relative to the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless circling by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that at least a portion of the user's hand is moved in a counter-clockwise direction). For example, the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, and/or the like) corresponds to the touchless circling in a counter-clockwise direction.
  • Referring to FIG. 10, the spatial gesture may be a touchless grab. The touchless grab may include moving a user's hand such that the user's fingers are moved from a substantially outstretched position to a position in which the fingers are brought closer to the user's thumb, or such that the fingers are brought towards a clenched fist. According to exemplary embodiments of the present invention, the terminal may detect the touchless grab by determining that the user's hand is disposed in a predefined configuration (e.g., such that the user's fingers are substantially outstretched relative to the user's palm and/or thumb) and determining that the user's hand is moved relative to the screen (e.g., such that the user's fingers are brought towards at least one of the user's thumb and palm so as to form or begin to form, for example, a clenched fist). For example, the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, and/or the like) corresponds to the touchless grab.
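  • The touchless grab determination may be illustrated by the following sketch, in which the fingertip-to-palm distance shrinks as the fingers close; the distance ratio and input format are assumptions for illustration.

```python
# Illustrative sketch of the touchless grab: the fingertips start substantially
# outstretched and move towards the palm centre.
import math

def mean_fingertip_distance(fingertips, palm_center):
    return sum(math.dist(tip, palm_center) for tip in fingertips) / len(fingertips)

def detect_grab(fingertips_before, fingertips_after, palm_center, ratio=0.5):
    """Report a grab when the mean fingertip-to-palm distance shrinks below
    `ratio` of its initial value (i.e., the fingers close towards a fist)."""
    before = mean_fingertip_distance(fingertips_before, palm_center)
    after = mean_fingertip_distance(fingertips_after, palm_center)
    return after < before * ratio

open_tips = [(0, 120), (30, 115), (60, 118), (90, 112)]
closed_tips = [(5, 40), (25, 38), (55, 42), (80, 35)]
print(detect_grab(open_tips, closed_tips, palm_center=(45, 0)))   # True
```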
  • Referring to FIG. 11, the spatial gesture may be a touchless pinch. The touchless pinch may include moving a user's hand from a position in which at least one finger is outstretched to a position in which at least one outstretched finger is substantially touching the thumb. For example, the touchless pinch may include moving an outstretched finger and outstretched thumb towards each other. According to exemplary embodiments of the present invention, the terminal may detect the touchless pinch by determining that the user's hand is disposed in a predefined configuration (e.g., such that at least one of the user's fingers is substantially outstretched relative to the user's palm and/or thumb) and determining that the user's hand is moved relative to the screen (e.g., such that at least one outstretched finger is brought towards the user's thumb). For example, the terminal may compare coordinates of a detected user's hand (e.g., the palm, at least one finger, the thumb, and/or the like) and determine that a change in the coordinates of the detected user's hand (e.g., the palm, at least one finger, the thumb, and/or the like) corresponds to the touchless pinch.
  • Referring to FIG. 12, the spatial gesture may be a touchless approach. For example, the touchless approach may include the user's hand being moved closer to the screen. The touchless approach may include the user's fingers being outstretched and the palm facing the screen (e.g., such that the surface of the palm is substantially parallel to the screen).According to exemplary embodiments of the present invention, the terminal may detect the touchless approach by determining that the user's hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that the user's hand is moved relative to the screen (e.g., such that the user's hand is moved closer to the screen). For example, the terminal may compare coordinates of a detected user's hand and determine that a change in the coordinates of the detected user's hand (e.g., the user's palm) corresponds to a touchless approach. According to exemplary embodiments, the terminal may distinguish between a touchless palm push and a touchless approach based on the distance along an axis perpendicular to the screen which the hand moves in a direction closer to the screen. For example, the touchless approach may include the hand being initially disposed at a distance farther from the screen than for the touchless palm push.
  • Referring to FIG. 13, the spatial gesture may be a touchless tracking. For example, the touchless tracking may include a user moving an outstretched finger across the screen. The touchless tracking may include the user moving the outstretched finger along a path across the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless tracking by determining that the user's hand is disposed in a predefined configuration (e.g., such that at least one of the user's fingers is substantially outstretched relative to the user's palm and/or thumb) and determining that the user's hand is moved relative to the screen (e.g., such that at least one outstretched finger is moved across the screen). For example, the terminal may compare coordinates of a detected user's hand (e.g., the outstretched finger) and determine that a change in the coordinates of the detected user's hand (e.g., the outstretched finger) corresponds to the touchless tracking. For example, for a touchless tracking gesture, the distance between the outstretched finger and the surface of the screen may remain substantially the same throughout the tracking.
  • Referring to FIG. 14, the spatial gesture may be a touchless palm push of two hands. For example, the touchless palm push of two hands may include the two hands being moved closer to the screen. The touchless palm push of two hands may include the fingers on two hands being outstretched and the hands' respective palms facing the screen (e.g., such that the surface of the palm is substantially parallel to the screen). According to exemplary embodiments of the present invention, the terminal may detect the touchless palm push of two hands by determining that the two hands are respectively disposed in a predefined configuration (e.g., such that the surface of the palm is substantially parallel to the screen surface) and determining that both of the two hands are moved relative to the screen (e.g., such that the respective palms are moved closer to the screen). For example, the terminal may compare coordinates of the detected two hands and determine that a change in the coordinates of the detected two hands (e.g., the respective palms) corresponds to the touchless palm push of two hands.
  • Referring to FIG. 15, the spatial gesture may be a touchless pinch of two hands. For example, the touchless pinch of two hands may include at least one of two hands being moved relative to the screen so as to move the two hands closer together. The touchless pinch of two hands may include at least one finger or thumb on each of the two hands being outstretched and the outstretched at least one finger or thumb of each of the two hands being moved closer together. The touchless pinch of two hands may include the outstretched at least one finger or thumb of each of the two hands being moved so as to be in touching relation. According to exemplary embodiments of the present invention, the terminal may detect the touchless pinch of two hands by determining that the two hands are respectively disposed in a predefined configuration (e.g., such that each of the two hands has at least one finger or thumb outstretched) and determining that at least one of the two hands is moved relative to the screen towards the other hand (e.g., such that the outstretched at least one finger or thumb on each hand is moved towards each other, and particularly, such that the outstretched at least one finger or thumb on each hand is moved so as to be in touching relation). For example, the terminal may compare coordinates of the detected two hands and determine that a change in the coordinates of the detected at least one of the two hands (e.g., the at least one outstretched finger or thumb) corresponds to the touchless pinch of two hands.
  • Referring to FIG. 16, the spatial gesture may be a touchless parallel rotate. The touchless parallel rotate may include two hands disposed at a distance from one another and moved around an axis that is perpendicular to the screen. For example, the touchless parallel rotate may correspond to two hands being disposed such that a surface of the palm of the two hands is substantially perpendicular to the screen surface and disposed such that the palm of each hand is parallel to the other, and the hands moved around an axis that is perpendicular to the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless parallel rotate by determining that each of the two hands is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface), determining that the palms of each of the two hands is substantially parallel with the other, and determining that the two hands are moved relative to the screen(e.g., around an axis that is perpendicular to (and extends from) the screen). For example, the terminal may compare coordinates of a detected two hands and determine that a change in the coordinates of the detected two hands corresponds to the touchless parallel rotate.
  • Referring to FIG. 17, the spatial gesture may be a touchless clap. The touchless clap may include two hands disposed at a distance from one another (e.g., such that the hands' respective palms are facing each other) and moved towards each other (e.g., along an axis that is substantially parallel to the screen). For example, the touchless clap may correspond to the two hands being disposed such that a surface of the respective palms of the two hands is substantially perpendicular to the screen surface and disposed such that the palm of each hand is parallel to the other (e.g., the palms are opposing or facing each other), and the hands are moved along an axis that is substantially parallel to the screen. According to exemplary embodiments of the present invention, the terminal may detect the touchless clap by determining that each of the two hands is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface), determining that the palms of each of the two hands opposes or faces the other, and determining that the two hands are moved relative to the screen (e.g., around an axis that is substantially parallel to the screen). For example, the terminal may compare coordinates of a detected two hands and determine that a change in the relative coordinates of the detected two hands corresponds to the touchless clap.
  • Referring to FIG. 18, the spatial gesture may be a touch with (e.g., in conjunction with) a touchless sweep. The touch with the touchless sweep may include a user touching the screen with an input means (e.g., a stylus, a finger, a thumb, and/or the like) and a sweep motion in at least one of the up, down, left, and right directions. For example, the touch with the touchless sweep may correspond to a user touching the screen with an input means and a hand being disposed such that a surface of the palm of the hand is substantially perpendicular to the screen surface and moved in at least one of the up, down, left, and right directions. According to exemplary embodiments of the present invention, the terminal may detect a touch (e.g., by a change in capacitance in the screen), and detect the touchless sweep by determining that the hand is disposed in a predefined configuration (e.g., such that the surface of the palm is substantially perpendicular to the screen surface) and determining that the hand is moved relative to (e.g., across at least a portion of) the screen. For example, the terminal may compare coordinates of a detected hand and determine that a change in the coordinates of the detected hand in conjunction with a touch event corresponds to the touch with touchless sweep.
  • Referring to FIG. 19, the spatial gesture may be a finger hovering. For example, the finger hovering may correspond to a user holding a finger (or other input means) over substantially a same position relative to the screen for at least a predefined period of time. The finger hovering may include the finger being held in substantially a constant same position relative to the screen which is disposed away from the screen such that no contact is made between the input means disposed away from the screen and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the finger hovering by determining that the input means is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected input means over a predefined period of time and determine that a lack of change in the coordinates of the detected input means corresponds to the finger hovering.
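  • A minimal sketch of the finger hovering determination is given below; the tolerance, minimum duration, and sample format are illustrative assumptions.

```python
# Hedged sketch: the fingertip coordinates stay within a small tolerance for at
# least a predefined period of time while the fingertip remains above the screen.
import math

HOVER_TOLERANCE = 15        # pixels (assumed)
HOVER_MIN_DURATION = 1.0    # seconds (assumed)

def detect_hover(samples):
    """samples: list of (timestamp_s, x, y) fingertip positions above the screen."""
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    for t, x, y in samples[1:]:
        if math.hypot(x - x0, y - y0) > HOVER_TOLERANCE:
            return False
    return samples[-1][0] - t0 >= HOVER_MIN_DURATION

print(detect_hover([(0.0, 100, 200), (0.5, 103, 198), (1.2, 99, 202)]))   # True
```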
  • Referring to FIG. 20, the spatial gesture may be a palm. For example, the palm may correspond to a user holding an outstretched hand (e.g., having fingers outstretched) over substantially a same position relative to the screen for at least a predefined period of time. The palm may include the palm being held in substantially a constant same position relative to the screen which is disposed away from the screen such that no contact is made between the palm and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the palm by determining that the palm is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected palm over a predefined period of time and determine that a lack of change in the coordinates of the detected palm corresponds to a palm. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in an outstretched configuration over the predefined period of time corresponds to the palm.
  • Referring to FIG. 21, the spatial gesture may be a fist. For example, the fist may correspond to a user holding a clenched hand (e.g., the fingers and thumb clenched in a fist) over substantially a same position relative to the screen for at least a predefined period of time. The fist may include the hand (e.g., the fist) being held in substantially a constant same position relative to the screen which is disposed away from the screen such that no contact is made between the fist and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the fist by determining that the hand is in a predetermined configuration (e.g., the hand forms a clenched fist), and determining that the fist is held in substantially a constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected fist over a predefined period of time and determine that a lack of change in the coordinates of the detected fist corresponds to a fist. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in a fist configuration over the predefined period of time corresponds to the fist.
  • Referring to FIG. 22, the spatial gesture may be a thumb-up. For example, the thumb-up may correspond to a user holding a hand having an outstretched thumb and fingers clenched towards the palm over substantially a same position relative to the screen for at least a predefined period of time. The thumb-up may include the hand (e.g., the outstretched thumb) being held in a substantially constant position relative to the screen while disposed away from the screen such that no contact is made between the hand and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the thumb-up by determining that the hand is in a predetermined configuration (e.g., the hand forms a thumb-up sign), and determining that the thumb-up is held in a substantially constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected thumb-up over a predefined period of time and determine that a lack of change in the coordinates of the detected thumb-up corresponds to the thumb-up. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the thumb-up configuration over the predefined period of time corresponds to the thumb-up.
  • Referring to FIG. 23, the spatial gesture may be a pointing gesture. For example, the pointing gesture may correspond to a user holding a hand having an input means (e.g., an outstretched finger, thumb, or the like) over substantially a same position relative to the screen for at least a predefined period of time. The pointing gesture may include the hand (e.g., the outstretched finger) being held in a substantially constant position relative to the screen while disposed away from the screen such that no contact is made between the outstretched finger and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the pointing gesture by determining that the hand is in a predetermined configuration (e.g., the hand forms a pointing gesture in which at least one finger is outstretched), and determining that the pointing gesture is held in a substantially constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected pointing gesture over a predefined period of time and determine that a lack of change in the coordinates of the detected pointing gesture corresponds to a pointing gesture. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the pointing gesture configuration over the predefined period of time corresponds to the pointing gesture.
  • Referring to FIG. 24, the spatial gesture may be an okay gesture. For example, the okay gesture may correspond to a user holding a hand having an index finger and thumb forming a circle and having the remaining fingers outstretched over substantially a same position relative to the screen for at least a predefined period of time. The okay gesture may include the hand (e.g., the hand having an index finger and thumb forming a circle and having the remaining fingers outstretched) being held in a substantially constant position relative to the screen while disposed away from the screen such that no contact is made between the hand and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the okay gesture by determining that the hand is in a predetermined configuration (e.g., the hand having an index finger and thumb forming a circle and having the remaining fingers outstretched), and determining that the okay gesture is held in a substantially constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected okay gesture over a predefined period of time and determine that a lack of change in the coordinates of the detected okay gesture corresponds to the okay gesture. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the okay gesture configuration over the predefined period of time corresponds to the okay gesture.
  • Referring to FIG. 25, the spatial gesture may be a numbering gesture. For example, the numbering gesture may correspond to a user holding a hand having a predefined number of fingers outstretched over substantially a same position relative to the screen for at least a predefined period of time. The numbering gesture may include the hand (e.g., the thumb and two fingers being outstretched) being held in a substantially constant position relative to the screen while disposed away from the screen such that no contact is made between the hand and the screen surface. According to exemplary embodiments of the present invention, the terminal may detect the numbering gesture by determining that the hand is in a predetermined configuration (e.g., the thumb and two fingers being outstretched), and determining that the numbering gesture is held in a substantially constant position away from the screen for at least a predefined period of time. For example, the terminal may compare coordinates of a detected numbering gesture over a predefined period of time and determine that a lack of change in the coordinates of the detected numbering gesture corresponds to the numbering gesture. As another example, the terminal may compare the configuration of the hand over a predefined period of time and determine that maintaining the configuration of the hand in the numbering gesture configuration over the predefined period of time corresponds to the numbering gesture. According to exemplary embodiments of the present invention, the numbering gesture may include a plurality of gestures each of which corresponds to a different combination of fingers and thumb being outstretched.
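  • The static gestures of FIGS. 19 to 25 share the same detection pattern: a recognized hand configuration held in a substantially constant position for at least a predefined period of time. By way of non-limiting illustration, a minimal sketch of that shared pattern is given below; the hold duration, position tolerance, sample format, and pose labels are assumptions for the example only.

```python
# Illustrative sketch: detecting a static spatial gesture -- hover, palm,
# fist, thumb-up, pointing, okay, or numbering -- by checking that a
# recognized hand configuration stays in substantially the same position
# for at least a predefined period of time.

HOLD_SECONDS = 1.0       # assumed predefined period of time
POSITION_TOLERANCE = 20  # assumed allowed jitter, in sensor units

def detect_static_gesture(samples):
    """samples: list of (timestamp, pose_label, (x, y)) tuples, oldest first.
    Returns the pose label if it was held long enough, else None."""
    if not samples:
        return None
    t_last, pose, (x_ref, y_ref) = samples[-1]
    for t, p, (x, y) in reversed(samples):
        if p != pose or abs(x - x_ref) > POSITION_TOLERANCE or abs(y - y_ref) > POSITION_TOLERANCE:
            return None  # configuration or position changed within the window
        if t_last - t >= HOLD_SECONDS:
            return pose  # configuration and position held long enough
    return None

samples = [(0.0, "fist", (100, 200)), (0.5, "fist", (102, 199)), (1.1, "fist", (101, 201))]
print(detect_static_gesture(samples))  # fist
```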
  • Referring back to FIG. 1, after performing step S120, the terminal determines a certain action corresponding to the analysis result at step S130.
  • According to exemplary embodiments of the present invention, the terminal is capable of determining the action corresponding to the analyzed spatial gesture. For example, the terminal is capable of determining the operation mode of the terminal. The operation mode of the terminal may be determined according to the detected spatial gesture.
  • According to exemplary embodiments of the present invention, the operation mode of the terminal may include any of camera shooting mode, multimedia playback mode, text messaging mode, voice call mode, e-book-reading mode, navigation mode, calendar display mode, email messaging mode, video game play mode, and the like.
  • According to exemplary embodiments of the present invention, the terminal is also capable of determining a certain action corresponding to the spatial gesture in association with the operation mode.
  • According to exemplary embodiments of the present invention, if the terminal is in the multimedia playback mode, and if the detected spatial gesture corresponds to the sweeping gesture in a certain direction, the terminal is capable of regarding this gesture as a command for playback of the next song in the corresponding direction on a playlist.
  • According to exemplary embodiments of the present invention, if the terminal is in the e-book-reading mode, and if the detected spatial gesture corresponds to the sweeping gesture in a certain direction, the terminal is capable of regarding this gesture as a command for displaying the next page of the e-book in the corresponding direction.
  • In order to determine the action corresponding to the spatial gesture, a set of spatial interactions as shown in FIG. 26 can be used.
  • FIG. 26 is a diagram illustrating an exemplary spatial interaction set for use in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 26, the terminal is capable of defining the titles of the spatial gestures, describing detailed actions corresponding to the spatial gestures, and storing metaphors of the corresponding spatial gestures. The metaphors indicate actions corresponding to the spatial gestures. For example, the terminal is capable of analyzing the detected spatial gesture and retrieving the action mapped to the spatial gesture from the interaction set based on the analysis result. The defined spatial gestures and the associated actions mapped to the defined spatial gestures may be stored as a Look-Up Table (LUT).
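  • By way of non-limiting illustration, such an interaction set may be sketched as a simple Look-Up Table keyed by gesture title and operation mode. The gesture names, modes, and actions below are examples only and are not an exhaustive or authoritative set.

```python
# Illustrative sketch of a spatial interaction set stored as a Look-Up
# Table (LUT): gesture titles mapped, per operation mode, to actions.

INTERACTION_SET = {
    ("touchless_sweep_right", "multimedia_playback"): "play_next_song",
    ("touchless_sweep_left",  "multimedia_playback"): "play_previous_song",
    ("touchless_sweep_right", "e_book_reading"):      "turn_to_next_page",
    ("touchless_sweep_left",  "e_book_reading"):      "turn_to_previous_page",
    ("fist",                  "any"):                 "capture_screen",
}

def lookup_action(gesture, operation_mode):
    """Return the action mapped to (gesture, mode), falling back to a
    mode-independent entry when one exists."""
    return (INTERACTION_SET.get((gesture, operation_mode))
            or INTERACTION_SET.get((gesture, "any")))

print(lookup_action("touchless_sweep_right", "e_book_reading"))  # turn_to_next_page
print(lookup_action("fist", "camera_shooting"))                  # capture_screen
```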
  • Thereafter, the terminal performs the action at step S140. For example, based on detection and determination of a spatial gesture, the terminal may determine an operation or action corresponding to the spatial gesture and perform the operation or action corresponding to the spatial gesture.
  • According to exemplary embodiments of the present invention, the terminal is capable of performing the determined action.
  • The spatial gesture-based terminal control methods according to exemplary embodiments of the present invention are described hereinafter with reference to the accompanying drawings in detail.
  • FIG. 27 is a diagram illustrating exemplary screen displays presented in association with a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 27, the terminal detects an input image taken by the camera and analyzes the image. The terminal is capable of analyzing at least one of direction, speed, and movement distance of an input means based on the input image. For example, the terminal is capable of detecting the movement of the user's hand to the left/right direction.
  • The terminal is capable of performing an action corresponding to the analysis result. For example, the terminal is capable of scrolling items on the screen of the display unit in the left/right direction.
  • The terminal is capable of scrolling the items at a speed in synchronization with the movement speed of the input means. The terminal is also capable of turning one or more pages (or a chapter) of an e-book in synchronization with the movement speed and distance of the input means. The terminal is also capable of turning through the photos of an album application or playing the song listed next to the currently playing song in a playlist in accordance with the spatial gesture.
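  • By way of non-limiting illustration, the direction, speed, and movement distance of the input means may be estimated from hand positions sampled out of successive camera frames, as in the sketch below; the sample format and the direction convention are assumptions for the example only.

```python
# Illustrative sketch: estimating movement direction, speed, and distance
# of the input means from time-stamped hand positions.

import math

def analyze_movement(samples):
    """samples: list of (timestamp_seconds, x, y) for the detected hand,
    oldest first. Returns (direction, speed, distance) or None."""
    if len(samples) < 2:
        return None
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)
    elapsed = max(t1 - t0, 1e-6)   # avoid division by zero
    speed = distance / elapsed
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return direction, speed, distance

print(analyze_movement([(0.0, 20, 100), (0.1, 80, 102), (0.2, 160, 101)]))
# ('right', ~700 units/s, ~140 units)
```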
  • As an example, FIG. 27 illustrates the turning of the page of a document or e-book based on the detection of a touchless tracking from one side of the screen to an opposing side of the screen, the detection of a touchless sweep from one side of the screen to the opposing side of the screen having a slow speed, and the detection of a touchless sweep from one side of the screen to the opposing side of the screen having a fast speed. According to the detected gesture, the terminal may turn a single page, a few pages, or many pages. For example, the number of pages turned may correspond to how pronounced the gesture appears to the terminal.
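  • Building on such a movement analysis, a minimal, non-limiting sketch of the page-count decision of FIG. 27 is given below; the speed threshold and the page counts are assumed values for the example only.

```python
# Illustrative sketch: choosing how many e-book pages to turn from the
# kind of touchless gesture and its estimated speed.

def pages_to_turn(gesture, speed):
    """gesture: 'tracking' or 'sweep'; speed: estimated hand speed in
    sensor units per second. Returns the number of pages to turn."""
    if gesture == "tracking":
        return 1                        # touchless tracking turns a single page
    if gesture == "sweep":
        return 3 if speed < 400 else 20  # slow sweep: a few pages, fast sweep: many
    return 0

print(pages_to_turn("tracking", 150))  # 1
print(pages_to_turn("sweep", 200))     # 3
print(pages_to_turn("sweep", 900))     # 20
```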
  • FIG. 28 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 28, the terminal is capable of detecting a touch gesture and a spatial gesture simultaneously or sequentially. The terminal is also capable of detecting a spatial gesture while tracking the touch gesture. For example, the terminal is capable of detecting a spatial gesture made in a certain direction while a touch gesture made at a certain position is maintained.
  • In this case, the terminal is capable of selecting an item at the position at which the touch gesture is made (e.g., detected) on the screen and moving the corresponding selected item in the left/right direction on the screen in synchronization with the movement direction of the spatial gesture. For example, the selected item may be an icon, a widget, an image arranged on the background, or the like.
  • The terminal is also capable of scrolling the items in the left/right direction, with the exception of the selected item, in synchronization with the movement direction of the spatial gesture. The items can be any type of User Interface (UI) item displayed on the background screen, such as an icon or a widget.
  • According to exemplary embodiments of the present invention, the selected item corresponding to the touch gesture may remain in a static display position and other items displayed on the screen may be moved or changed according to the detected spatial gesture (e.g., a touchless sweep).
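  • By way of non-limiting illustration, the sketch below keeps the touched item in its display position while the remaining items are scrolled in the direction of the touchless sweep; the item names and the single-step rotation are assumptions for the example only.

```python
# Illustrative sketch: while a touch pins one item, a touchless sweep
# scrolls the remaining items and leaves the pinned item in place.

SCROLL_STEP = {"left": -1, "right": 1}

def scroll_except_selected(items, selected_index, sweep_direction):
    """Rotate all items except the one pinned by the touch gesture."""
    step = SCROLL_STEP.get(sweep_direction, 0)
    movable = [it for i, it in enumerate(items) if i != selected_index]
    if step and movable:
        movable = movable[step:] + movable[:step]
    result, j = [], 0
    for i in range(len(items)):
        if i == selected_index:
            result.append(items[i])   # selected item keeps its position
        else:
            result.append(movable[j])
            j += 1
    return result

icons = ["mail", "camera", "music", "clock"]
print(scroll_except_selected(icons, 1, "left"))  # 'camera' stays at index 1
```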
  • The terminal is also capable of placing a call or sending a text message or a file to a recipient selected by a touch gesture according to the spatial gesture.
  • FIG. 29 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 29, the terminal is capable of detecting an input image taken by the camera and analyzing the input image. The terminal is capable of analyzing the input image to detect the spatial gesture corresponding to the fist.
  • The terminal is capable of performing an action corresponding to the analysis result. For example, upon detection of the spatial gesture corresponding to the fist, the terminal may perform a corresponding predefined action.
  • For example, the terminal is capable of capturing and storing the content displayed on the screen or the entire content including a part not displayed on the screen. The terminal is also capable of storing the captured content in the form of an image.
  • In the case of storing the captured content as an image, the terminal is capable of processing the image to a format having a predetermined size, resolution, and aspect ratio in synchronization with the movement speed, the degree to which the fingers are unfolded before the fist is clenched, the degree to which the fingers are folded after the fist is clenched, and the like.
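  • By way of non-limiting illustration, a sketch of such a fist-triggered capture is given below. The scaling rule, the speed threshold, and the returned fields are assumptions introduced only for the example; they are not the claimed processing.

```python
# Illustrative sketch: on detection of the fist gesture, capture the screen
# content and derive the stored image size from gesture parameters such as
# movement speed and how far the fingers were unfolded before the fist was
# clenched.

def capture_on_fist(screen_pixels, screen_size, unfold_level, movement_speed):
    """screen_pixels: raw screen content; screen_size: (width, height) in pixels.
    unfold_level: 0.0..1.0 estimate of finger unfolding before fisting.
    Returns a dict describing the image to be stored."""
    width, height = screen_size
    scale = 0.5 + 0.5 * max(0.0, min(1.0, unfold_level))  # wider hand -> larger image
    return {
        "data": screen_pixels,
        "width": int(width * scale),
        "height": int(height * scale),
        "format": "image",
        "fast_capture": movement_speed > 500,  # assumed speed threshold
    }

meta = capture_on_fist(b"...", (1080, 1920), unfold_level=0.8, movement_speed=300)
print(meta["width"], meta["height"])  # 972 1728
```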
  • FIG. 30 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 30, the terminal is capable of detecting an input image taken by the camera and analyzing the input image. The terminal is capable of analyzing the input image to detect the spatial gesture oriented to a certain direction.
  • The terminal is capable of performing an action of navigating web browsers according to the spatial gesture. In the case that multiple web browsers are arranged in the form of tabs, the terminal is capable of performing an action of navigating the tabs. In the case that multiple applications are running, the terminal is capable of performing an action of navigating the applications.
  • The terminal is also capable of displaying a screen on which the web browser is moved to the left/right/up/down side or the top/bottom side. The terminal is also capable of performing an action of refreshing, expanding/contracting, copying, navigating, printing, or turning to a previous/next page on the web browser displayed on the screen, and the like.
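  • By way of non-limiting illustration, a sketch of mapping a directional spatial gesture to navigation among browser tabs (or, equivalently, among running applications) is given below; the tab titles and the wrap-around behaviour are assumptions for the example only.

```python
# Illustrative sketch: selecting the next browser tab (or application)
# to bring to the foreground from the direction of the spatial gesture.

def navigate_tabs(tabs, current_index, direction):
    """tabs: list of open tabs; direction: 'left' or 'right'.
    Returns the index of the tab to display."""
    if not tabs:
        return current_index
    step = 1 if direction == "right" else -1
    return (current_index + step) % len(tabs)  # wrap around at the ends

tabs = ["news", "mail", "maps"]
current = navigate_tabs(tabs, 2, "right")
print(tabs[current])  # news
```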
  • FIG. 31 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 31, the terminal is capable of detecting an input image taken by the camera and analyzing the shape of the input means. For example, the terminal is capable of determining the shape of the hand. The terminal is capable of determining the number of unfolded fingers, the shape of the fingers, the palm and the back of the hand, and the positions and sizes of the fingers.
  • The terminal is capable of performing an action according to the shape of the input means.
  • According to exemplary embodiments of the present invention, the terminal is capable of determining the shooting time of the camera based on the number of unfolded fingers. For example, the terminal is capable of controlling a camera function such that the camera captures an image after or during the time corresponding to the number of unfolded fingers. The terminal is also capable of controlling the camera function such that the camera captures an image immediately according to the shape of the hand.
  • According to exemplary embodiments of the present invention, the terminal is also capable of controlling the camera to autofocus according to the shape of the hand. For example, the camera may be configured to perform autofocusing when the thumb-up gesture is detected.
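  • By way of non-limiting illustration, the sketch below maps the detected hand shape to camera control: the number of unfolded fingers sets a self-timer and a thumb-up triggers autofocus. The camera object, its methods, and the one-second-per-finger rule are hypothetical and serve only as an example.

```python
# Illustrative sketch: controlling camera functions from the detected
# hand shape. HypotheticalCamera stands in for the real camera driver.

import time

class HypotheticalCamera:
    def autofocus(self): print("autofocusing")
    def capture(self):   print("image captured")

def handle_hand_shape(camera, shape, unfolded_fingers=0):
    if shape == "thumb_up":
        camera.autofocus()                # autofocus on a thumb-up
    elif shape == "numbering" and unfolded_fingers > 0:
        time.sleep(unfolded_fingers)      # wait one second per unfolded finger
        camera.capture()
    elif shape == "palm":
        camera.capture()                  # capture immediately on an open palm

handle_hand_shape(HypotheticalCamera(), "numbering", unfolded_fingers=3)
```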
  • The terminal is also capable of applying a certain visual effect according to the detected spatial gesture.
  • FIG. 32 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 32, the terminal is capable of detecting the movement in a certain direction or the gesture of the input means approaching the terminal. For example, the terminal is capable of detecting a finger approaching the terminal.
  • The terminal is capable of analyzing the movement direction and speed, coordinates of the approaching finger, and distance.
  • The terminal is capable of performing an action corresponding to the analysis.
  • According to exemplary embodiments of the present invention, the terminal is capable of displaying a water surface moving according to a certain movement direction and speed of the input means. The terminal may also display a screen showing a water drop falling at a certain position. The terminal is capable of displaying screen images giving various emotional effects. The terminal is capable of displaying 2-Dimensional (2D) or 3-Dimensional (3D) screen images with the output of vibration and sound effects in synchronization with the change of the screen image.
  • For example, upon detection of a touchless sweep, the terminal may display a screen showing a wave, a ripple, or a series of waves traveling across the screen. As another example, upon detection of a pointing gesture or a touchless hover, the terminal may display a screen showing a ripple effect associated with an object or droplet of water being dropped into a liquid.
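  • By way of non-limiting illustration, a sketch of selecting such a visual effect from the detected gesture is given below; the effect names are placeholders rather than an actual graphics interface.

```python
# Illustrative sketch: choosing an emotional visual effect from the
# detected spatial gesture, e.g. a travelling wave for a touchless sweep
# or a water-drop ripple at the hover coordinates.

def choose_effect(gesture, coordinates=None, speed=0.0):
    if gesture == "touchless_sweep":
        return {"effect": "wave", "speed": speed}
    if gesture in ("pointing", "finger_hovering"):
        return {"effect": "water_drop_ripple", "center": coordinates}
    return {"effect": None}

print(choose_effect("touchless_sweep", speed=320.0))
print(choose_effect("finger_hovering", coordinates=(240, 400)))
```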
  • FIG. 33 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 33, the terminal is capable of detecting a movement of an object approaching the terminal or the gesture of an input means close to the terminal and analyzing the movement or gesture, and performing an action corresponding to the analysis.
  • The terminal is capable of displaying simple information in association with the operation mode. For example, the terminal is capable of displaying weather information, a time, a battery charge state, and an alarm window in a screen lock mode, a power saving mode, or the like.
  • FIG. 34 is a diagram illustrating an exemplary situation of making a spatial gesture available in a spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 34, the terminal is capable of detecting a movement and analyzing the direction of the movement. The terminal is also capable of performing an action in association with the operation mode.
  • According to exemplary embodiments of the present invention, the terminal is capable of determining whether the terminal is in a driving mode based on data acquired from a velocity/acceleration sensor, a gyro sensor, and the like. If a movement or spatial gesture is detected when the terminal is in a driving mode, a direction of the movement or spatial gesture is detected and a corresponding action such as, for example, receiving an inbound call, rejecting the call, rejecting the call with transmission of a notification message informing the caller that the user is at the wheel, and the like may be performed. For example, spatial gestures may be used as a hands-free means for controlling the terminal when the terminal is determined to be in a driving mode.
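  • By way of non-limiting illustration, the sketch below gates gesture-based call handling on a driving-mode decision derived from the measured speed; the speed threshold, gesture directions, and message text are assumptions for the example only.

```python
# Illustrative sketch: in driving mode, map the direction of a spatial
# gesture to call handling without requiring a touch on the screen.

DRIVING_SPEED_THRESHOLD = 20.0  # assumed speed (km/h) above which driving mode is set

def handle_call_gesture(speed_kmh, inbound_call, gesture_direction):
    if speed_kmh < DRIVING_SPEED_THRESHOLD or not inbound_call:
        return "no_action"
    if gesture_direction == "right":
        return "answer_call"
    if gesture_direction == "left":
        return "reject_call"
    if gesture_direction == "down":
        return "reject_call_with_message: the user is at the wheel"
    return "no_action"

print(handle_call_gesture(60.0, True, "down"))
```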
  • FIG. 35 is a diagram illustrating an exemplary situation of making a spatial gesture available in spatial interaction-based terminal control method according to an exemplary embodiment of the present invention.
  • Referring to FIG. 35, the terminal is capable of controlling multimedia playback based on the spatial gesture.
  • For example, the terminal is capable of detecting the image input through the camera and navigating forward/backward through the multimedia file list, playing/pausing, shuffling multimedia files, and the like, in accordance with the movement and/or shape of the input means.
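  • By way of non-limiting illustration, such a mapping from gestures to playback commands may be sketched as a small dispatch table; the gesture names and the command set are examples only.

```python
# Illustrative sketch: translating the movement or shape of the input
# means into multimedia playback commands.

MEDIA_COMMANDS = {
    "sweep_right": "next_file",
    "sweep_left":  "previous_file",
    "palm":        "play_or_pause",
    "wave":        "shuffle",
}

def media_command(gesture):
    return MEDIA_COMMANDS.get(gesture, "ignore")

print(media_command("sweep_right"))  # next_file
print(media_command("palm"))         # play_or_pause
```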
  • FIG. 36 is a block diagram illustrating a configuration of a terminal according to an exemplary embodiment of the present invention.
  • Referring to FIG. 36, the terminal 100 includes an input unit 110, a sensing unit 120, a control unit 130, a camera unit 140, a storage unit 150, and a display unit 160.
  • The input unit 110 generates an input signal in response to a user's manipulation for controlling the terminal 100. The input unit 110 can be implemented with at least one of a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.
  • The sensing unit 120 includes at least one of a touch sensor 121, a proximity sensor 122, a velocity/acceleration sensor 123, a gyro sensor 124, and the like.
  • The touch sensor 121 is capable of detecting a touch input. The touch sensor 121 may be implemented using at least one of a touch film, a touch sheet, a touch pad, and the like.
  • According to exemplary embodiments of the present invention, the touch sensor 121 may be integrated with the display unit 160 as a touchscreen. The touch sensor 121 can be configured to convert the pressure, change in capacitance, and the like at a certain position of the display unit into an electrical input signal. The touch sensor 121 may be capable of detecting the touch position, touch size, and touch pressure.
  • If a touch input is detected, the touch sensor 121 generates a corresponding touch signal and transmits the touch signal to the control unit 130. The control unit 130 is capable of determining the position at which the touch input is made based on the touch signal.
  • According to exemplary embodiments of the present invention, the display unit 160 may be implemented as a touchscreen.
  • The proximity sensor 122 may be arranged inside of the terminal 100 around the display unit 160. The proximity sensor 122 may be capable of detecting an object approaching a predetermined detection surface or the existence of an object around the terminal 100 by means of the electric field or infrared ray without physical contact.
  • The proximity sensor 122 may be implemented with at least one of a transparent photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective optical sensor, a microwave oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and the like. If the display unit 160 is implemented in the form of a capacitive touchscreen, the proximity sensor 122 may be configured to detect the approach of a pointer based on the change of the electric field.
  • The proximity sensor 122 may be capable of detecting a proximity touch and proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, and the like). The information associated with the detected proximity touch gesture and the proximity touch pattern may be displayed on the screen of the display unit 160.
  • The velocity/acceleration sensor 123 may be capable of detecting the motion of the terminal 100. The velocity/acceleration sensor 123 may include a device for converting a change of acceleration in a certain direction into an electrical signal.
  • According to exemplary embodiments of the present invention, the control unit 130 is capable of determining whether the terminal is in a driving mode based on the terminal's motion detected by the velocity/acceleration sensor 123.
  • The gyro sensor 124 may be capable of detecting the orientation of the terminal. The gyro sensor 124 may measure the angular velocity of the terminal 100 to detect the rotation angle relative to a reference orientation. For example, the gyro sensor 124 may be capable of detecting or determining rotation angles about three orientation axes (e.g., azimuth, pitch, and roll).
  • According to exemplary embodiments of the present invention, the control unit 130 may be capable of determining whether the terminal 100 rests on a flat surface or is mounted on a docking station based on the detected orientation.
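  • By way of non-limiting illustration, such a determination may be sketched as a simple tolerance test on the pitch and roll angles reported by the gyro sensor; the tolerance value is an assumption for the example only.

```python
# Illustrative sketch: deciding whether the terminal rests on a flat
# surface from the gyro sensor's pitch and roll angles.

FLAT_TOLERANCE_DEG = 5.0  # assumed maximum tilt for a "flat" decision

def rests_on_flat_surface(pitch_deg, roll_deg):
    return abs(pitch_deg) <= FLAT_TOLERANCE_DEG and abs(roll_deg) <= FLAT_TOLERANCE_DEG

print(rests_on_flat_surface(1.2, -0.8))  # True
print(rests_on_flat_surface(35.0, 2.0))  # False
```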
  • The control unit 130 controls the operations of the components of the terminal 100. For example, the control unit 130 is capable of detecting a spatial gesture to the terminal 100, analyzing the spatial gesture, determining an action mapped to the spatial gesture, and performing the determined action.
  • The detailed operations of the control unit 130 are as described above.
  • The camera unit 140 may process video frames, such as still and motion pictures, acquired by the image sensor in a video call mode, a still picture capturing mode, or a motion picture capturing mode. The processed video frame can be displayed by the display unit 160.
  • The camera unit 140 is capable of transferring the processed video frame to the storage unit 150 and the control unit 130. The control unit 130 analyzes the image from the camera unit 140 to detect a shape, form, and the like on the image.
  • The storage unit 150 may be capable of storing programs associated with the operation of the control unit 130 and input/output data (e.g., phonebook, messages, still and motion images, and the like) temporarily or semi-permanently. The storage unit 150 may also be capable of storing the data associated with the various vibration patterns and sound effects output upon detection of a touch gesture on the touchscreen.
  • The storage unit 150 may also be capable of storing a set of spatial interactions mapping the spatial gestures to the corresponding actions to be performed.
  • The storage unit 150 can be implemented with at least one of flash memory, hard disk, micro multimedia card (e.g. SD and xD memory cards), Random Access Memory (RAM), Static RAM (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disc, optical disc, and the like.
  • The display unit 160 may be capable of displaying (outputting) the information processed by the terminal 100 in the form of a UI or Graphical UI (GUI).
  • The display unit 160 can be implemented with one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT LCD), an Organic Light Emitting Diode (OLED), a flexible display, and a 3-Dimensional (3D) display.
  • The display unit 160 can be implemented as a touchscreen with an interlayered touch sensor 121 and/or proximity sensor 122. In this case, the display unit 160 may be capable of operating as an input device as well as an output device.
  • As described above, the terminal and spatial interaction-based terminal control method according to exemplary embodiments of the present invention are capable of controlling the terminal intuitively and conveniently with a spatial gesture of the user.
  • In addition, the terminal and spatial interaction-based terminal control method according to exemplary embodiments of the present invention are advantageous in controlling the terminal efficiently even without a direct touch to the terminal. For example, the terminal and method for controlling such a terminal according to exemplary embodiments of the present invention may be beneficial when a user is operating the terminal with a glove worn on the user's hand or when the terminal is stained with foreign materials.
  • It will be appreciated that exemplary embodiments of the present invention according to the claims and description in the specification can be realized in the form of hardware, software or a combination of hardware and software.
  • Any such software may be stored in a non-transitory computer readable storage medium. The computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present invention.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like. It will be appreciated that the storage devices and storage media are exemplary embodiments of machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement exemplary embodiments of the present invention. Accordingly, exemplary embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.
  • While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.
  • Embodiments according to the present disclosure are provided in the following clauses.
    • Clause 1. A method for controlling a terminal, the method comprising:
      • detecting a user gesture moving an object in a certain direction within proximity of the terminal;
      • determining at least one of movement direction, movement speed, and movement distance of the user gesture; and
      • performing a control operation associated with a currently running application according to the determined at least one of the movement direction, movement speed, and movement distance of the user gesture.
    • Clause 2. The method of clause 1, wherein the user gesture comprises:
      • approaching of the object into a predetermined range around the terminal; and
      • moving the object in a certain direction within the predetermined range.
    • Clause 3. The method of clause 1, wherein the user gesture is one of a unidirectional horizontal movement, a unidirectional vertical movement, a unidirectional sweep, a reciprocal movement, a bidirectional movement, and a wave movement.
    • Clause 4. The method of clause 1, wherein the performing of the control operation comprises turning at least one page of an electronic book in the movement direction.
    • Clause 5. The method of clause 4, wherein the turning of the at least one page of the electronic book comprises turning one or more pages or a chapter of the electronic book based on at least one of the movement speed and movement distance of the user gesture.
    • Clause 6. The method of clause 1, wherein the performing of the control operation comprises controlling a playback of a media file based on the user gesture.
    • Clause 7. The method of clause 6, wherein the controlling of the playback of the media file comprises at least one of playing a media file preceding or following the current media file, adjusting volume up or down, starting or ending playback, and shuffling files.
    • Clause 8. The method of clause 1, wherein the performing of the control operation comprises controlling at least one of moving a content of an application in an up, down, left, or right direction, expanding or shrinking the content of the application, copying the content of the application, printing the content of the application, and searching for the content of the application on a screen, in accordance with at least one of the movement direction, movement speed, and movement distance.
    • Clause 9. The method of clause 8, wherein the content includes at least one of a webpage, a list, an image, and an electronic book.
    • Clause 10. The method of clause 1, wherein the performing of the control operation comprises controlling an application of an effect corresponding to the user gesture on a screen of a display unit.
    • Clause 11. The method of clause 10, wherein the controlling of the application of the effect comprises applying at least one of a wave effect, a water drop effect, a wind effect, a color change effect, and a screen switching effect.
    • Clause 12. The method of clause 10, wherein the screen can be one of an idle screen, a lock screen, and a user interface screen.
    • Clause 13. The method of clause 1, wherein the performing of the control operation comprises navigating tabs of a web browser running on a background according to the user gesture.
    • Clause 14. The method of clause 1, wherein the performing of the control operation comprises switching among at least one of tabs of webpages and currently running applications.
    • Clause 15. The method of clause 14, wherein the switching among at least one of the tabs of the webpages and the currently running applications comprises:
      • determining a tab or background running application corresponding to the movement direction on a list of the webpages or the currently running applications; and
      • switching to the determined webpage or background running application to be displayed on the screen.
    • Clause 16. A terminal comprising:
      • a sensing unit which detects a user gesture moving an object in a certain direction within proximity of the terminal;
      • a control unit which determines at least one of movement direction, movement speed, and movement distance of the user gesture and performs a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture; and
      • a display unit which displays an execution screen of the application under the control of the control unit.
    • Clause 17. The terminal of clause 16, wherein the user gesture comprises:
      • approaching of the object into a predetermined range around the terminal; and
      • moving the object in a certain direction within the predetermined range.
    • Clause 18. The terminal of clause 16, wherein the control unit controls at least one of page-turning, media playback, content movement, web browser tab switching, application switching, and visual effect application on screen.
    • Clause 19. The terminal of clause 16, wherein the sensing unit detects the user gesture using at least one of a proximity sensor, an infrared sensor, and a camera sensor; and
      wherein the sensing unit generates an input signal corresponding to the user gesture to the control unit.
    • Clause 20. The terminal of clause 16, wherein the user gesture corresponds to one of a unidirectional horizontal movement, a unidirectional vertical movement, a unidirectional sweep, a reciprocal movement, a bidirectional movement, a wave movement, and a hovering event.
    • Clause 21. A non-transitory computer-readable storage medium storing instructions that when executed cause at least one processor to perform the method of clause 1.

Claims (15)

  1. A method for controlling an electronic device, the method comprising:
    acquiring an input image using a camera;
    analyzing a user gesture in the input image, by:
    identifying a user's hand disposed in a predefined configuration, and
    determining a shape of the hand is changed to a predetermined shape by a movement of the at least part of the hand, and
    executing a camera function corresponding to the user gesture.
  2. The method of claim 1, wherein the executing the camera function comprises:
    activating a timer to capture an image using the camera based on the user gesture; and
    in response to the timer expiring, capturing the image using the camera.
  3. The method of claim 2, wherein activating the timer comprises setting the timer for a predetermined time.
  4. The method of claim 1, wherein determining whether the shape of the hand is changed to the predetermined shape comprises:
    determining coordinates of the at least part of the hand corresponds to the predetermined shape.
  5. The method of claim 1, further comprising, in response to the hand being disposed in the predefined configuration, extracting the area including the hand.
  6. The method of claim 5, further comprising, in response to at least one finger of the hand being substantially outstretched relative to a palm of the hand, determining that the hand is disposed in the predefined configuration.
  7. The method of claim 1, wherein determining whether the shape of the hand is changed to the predetermined shape comprises detecting a movement of at least one finger of the hand.
  8. The method of claim 7, wherein determining whether the shape of the hand is changed to the predetermined shape comprises tracking a coordinate of at least one point of the at least one finger.
  9. The method of claim 1, wherein determining whether the shape of the hand is changed to the predetermined shape comprises determining whether the shape of the hand is changed to a clenched fist.
  10. An electronic device comprising:
    a camera; and
    at least one processor configured to:
    acquire an input image using the camera,
    analyze a user gesture in the input image, by:
    identifying a user's hand disposed in a predefined configuration, and
    determining a shape of the hand is changed to a predetermined shape by a movement of the at least part of the hand, and
    executing a camera function corresponding to the user gesture.
  11. The electronic device of claim 10, wherein the at least one processor is configured to:
    activate a timer to capture an image using the camera based on the user gesture; and
    in response to the timer expiring, capture the image using the camera.
  12. The electronic device of claim 10, wherein the at least one processor is configured to determine coordinates of the at least part of the hand corresponds to the predetermined shape.
  13. The electronic device of claim 10, wherein the at least one processor is further configured to, in response to the hand being disposed in the predefined configuration, extract the area including the hand.
  14. The electronic device of claim 10, wherein the at least one processor is further configured to detect a movement of at least one finger of the hand.
  15. The electronic device of claim 10, wherein the at least one processor is further configured to track a coordinate of at least one point of the at least one finger.
EP20185022.9A 2012-05-04 2013-05-02 Terminal and method for controlling the same based on spatial interaction Withdrawn EP3742263A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR1020120047678A KR20130124061A (en) 2012-05-04 2012-05-04 Control method of terminal by using spatial interaction
KR1020120132222A KR20130124138A (en) 2012-11-21 2012-11-21 Control method of terminal by using spatial interaction
KR1020120132223A KR20130124139A (en) 2012-11-21 2012-11-21 Control method of terminal by using spatial interaction
KR1020120137740A KR20130124143A (en) 2012-11-30 2012-11-30 Control method of terminal by using spatial interaction
EP13166321.3A EP2660682A3 (en) 2012-05-04 2013-05-02 Terminal and method for controlling the same based on spatial interaction

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
EP13166321.3A Division EP2660682A3 (en) 2012-05-04 2013-05-02 Terminal and method for controlling the same based on spatial interaction

Publications (1)

Publication Number Publication Date
EP3742263A1 true EP3742263A1 (en) 2020-11-25

Family

ID=48227029

Family Applications (2)

Application Number Title Priority Date Filing Date
EP20185022.9A Withdrawn EP3742263A1 (en) 2012-05-04 2013-05-02 Terminal and method for controlling the same based on spatial interaction
EP13166321.3A Withdrawn EP2660682A3 (en) 2012-05-04 2013-05-02 Terminal and method for controlling the same based on spatial interaction

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP13166321.3A Withdrawn EP2660682A3 (en) 2012-05-04 2013-05-02 Terminal and method for controlling the same based on spatial interaction

Country Status (5)

Country Link
US (3) US20130293454A1 (en)
EP (2) EP3742263A1 (en)
JP (1) JP2013235588A (en)
CN (2) CN103383598A (en)
AU (1) AU2013205613B2 (en)

Families Citing this family (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI463371B (en) * 2012-06-20 2014-12-01 Pixart Imaging Inc Gesture detection apparatus and method for determining continuous gesture depending on velocity
CN103713735B (en) * 2012-09-29 2018-03-16 华为技术有限公司 A kind of method and apparatus that terminal device is controlled using non-contact gesture
US20140267094A1 (en) * 2013-03-13 2014-09-18 Microsoft Corporation Performing an action on a touch-enabled device based on a gesture
US20140368737A1 (en) 2013-06-17 2014-12-18 Spotify Ab System and method for playing media during navigation between media streams
US9430045B2 (en) * 2013-07-17 2016-08-30 Lenovo (Singapore) Pte. Ltd. Special gestures for camera control and image processing operations
US20150033121A1 (en) * 2013-07-26 2015-01-29 Disney Enterprises, Inc. Motion based filtering of content elements
US10110649B2 (en) 2013-08-01 2018-10-23 Spotify Ab System and method for transitioning from decompressing one compressed media stream to decompressing another media stream
WO2015022498A1 (en) * 2013-08-15 2015-02-19 Elliptic Laboratories As Touchless user interfaces
US9785241B2 (en) * 2013-08-26 2017-10-10 Paypal, Inc. Gesture identification
US9917869B2 (en) 2013-09-23 2018-03-13 Spotify Ab System and method for identifying a segment of a file that includes target content
US9529888B2 (en) 2013-09-23 2016-12-27 Spotify Ab System and method for efficiently providing media and associated metadata
KR20150043149A (en) * 2013-10-14 2015-04-22 삼성전자주식회사 Method for controlling digital apparatus and photographing method by recognition of hand shape, and apparatus thereof
US10152136B2 (en) * 2013-10-16 2018-12-11 Leap Motion, Inc. Velocity field interaction for free space gesture interface and control
US9063640B2 (en) 2013-10-17 2015-06-23 Spotify Ab System and method for switching between media items in a plurality of sequences of media items
KR102063349B1 (en) * 2013-11-19 2020-01-08 엘지디스플레이 주식회사 Display device and driving method thereof
CN104679230B (en) * 2013-12-03 2018-05-18 华为终端(东莞)有限公司 A kind of method and terminal of contactless input information
CN103645897A (en) * 2013-12-05 2014-03-19 深圳市中兴移动通信有限公司 Mobile terminal and operation method thereof
US9891712B2 (en) 2013-12-16 2018-02-13 Leap Motion, Inc. User-defined virtual interaction space and manipulation of virtual cameras with vectors
CN104063047B (en) * 2014-01-09 2018-02-02 朱明甫 Remote control and its page turning method based on gesture remote control to e-book page turning
KR20150087544A (en) 2014-01-22 2015-07-30 엘지이노텍 주식회사 Gesture device, operating method thereof and vehicle having the same
KR20160114158A (en) * 2014-01-28 2016-10-04 후아웨이 디바이스 컴퍼니 리미티드 Terminal equipment processing method and terminal equipment
GB2523132A (en) * 2014-02-13 2015-08-19 Nokia Technologies Oy An apparatus and associated methods for controlling content on a display user interface
WO2015126392A1 (en) * 2014-02-20 2015-08-27 Hewlett-Packard Development Company, L.P. Emulating a user performing spatial gestures
CN103838496B (en) * 2014-02-20 2020-01-31 联想(北京)有限公司 method for controlling electronic equipment and electronic equipment
CN104866070A (en) * 2014-02-20 2015-08-26 联想(北京)有限公司 Method for information processing and electronic equipment
US10228768B2 (en) 2014-03-25 2019-03-12 Analog Devices, Inc. Optical user interface
CN104978133A (en) * 2014-04-04 2015-10-14 阿里巴巴集团控股有限公司 Screen capturing method and screen capturing device for intelligent terminal
CN105021204B (en) * 2014-04-24 2018-10-16 深圳市赛格导航科技股份有限公司 A kind of contactless onboard navigation system and air navigation aid
KR102251541B1 (en) * 2014-06-23 2021-05-14 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10067570B2 (en) 2014-06-30 2018-09-04 Clarion Co., Ltd. Non-contact operation detection device
JP5682899B1 (en) * 2014-08-04 2015-03-11 求 藤川 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, ADJUSTING DEVICE, ADJUSTING DEVICE CONTROL METHOD, SETTING DEVICE, SETTING DEVICE CONTROL METHOD, DEVICE PROGRAM
CN104219786B (en) * 2014-08-21 2019-05-14 深圳市金立通信设备有限公司 A kind of method of terminal connection
CN104243692A (en) * 2014-08-21 2014-12-24 深圳市金立通信设备有限公司 Terminal connecting method
CN104219384B (en) * 2014-08-21 2018-03-13 深圳市金立通信设备有限公司 A kind of terminal
CN104243693A (en) * 2014-08-21 2014-12-24 深圳市金立通信设备有限公司 Terminal
JP6519074B2 (en) * 2014-09-08 2019-05-29 任天堂株式会社 Electronics
CN105487648A (en) * 2014-09-16 2016-04-13 深圳富泰宏精密工业有限公司 Software control system and method with lens-adding function
CN105717982B (en) * 2014-12-03 2019-03-08 联想(北京)有限公司 Information processing method and electronic equipment
WO2016104266A1 (en) * 2014-12-24 2016-06-30 ソニー株式会社 Electronic device and method for controlling electronic device
DE102015201613A1 (en) * 2015-01-30 2016-08-04 Robert Bosch Gmbh Method and device for operating an input device, input device
JP6519075B2 (en) * 2015-02-10 2019-05-29 任天堂株式会社 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
US10477090B2 (en) * 2015-02-25 2019-11-12 Kyocera Corporation Wearable device, control method and non-transitory storage medium
JP5869711B1 (en) * 2015-04-07 2016-02-24 株式会社メディアドゥ Content display device, content display program, and content display method
CN104765278B (en) * 2015-04-20 2019-03-22 宇龙计算机通信科技(深圳)有限公司 A kind of intelligent home equipment control method and electronic equipment
WO2016172846A1 (en) * 2015-04-28 2016-11-03 华为技术有限公司 Blowing action-based method for operating mobile terminal and mobile terminal
CN107967116B (en) 2015-06-16 2019-07-19 深圳市腾讯计算机系统有限公司 The method and apparatus, computer equipment of lock onto target in scene of game
CN104978131B (en) * 2015-06-18 2018-09-18 Tcl移动通信科技(宁波)有限公司 A kind of e-book page turning detection method and system based on camera image
US10769402B2 (en) * 2015-09-09 2020-09-08 Thales Dis France Sa Non-contact friction ridge capture device
CN105242599A (en) * 2015-09-30 2016-01-13 南京邮电大学 Music wireless control system based on gesture tracking sensor, and method thereof
CN105487662A (en) * 2015-11-27 2016-04-13 努比亚技术有限公司 Increasing and decreasing adjustment method and device of mobile terminal
CN105491206B (en) * 2015-11-30 2018-04-06 上海斐讯数据通信技术有限公司 It is a kind of closely to sense unlocking method and system, smart machine
CN105872691A (en) * 2015-12-14 2016-08-17 乐视致新电子科技(天津)有限公司 Method and device for controlling browser
CN105718056B (en) * 2016-01-19 2019-09-10 北京小米移动软件有限公司 Gesture identification method and device
CN105975072A (en) * 2016-04-29 2016-09-28 乐视控股(北京)有限公司 Method, device and system for identifying gesture movement
US20170351336A1 (en) * 2016-06-07 2017-12-07 Stmicroelectronics, Inc. Time of flight based gesture control devices, systems and methods
CN106126029B (en) * 2016-06-24 2019-02-22 维沃移动通信有限公司 A kind of background picture switching method and mobile terminal
CN106249899A (en) * 2016-08-15 2016-12-21 珠海格力电器股份有限公司 A kind of gesture recognition system, electrical equipment and control method thereof
CN106406524A (en) * 2016-09-05 2017-02-15 珠海格力电器股份有限公司 Device control method and device
CN106484105A (en) * 2016-09-20 2017-03-08 努比亚技术有限公司 A kind of method and device realizing gesture identification
CN106502383A (en) * 2016-09-21 2017-03-15 努比亚技术有限公司 A kind of information processing method and mobile terminal
CN106371759A (en) * 2016-09-29 2017-02-01 努比亚技术有限公司 Velocity recognition system based on proximity sensor and mobile terminal
CN106603811A (en) * 2016-10-31 2017-04-26 努比亚技术有限公司 Terminal silencing control device and method
US10775998B2 (en) * 2017-01-04 2020-09-15 Kyocera Corporation Electronic device and control method
US20190369807A1 (en) * 2017-02-13 2019-12-05 Sony Corporation Information processing device, information processing method, and program
CN107046597B (en) * 2017-02-22 2019-11-26 Tcl移动通信科技(宁波)有限公司 A kind of volume setting processing method and mobile terminal based on mobile terminal
JP6293953B1 (en) * 2017-04-04 2018-03-14 京セラ株式会社 Electronic device, program, and control method
CN107155009A (en) * 2017-07-13 2017-09-12 彭声强 A kind of mobile phone photograph method
CN108196666A (en) * 2017-09-28 2018-06-22 努比亚技术有限公司 A kind of method of terminal control, terminal and computer readable storage medium
US20190179499A1 (en) * 2017-10-20 2019-06-13 Qurika Inc. Systems and methods for providing a digital content reader and collection display application implementing 3d graphical user interfaces and interactions
CN108632411A (en) * 2018-05-11 2018-10-09 Oppo广东移动通信有限公司 Electronic device
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
CN108989553A (en) * 2018-06-29 2018-12-11 北京微播视界科技有限公司 The method, apparatus and electronic equipment of scene manipulation
JP7244231B2 (en) * 2018-07-27 2023-03-22 京セラ株式会社 ELECTRONIC DEVICE, CONTROL PROGRAM AND DISPLAY CONTROL METHOD
JP7034856B2 (en) * 2018-07-27 2022-03-14 京セラ株式会社 Electronic devices, control programs and display control methods
CN109343754A (en) * 2018-08-27 2019-02-15 维沃移动通信有限公司 A kind of image display method and terminal
USD930676S1 (en) * 2018-09-07 2021-09-14 Samsung Display Co., Ltd. Display device with generated image for display
JP2020052681A (en) * 2018-09-26 2020-04-02 シュナイダーエレクトリックホールディングス株式会社 Operation processing device
JP7280032B2 (en) * 2018-11-27 2023-05-23 ローム株式会社 input devices, automobiles
CN110069137B (en) * 2019-04-30 2022-07-08 徐州重型机械有限公司 Gesture control method, control device and control system
CN117784927A (en) * 2019-08-19 2024-03-29 华为技术有限公司 Interaction method of air-separation gestures and electronic equipment
CN111524157B (en) * 2020-04-26 2022-07-01 南瑞集团有限公司 Touch screen object analysis method and system based on camera array and storage medium
CN112270302A (en) * 2020-11-17 2021-01-26 支付宝(杭州)信息技术有限公司 Limb control method and device and electronic equipment
JP2022087700A (en) * 2020-12-01 2022-06-13 京セラドキュメントソリューションズ株式会社 Electronic device and image forming apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040196400A1 (en) * 2003-04-07 2004-10-07 Stavely Donald J. Digital camera user interface using hand gestures
US20050094019A1 (en) * 2003-10-31 2005-05-05 Grosvenor David A. Camera control
US20100013943A1 (en) * 2008-07-18 2010-01-21 Sony Ericsson Mobile Communications Ab Arrangement and method relating to an image recording device
US20100199232A1 (en) * 2009-02-03 2010-08-05 Massachusetts Institute Of Technology Wearable Gestural Interface

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2986455B1 (en) * 1998-07-24 1999-12-06 株式会社エイ・ティ・アール知能映像通信研究所 Hand gesture recognition device
US8381135B2 (en) * 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
JP4717539B2 (en) * 2005-07-26 2011-07-06 キヤノン株式会社 Imaging apparatus and imaging method
DE102006037156A1 (en) * 2006-03-22 2007-09-27 Volkswagen Ag Interactive operating device and method for operating the interactive operating device
GB0612624D0 (en) * 2006-06-26 2006-08-02 Symbian Software Ltd Speed of gesture
US20080040692A1 (en) * 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US7877707B2 (en) * 2007-01-06 2011-01-25 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US8219936B2 (en) * 2007-08-30 2012-07-10 Lg Electronics Inc. User interface for a mobile device using a user's gesture in the proximity of an electronic device
CN101465961B (en) * 2007-12-19 2013-10-30 神基科技股份有限公司 Photographic device and method capable of controlling shutter through characteristic image identification
US9772689B2 (en) * 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
JP2009260907A (en) * 2008-03-18 2009-11-05 Nec Corp Portable terminal apparatus, terminal operation method, program, and recording medium
US8576181B2 (en) * 2008-05-20 2013-11-05 Lg Electronics Inc. Mobile terminal using proximity touch and wallpaper controlling method thereof
US9197738B2 (en) * 2008-12-04 2015-11-24 Microsoft Technology Licensing, Llc Providing selected data through a locked display
US8499251B2 (en) * 2009-01-07 2013-07-30 Microsoft Corporation Virtual page turn
JP2010170166A (en) * 2009-01-20 2010-08-05 Sony Corp Apparatus, method and program for processing information
JP2010244132A (en) * 2009-04-01 2010-10-28 Mitsubishi Electric Corp User interface device with touch panel, method and program for controlling user interface
JP4840620B2 (en) * 2009-04-30 2011-12-21 株式会社デンソー In-vehicle electronic device operation device
US8619029B2 (en) * 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
JP2010277197A (en) * 2009-05-26 2010-12-09 Sony Corp Information processing device, information processing method, and program
JP5282661B2 (en) * 2009-05-26 2013-09-04 ソニー株式会社 Information processing apparatus, information processing method, and program
US20110022307A1 (en) * 2009-07-27 2011-01-27 Htc Corporation Method for operating navigation frame, navigation apparatus and recording medium
JP2011134271A (en) * 2009-12-25 2011-07-07 Sony Corp Information processor, information processing method, and program
CN101742114A (en) * 2009-12-31 2010-06-16 上海量科电子科技有限公司 Method and device for determining shooting operation through gesture identification
US10007393B2 (en) * 2010-01-19 2018-06-26 Apple Inc. 3D view of file structure
JP5413673B2 (en) * 2010-03-08 2014-02-12 ソニー株式会社 Information processing apparatus and method, and program
KR20110109551A (en) * 2010-03-31 2011-10-06 삼성전자주식회사 Touch screen device and method for processing input of the same
US8457353B2 (en) * 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US9086956B2 (en) * 2010-05-21 2015-07-21 Zetta Research and Development—RPO Series Methods for interacting with an on-screen document
JP5601045B2 (en) * 2010-06-24 2014-10-08 ソニー株式会社 Gesture recognition device, gesture recognition method and program
KR20120024247A (en) * 2010-09-06 2012-03-14 삼성전자주식회사 Method for operating a mobile device by recognizing a user gesture and the mobile device thereof
CN102055844B (en) * 2010-11-15 2013-05-15 惠州Tcl移动通信有限公司 Method for realizing camera shutter function by means of gesture recognition and handset device
US8982045B2 (en) * 2010-12-17 2015-03-17 Microsoft Corporation Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
CN201945948U (en) * 2011-03-10 2011-08-24 中科芯集成电路股份有限公司 Non-contact gesture page-turning demonstration system based on image detection
CN102142084B (en) * 2011-05-06 2012-12-26 北京网尚数字电影院线有限公司 Method for gesture recognition
US20120304132A1 (en) * 2011-05-27 2012-11-29 Chaitanya Dev Sareen Switching back to a previously-interacted-with application
CN102420942A (en) * 2011-11-28 2012-04-18 康佳集团股份有限公司 Photograph device and photograph control method based on same
US20130155237A1 (en) * 2011-12-16 2013-06-20 Microsoft Corporation Interacting with a mobile device within a vehicle using gestures
US8811938B2 (en) * 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US10209881B2 (en) * 2012-03-15 2019-02-19 Ibrahim Farid Cherradi El Fadili Extending the free fingers typing technology and introducing the finger taps language technology

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040196400A1 (en) * 2003-04-07 2004-10-07 Stavely Donald J. Digital camera user interface using hand gestures
US20050094019A1 (en) * 2003-10-31 2005-05-05 Grosvenor David A. Camera control
US20100013943A1 (en) * 2008-07-18 2010-01-21 Sony Ericsson Mobile Communications Ab Arrangement and method relating to an image recording device
US20100199232A1 (en) * 2009-02-03 2010-08-05 Massachusetts Institute Of Technology Wearable Gestural Interface

Also Published As

Publication number Publication date
EP2660682A3 (en) 2017-04-26
US20180107282A1 (en) 2018-04-19
EP2660682A2 (en) 2013-11-06
AU2013205613B2 (en) 2017-12-21
AU2013205613A1 (en) 2013-11-21
CN110647244A (en) 2020-01-03
US20200257373A1 (en) 2020-08-13
US20130293454A1 (en) 2013-11-07
JP2013235588A (en) 2013-11-21
CN103383598A (en) 2013-11-06

Similar Documents

Publication Publication Date Title
US20200257373A1 (en) Terminal and method for controlling the same based on spatial interaction
US11320931B2 (en) Swipe-based confirmation for touch sensitive devices
US9239674B2 (en) Method and apparatus for providing different user interface effects for different implementation characteristics of a touch event
EP3028123B1 (en) Electronic device and method of recognizing input in electronic device
KR101984154B1 (en) Control method for terminal using touch and gesture input and terminal thereof
US9030430B2 (en) Multi-touch navigation mode
US8963865B2 (en) Touch sensitive device with concentration mode
US20140380244A1 (en) Visual table of contents for touch sensitive devices
US10042445B1 (en) Adaptive display of user interface elements based on proximity sensing
KR20130124061A (en) Control method of terminal by using spatial interaction
KR20130124139A (en) Control method of terminal by using spatial interaction
US9256360B2 (en) Single touch process to achieve dual touch user interface
KR102194778B1 (en) Control method of terminal by using spatial interaction
KR20130124143A (en) Control method of terminal by using spatial interaction
KR102061254B1 (en) Control method of terminal by using spatial interaction
KR20130124138A (en) Control method of terminal by using spatial interaction
KR20200143346A (en) Control method of terminal by using spatial interaction
US8823665B2 (en) Handheld electronic device and frame control method of digital information thereof
KR20150044081A (en) Control method of display device
CN104423627A (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
PUAI Public reference made under Article 153(3) EPC to a published international application that has entered the European phase
Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an EP patent application or granted EP patent
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AC Divisional application: reference to earlier application
Ref document number: 2660682
Country of ref document: EP
Kind code of ref document: P

AK Designated contracting states
Kind code of ref document: A1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an EP patent application or granted EP patent
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn
Effective date: 20210526