WO2012126426A2 - Contactless gesture control method and apparatus - Google Patents

Contactless gesture control method and apparatus

Info

Publication number
WO2012126426A2
WO2012126426A2 (PCT/CN2012/075798)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture action
fingers
gesture
index finger
hand
Prior art date
Application number
PCT/CN2012/075798
Other languages
English (en)
French (fr)
Other versions
WO2012126426A3 (zh)
Inventor
李英涛
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to EP12759930.6A priority Critical patent/EP2631739B1/en
Priority to CN2012800011749A priority patent/CN103229127A/zh
Priority to EP14190503.4A priority patent/EP2853989A1/en
Priority to PCT/CN2012/075798 priority patent/WO2012126426A2/zh
Publication of WO2012126426A2 publication Critical patent/WO2012126426A2/zh
Publication of WO2012126426A3 publication Critical patent/WO2012126426A3/zh
Priority to US13/875,476 priority patent/US8866781B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • The invention belongs to the technical field of human-computer interaction, and in particular relates to a contactless gesture control method and apparatus.
  • Controlling a device through gestures is one form of human-computer interaction. Compared with a traditional graphical user interface, gesture control does not require the user to hold a specific input device; specific hand motions alone suffice to control the device or to input specific information into it.
  • Existing gesture control falls mainly into two categories. One is based on the position information of the gesture: spatial movement of a finger is mapped to movement of the relevant elements on the display screen. The other is based on the posture information of the gesture: various complex gestures formed by the human hand correspond to different control commands for the device.
  • Prior art 1 defines a corresponding gesture for each of a series of operations, such as selecting, copying, pasting, moving, deleting, and switching, on a specific display object (such as an icon, box, or scroll bar) on the display screen.
  • For example, the gesture for copying a display object is a single-finger tap on the object once it is selected, and the gesture for pasting a display object is a quick single-finger double tap.
  • Prior art 1 has the following drawback: the operation instructions for display objects are many and varied, so the user must memorize too many gesture rules, which are difficult to learn and remember.
  • Prior art 2 converts the basic mouse operations (left click, right click, cursor movement, and so on) into gesture definitions.
  • Unlike prior art 1, prior art 2 does not define specific operations on display objects; instead, it implements all specific operations by defining mouse-like operations (for example, a specific operation can be selected from the right-click menu).
  • Prior art 2 has the following drawback: the gesture actions for the mouse-like operations differ greatly from one another, so the user must switch postures frequently when performing a series of operations.
  • For example, to select a display object and then open its right-click menu, the user must first move the cursor by making a fist with the thumb extended, then change the hand to extend the thumb and index finger, and then move the thumb.
  • This series of posture switches is complicated and impairs the convenience and smoothness of operation.
  • An object of the embodiments of the present invention is to provide a contactless gesture control method, to solve the prior-art problems of too many gesture rules that are difficult to learn and memorize, and of complicated operation with poor fluency.
  • A contactless gesture control method includes: acquiring a gesture action of a user; acquiring, according to a pre-stored mapping relationship between gesture actions and control instructions, a control instruction corresponding to the gesture action; and executing the control instruction.
  • The gesture action includes: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
  • An embodiment of the invention further provides a contactless gesture control apparatus, the apparatus comprising:
  • a gesture action acquisition unit, configured to acquire a gesture action of the user;
  • a control instruction acquisition unit, configured to acquire, according to a pre-stored mapping relationship between gesture actions and control instructions, a control instruction corresponding to the gesture action acquired by the gesture action acquisition unit;
  • a control instruction execution unit, configured to execute the control instruction acquired by the control instruction acquisition unit;
  • wherein the gesture action includes: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
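The five basic gesture actions listed above can be captured in a small enumeration. This is an illustrative sketch and not part of the patent; all names are mine, and the circling gesture is split into its clockwise and counterclockwise variants because the mapping tables later in the document distinguish them.

```python
from enum import Enum, auto

class Gesture(Enum):
    """The basic gesture actions described in the text (names are illustrative)."""
    INDEX_CIRCLE_CW = auto()     # index finger circles clockwise, other four fingers in a fist
    INDEX_CIRCLE_CCW = auto()    # index finger circles counterclockwise, other four fingers in a fist
    PALM_TOWARD_DEVICE = auto()  # five fingers together, palm moves toward the device
    INDEX_FREE_MOVE = auto()     # thumb/index/middle extended, index finger moves freely
    THUMB_UP_DOWN = auto()       # thumb/index/middle extended, thumb moves up and down
    MIDDLE_UP_DOWN = auto()      # thumb/index/middle extended, middle finger moves up and down
```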
  • Compared with the prior art, the embodiments of the present invention obtain the following beneficial effects. First, there are few gesture rules, so they are easy for the user to learn and memorize: compared with prior-art solution 1, the embodiment greatly reduces the number of gesture rules, and by remembering just three basic gesture actions the user can complete most operations as if using a mouse. Second, operation is simple and smooth: compared with prior-art solution 2, the basic hand posture is the same for all three basic gestures (thumb, index finger, and middle finger extended, the other two fingers held together), so during continuous operation the user can switch smoothly between gestures without changing the basic posture; ease of use and practicability are both strong.
  • FIG. 1 is a flowchart of an implementation of a contactless gesture control method according to an embodiment of the present invention;
  • FIG. 2 (2a–2f) shows example gestures provided by an embodiment of the present invention;
  • FIG. 3 is a structural diagram of a contactless gesture control apparatus according to an embodiment of the present invention.
  • FIG. 1 shows the implementation flow of a contactless gesture control method according to an embodiment of the present invention, detailed as follows.
  • In step S101, a gesture action of the user is acquired.
  • The gesture action of the user may be acquired in various manners, including but not limited to collecting a gesture image of the user through an image collector and extracting the gesture action from the gesture image, or acquiring the user's gesture action through a reflective posture tracker, an infrared posture tracker, or an ultrasonic posture tracker.
  • The gesture action includes but is not limited to: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
  • In step S102, a control instruction corresponding to the gesture action is acquired according to a pre-stored mapping relationship between gesture actions and control instructions.
  • Before the control instruction corresponding to the gesture action is acquired, the method further includes creating and storing the mapping relationship between gesture actions and control instructions.
  • Preferably, the gesture actions include right-hand gesture actions and left-hand gesture actions.
  • Creating and storing the mapping relationship between gesture actions and control instructions specifically includes:
  • acquiring N gesture actions of the user, where a gesture action includes the motion track of the hand and the hand postures at the start of the track (when the hand enters a preset target area) and at the end of the track (when the hand leaves the preset target area, or when the hand stays in a sub-region of the target area longer than a preset time value);
  • selecting N control instructions from pre-edited control instructions and establishing a one-to-one correspondence with the gesture actions;
  • storing the correspondence, where N is greater than or equal to 1.
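The create-select-store procedure just described (N gesture actions paired one-to-one with N pre-edited control instructions, N ≥ 1) can be sketched as follows. This is a minimal illustration; the function name and the gesture/instruction identifiers are assumptions, not taken from the patent.

```python
def create_mapping(gestures, instructions):
    """Pair N acquired gesture actions with N pre-edited control instructions,
    one-to-one, and return the stored correspondence (N >= 1)."""
    if not (len(gestures) == len(instructions) >= 1):
        raise ValueError("need N >= 1 gestures and the same number of instructions")
    # One-to-one correspondence, stored as a simple lookup table.
    return dict(zip(gestures, instructions))

# Example correspondence for three of the gestures described in the text.
mapping = create_mapping(
    ["index_circle_cw", "palm_toward_device", "thumb_up_down"],
    ["confirm", "return", "left_mouse_click"],
)
```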
  • The mapping relationship between right-hand gesture actions and control instructions includes but is not limited to at least one of the following:
  • when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, when “confirm” and “negate” options are displayed on the screen, a clockwise circle corresponds to “confirm” (as shown in FIG. 2a) and a counterclockwise circle corresponds to “negate” (as shown in FIG. 2b);
  • when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”, for example returning to the previous page or the previous menu (as shown in FIG. 2c);
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger” (as shown in FIG. 2d);
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “left mouse click” (as shown in FIG. 2e);
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “right mouse click” (as shown in FIG. 2f).
  • The mapping relationship between left-hand gesture actions and control instructions includes but is not limited to at least one of the following:
  • when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, a counterclockwise circle corresponds to “confirm” and a clockwise circle corresponds to “negate”;
  • when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “right mouse click”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “left mouse click”.
  • In step S103, the control instruction is executed.
  • The system reads the control instruction and executes it to perform the corresponding function; for example, the system reads the “return” instruction and performs the function of returning to the previous page or the previous menu.
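Steps S101–S103 amount to an acquire, look-up, execute loop. Below is a minimal sketch with the acquisition and execution back-ends stubbed out; all function names and the placeholder mapping are assumptions for illustration only.

```python
# Pre-stored mapping between gesture actions and control instructions (S102's input).
MAPPING = {
    "palm_toward_device": "return",
    "thumb_up_down": "left_mouse_click",
    "middle_up_down": "right_mouse_click",
}

def acquire_gesture():
    # S101: in a real system this would come from an image collector or a
    # reflective/infrared/ultrasonic posture tracker.
    return "palm_toward_device"

def execute(instruction):
    # S103: e.g. "return" goes back to the previous page or menu.
    return f"executed {instruction}"

def control_loop_once():
    gesture = acquire_gesture()         # S101: acquire the user's gesture action
    instruction = MAPPING.get(gesture)  # S102: look up the pre-stored mapping
    if instruction is None:
        return "no matching instruction"  # an unrecognized gesture is ignored
    return execute(instruction)         # S103: execute the control instruction
```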
  • The gesture actions in this embodiment of the present invention default to right-hand gesture actions.
  • Compared with the prior art, the embodiments of the present invention obtain the following beneficial effects. First, there are few gesture rules, so they are easy for the user to learn and memorize: compared with prior-art solution 1, the embodiment greatly reduces the number of gesture rules, and by remembering just three basic gesture actions the user can complete most operations as if using a mouse. Second, operation is simple and smooth: compared with prior-art solution 2, the basic hand posture is the same for all three basic gestures (thumb, index finger, and middle finger extended, the other two fingers held together), so during continuous operation the user can switch smoothly between gestures without changing the basic posture. In addition, the definition of the gesture actions is consistent with user habits.
  • For example, the left click defined in this embodiment is a thumb movement (for the right hand) and the right click is a middle-finger movement (for the right hand); this intuitive left-right mapping matches user habits, making the gestures easy to learn and remember, and both easy to use and practical.
  • As another embodiment of the present invention, to satisfy the needs of different users (those who habitually use the right hand or the left hand) and to further enhance ease of use, the method further includes, after step S101 and before step S102:
  • detecting whether the gesture action is a right-hand gesture action or a left-hand gesture action, and when it is a right-hand gesture action, acquiring the corresponding control instruction according to the mapping relationship between right-hand gesture actions and control instructions;
  • when it is a left-hand gesture action, acquiring the corresponding control instruction according to the mapping relationship between left-hand gesture actions and control instructions.
  • Detecting whether the gesture action is a right-hand or a left-hand gesture action specifically comprises: setting a left target region and a right target region in advance (the boundary line may be defined by the center point between the two eyes of the face, or by the nose position, with the left part as the left target region and the right part as the right target region);
  • and detecting whether the gesture is in the left target region or the right target region: if it is in the left target region it is a left-hand gesture action, otherwise it is a right-hand gesture action.
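The region test just described reduces to comparing the gesture's horizontal position with the face midline. Below is a sketch under the assumption of image coordinates with x increasing to the right; the function name is mine, and a mirrored camera feed would swap the two regions.

```python
def detect_hand(gesture_x: float, face_midline_x: float) -> str:
    """Classify a gesture as left- or right-hand by which preset target region
    it falls in: left of the face midline -> left target region -> left hand."""
    return "left" if gesture_x < face_midline_x else "right"
```

In practice `face_midline_x` would come from a face detector (center point between the eyes, or the nose position, as the text suggests).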
  • FIG. 3 shows the composition of a contactless gesture control apparatus according to Embodiment 2 of the present invention. For convenience of description, only the parts related to this embodiment are shown.
  • The contactless gesture control apparatus may be a software unit, a hardware unit, or a unit combining software and hardware running on any of various electronic devices (including mobile phones, phablets, desktop computers, tablet computers, televisions, refrigerators, washing machines, air conditioners, digital cameras, surveillance cameras, medical electronic instruments, and the like).
  • The contactless gesture control apparatus 3 includes a gesture action acquisition unit 31, a control instruction acquisition unit 32, and a control instruction execution unit 33. The specific functions of each unit are as follows.
  • The gesture action acquisition unit 31 is configured to acquire a gesture action of the user. Preferably, the gesture action acquisition unit 31 is specifically configured to collect a gesture image of the user through an image collector and extract the gesture action from the gesture image, or to acquire the user's gesture action through a reflective posture tracker, an infrared posture tracker, or an ultrasonic posture tracker.
  • The control instruction acquisition unit 32 is configured to acquire, according to the pre-stored mapping relationship between gesture actions and control instructions, the control instruction corresponding to the gesture action acquired by the gesture action acquisition unit 31.
  • The control instruction execution unit 33 is configured to execute the control instruction acquired by the control instruction acquisition unit 32.
  • The gesture action includes but is not limited to: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
  • Further, the apparatus 3 includes:
  • a mapping relationship creation unit 34, configured to create and store the mapping relationship between gesture actions and control instructions before the control instruction acquisition unit 32 acquires the control instruction corresponding to the gesture action acquired by the gesture action acquisition unit 31, the gesture actions including right-hand gesture actions and left-hand gesture actions;
  • a detection unit 35, configured to detect whether the gesture action is a right-hand gesture action or a left-hand gesture action after the gesture action acquisition unit 31 acquires the gesture action and before the control instruction acquisition unit 32 acquires the corresponding control instruction.
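The three core units of the apparatus (31, 32, 33) can be sketched as cooperating classes. This is an illustrative structure only; the class and method names are assumptions, and the acquisition step is stubbed to read from a pre-extracted frame.

```python
class GestureActionAcquisitionUnit:        # unit 31
    def acquire(self, frame):
        # Stand-in for extracting a gesture action from a camera/tracker frame.
        return frame.get("gesture")

class ControlInstructionAcquisitionUnit:   # unit 32
    def __init__(self, mapping):
        self.mapping = mapping             # pre-stored gesture -> instruction table
    def resolve(self, gesture):
        return self.mapping.get(gesture)

class ControlInstructionExecutionUnit:     # unit 33
    def execute(self, instruction):
        return f"executed {instruction}"

class ContactlessGestureControlDevice:     # apparatus 3
    def __init__(self, mapping):
        self.u31 = GestureActionAcquisitionUnit()
        self.u32 = ControlInstructionAcquisitionUnit(mapping)
        self.u33 = ControlInstructionExecutionUnit()
    def handle(self, frame):
        gesture = self.u31.acquire(frame)
        instruction = self.u32.resolve(gesture)
        return self.u33.execute(instruction) if instruction else None
```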
  • In this embodiment, the mapping relationship between right-hand gesture actions and control instructions includes but is not limited to at least one of the following:
  • when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, a clockwise circle corresponds to “confirm” and a counterclockwise circle corresponds to “negate”;
  • when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “left mouse click”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “right mouse click”.
  • The mapping relationship between left-hand gesture actions and control instructions includes but is not limited to at least one of the following:
  • when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, a counterclockwise circle corresponds to “confirm” and a clockwise circle corresponds to “negate”;
  • when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “right mouse click”;
  • when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “left mouse click”.
  • The contactless gesture control apparatus provided in this embodiment may be used with the corresponding contactless gesture control method described above; for details, refer to the description of the method embodiment, which is not repeated here.
  • Compared with the prior art, the embodiments of the present invention obtain the following beneficial effects. First, there are few gesture rules, so they are easy for the user to learn and memorize: compared with prior-art solution 1, the embodiment greatly reduces the number of gesture rules, and by remembering just three basic gesture actions the user can complete most operations as if using a mouse. Second, operation is simple and smooth: compared with prior-art solution 2, the basic hand posture is the same for all three basic gestures (thumb, index finger, and middle finger extended, the other two fingers held together), so during continuous operation the user can switch smoothly between gestures without changing the basic posture. Moreover, the definition of the gesture actions is consistent with user habits.
  • The left click defined in this embodiment is a thumb movement (for the right hand) and the right click is a middle-finger movement (for the right hand); this intuitive left-right mapping matches user habits, making the gestures easy to learn and remember, and both easy to use and practical.
  • In addition, when creating and storing the mapping relationship between gesture actions and control instructions, this embodiment creates and stores mappings for both right-hand and left-hand gesture actions, satisfying the needs of different users (those who habitually use the right hand or the left hand) and further enhancing the ease of use and practicability of the present invention.
  • All or part of the steps of the method provided by the embodiments of the present invention may be completed by hardware driven by program instructions, for example by a program run on a computer.
  • The program may be stored in a readable storage medium, such as a random access memory, a magnetic disk, or an optical disc.

Description

Contactless gesture control method and apparatus

Technical field
The present invention belongs to the technical field of human-computer interaction, and in particular relates to a contactless gesture control method and apparatus.
Background
Controlling a device through gestures is one form of human-computer interaction. Compared with a traditional graphical user interface, gesture control does not require the user to hold a specific input device; specific hand motions alone suffice to control the device or to input specific information into it.
Existing gesture control falls mainly into two categories. One is based on the position information of the gesture: spatial movement of a finger is mapped to movement of the relevant elements on the display screen. The other is based on the posture information of the gesture: various complex gestures formed by the human hand correspond to different control commands for the device.
Prior art 1 defines a corresponding gesture for each of a series of operations, such as selecting, copying, pasting, moving, deleting, and switching, on a specific display object (such as an icon, box, or scroll bar) on the display screen. For example, the gesture for copying a display object is a single-finger tap on the object once it is selected, and the gesture for pasting a display object is a quick single-finger double tap.
Prior art 1 has the following drawback: the operation instructions for display objects are many and varied, so the user must memorize too many gesture rules, which are difficult to learn and remember.
Prior art 2 converts the basic mouse operations (left click, right click, cursor movement, and so on) into gesture definitions. Unlike prior art 1, it does not define specific operations on display objects, but implements all specific operations by defining mouse-like operations (for example, a specific operation can be selected from the right-click menu).
Prior art 2 has the following drawback: the gesture actions for the mouse-like operations differ greatly from one another, so the user must switch postures frequently when performing a series of operations. For example, to select a display object and then open its right-click menu, the user must first move the cursor by making a fist with the thumb extended, then change the hand to extend the thumb and index finger, and then move the thumb. This series of posture switches is complicated and impairs the convenience and smoothness of operation.
Technical problem
An object of the embodiments of the present invention is to provide a contactless gesture control method, to solve the prior-art problems of too many gesture rules that are difficult to learn and memorize, and of complicated operation with poor fluency.
Technical solution
An embodiment of the present invention is implemented as follows. A contactless gesture control method comprises:
acquiring a gesture action of a user;
acquiring, according to a pre-stored mapping relationship between gesture actions and control instructions, a control instruction corresponding to the gesture action;
executing the control instruction;
wherein the gesture action includes: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
An embodiment of the present invention further provides a contactless gesture control apparatus, the apparatus comprising:
a gesture action acquisition unit, configured to acquire a gesture action of the user;
a control instruction acquisition unit, configured to acquire, according to a pre-stored mapping relationship between gesture actions and control instructions, a control instruction corresponding to the gesture action acquired by the gesture action acquisition unit;
a control instruction execution unit, configured to execute the control instruction acquired by the control instruction acquisition unit;
wherein the gesture action includes: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
Beneficial effects
Compared with the prior art, the embodiments of the present invention obtain the following beneficial effects. First, there are few gesture rules, so they are easy for the user to learn and memorize: compared with prior-art solution 1, the embodiment greatly reduces the number of gesture rules, and by remembering just three basic gesture actions the user can complete most operations as if using a mouse. Second, operation is simple and smooth: compared with prior-art solution 2, the basic hand posture is the same for all three basic gestures (thumb, index finger, and middle finger extended, the other two fingers held together), so during continuous operation the user can switch smoothly between gestures without changing the basic posture; ease of use and practicability are both strong.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below are merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a flowchart of an implementation of the contactless gesture control method provided by an embodiment of the present invention;
FIG. 2 (2a, 2b, 2c, 2d, 2e, 2f) shows example gestures provided by an embodiment of the present invention;
FIG. 3 is a structural diagram of the contactless gesture control apparatus provided by an embodiment of the present invention.
Embodiments of the invention
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention and not to limit it.
To illustrate the technical solutions of the present invention, specific embodiments are described below.
FIG. 1 shows the implementation flow of the contactless gesture control method provided by an embodiment of the present invention, detailed as follows.
In step S101, a gesture action of the user is acquired.
In this embodiment, the gesture action of the user may be acquired in various manners, including but not limited to collecting a gesture image of the user through an image collector and extracting the gesture action from the gesture image, or acquiring the user's gesture action through a reflective posture tracker, an infrared posture tracker, or an ultrasonic posture tracker.
The gesture action includes but is not limited to: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
In step S102, a control instruction corresponding to the gesture action is acquired according to the pre-stored mapping relationship between gesture actions and control instructions.
In this embodiment, before the control instruction corresponding to the gesture action is acquired, the method further includes creating and storing the mapping relationship between gesture actions and control instructions. Preferably, the gesture actions include right-hand gesture actions and left-hand gesture actions.
Creating and storing the mapping relationship between gesture actions and control instructions specifically includes:
acquiring N gesture actions of the user, where a gesture action includes the motion track of the hand and the hand postures at the start of the track (when the hand enters a preset target area) and at the end of the track (when the hand leaves the preset target area, or when the hand stays in a sub-region of the target area longer than a preset time value);
selecting N control instructions from pre-edited control instructions and establishing a one-to-one correspondence with the gesture actions;
storing the correspondence, where N is greater than or equal to 1.
In this embodiment, the mapping relationship between right-hand gesture actions and control instructions includes but is not limited to at least one of the following:
when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, when “confirm” and “negate” options are displayed on the screen, a clockwise circle corresponds to “confirm” (as shown in FIG. 2a) and a counterclockwise circle corresponds to “negate” (as shown in FIG. 2b);
when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”, for example returning to the previous page or the previous menu (as shown in FIG. 2c);
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger” (as shown in FIG. 2d);
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “left mouse click” (as shown in FIG. 2e);
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “right mouse click” (as shown in FIG. 2f).
The mapping relationship between left-hand gesture actions and control instructions includes but is not limited to at least one of the following:
when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, a counterclockwise circle corresponds to “confirm” and a clockwise circle corresponds to “negate”;
when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “right mouse click”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “left mouse click”.
In step S103, the control instruction is executed.
In this embodiment, the system reads the control instruction and executes it to perform the corresponding function; for example, the system reads the “return” instruction and performs the function of returning to the previous page or the previous menu.
It should be noted that the gesture actions in the embodiments of the present invention default to right-hand gesture actions.
Compared with the prior art, the embodiments of the present invention obtain the following beneficial effects. First, there are few gesture rules, so they are easy for the user to learn and memorize: compared with prior-art solution 1, the embodiment greatly reduces the number of gesture rules, and by remembering just three basic gesture actions the user can complete most operations as if using a mouse. Second, operation is simple and smooth: compared with prior-art solution 2, the basic hand posture is the same for all three basic gestures (thumb, index finger, and middle finger extended, the other two fingers held together), so during continuous operation the user can switch smoothly between gestures without changing the basic posture. In addition, the definition of the gesture actions is consistent with user habits: for example, the left click defined in this embodiment is a thumb movement (for the right hand) and the right click is a middle-finger movement (for the right hand); this intuitive left-right mapping matches user habits, making the gestures easy to learn and remember, and both easy to use and practical.
As another embodiment of the present invention, to satisfy the needs of different users (those who habitually use the right hand or the left hand) and to further enhance the ease of use and practicability of the present invention, the method further includes, after step S101 and before step S102:
detecting whether the gesture action is a right-hand gesture action or a left-hand gesture action; when it is a right-hand gesture action, acquiring the corresponding control instruction according to the mapping relationship between right-hand gesture actions and control instructions; when it is a left-hand gesture action, acquiring the corresponding control instruction according to the mapping relationship between left-hand gesture actions and control instructions.
Detecting whether the gesture action is a right-hand or a left-hand gesture action specifically comprises: setting a left target region and a right target region in advance (the boundary line may be defined by the center point between the two eyes of the face, or by the nose position, with the left part as the left target region and the right part as the right target region), and detecting whether the gesture action is in the left target region or the right target region; if it is in the left target region it is a left-hand gesture action, otherwise it is a right-hand gesture action.
FIG. 3 shows the composition of the contactless gesture control apparatus provided by Embodiment 2 of the present invention; for convenience of description, only the parts related to this embodiment are shown.
The contactless gesture control apparatus may be a software unit, a hardware unit, or a unit combining software and hardware running on any of various electronic devices (including mobile phones, phablets, desktop computers, tablet computers, televisions, refrigerators, washing machines, air conditioners, digital cameras, surveillance cameras, medical electronic instruments, and the like).
The contactless gesture control apparatus 3 includes a gesture action acquisition unit 31, a control instruction acquisition unit 32, and a control instruction execution unit 33. The specific functions of each unit are as follows.
The gesture action acquisition unit 31 is configured to acquire a gesture action of the user. Preferably, the gesture action acquisition unit 31 is specifically configured to collect a gesture image of the user through an image collector and extract the gesture action from the gesture image, or to acquire the user's gesture action through a reflective posture tracker, an infrared posture tracker, or an ultrasonic posture tracker.
The control instruction acquisition unit 32 is configured to acquire, according to the pre-stored mapping relationship between gesture actions and control instructions, the control instruction corresponding to the gesture action acquired by the gesture action acquisition unit 31.
The control instruction execution unit 33 is configured to execute the control instruction acquired by the control instruction acquisition unit 32.
The gesture action includes but is not limited to: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
Further, the apparatus 3 includes:
a mapping relationship creation unit 34, configured to create and store the mapping relationship between gesture actions and control instructions before the control instruction acquisition unit 32 acquires the control instruction corresponding to the gesture action acquired by the gesture action acquisition unit 31, the gesture actions including right-hand gesture actions and left-hand gesture actions;
a detection unit 35, configured to detect whether the gesture action is a right-hand gesture action or a left-hand gesture action after the gesture action acquisition unit 31 acquires the gesture action and before the control instruction acquisition unit 32 acquires the corresponding control instruction.
In this embodiment, the mapping relationship between right-hand gesture actions and control instructions includes but is not limited to at least one of the following:
when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, a clockwise circle corresponds to “confirm” and a counterclockwise circle corresponds to “negate”;
when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “left mouse click”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “right mouse click”.
The mapping relationship between left-hand gesture actions and control instructions includes but is not limited to at least one of the following:
when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”; preferably, a counterclockwise circle corresponds to “confirm” and a clockwise circle corresponds to “negate”;
when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “right mouse click”;
when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “left mouse click”.
The contactless gesture control apparatus provided in this embodiment may be used with the corresponding contactless gesture control method; for details, refer to the description of method Embodiment 1 above, which is not repeated here.
In summary, compared with the prior art, the embodiments of the present invention obtain the following beneficial effects. First, there are few gesture rules, so they are easy for the user to learn and memorize: compared with prior-art solution 1, the embodiment greatly reduces the number of gesture rules, and by remembering just three basic gesture actions the user can complete most operations as if using a mouse. Second, operation is simple and smooth: compared with prior-art solution 2, the basic hand posture is the same for all three basic gestures (thumb, index finger, and middle finger extended, the other two fingers held together), so during continuous operation the user can switch smoothly between gestures without changing the basic posture. Moreover, the definition of the gesture actions is consistent with user habits: for example, the left click defined in this embodiment is a thumb movement (for the right hand) and the right click is a middle-finger movement (for the right hand); this intuitive left-right mapping matches user habits, making the gestures easy to learn and remember, and both easy to use and practical. In addition, when creating and storing the mapping relationship between gesture actions and control instructions, the embodiment creates and stores mappings for both right-hand and left-hand gesture actions, satisfying the needs of different users (those who habitually use the right hand or the left hand) and further enhancing the ease of use and practicability of the present invention.
All or part of the steps of the method provided by the embodiments of the present invention may be completed by hardware driven by program instructions, for example by a program run on a computer. The program may be stored in a readable storage medium, such as a random access memory, a magnetic disk, or an optical disc.

Claims (15)

  1. A contactless gesture control method, characterized in that the method comprises:
    acquiring a gesture action of a user;
    acquiring, according to a pre-stored mapping relationship between gesture actions and control instructions, a control instruction corresponding to the gesture action;
    executing the control instruction;
    wherein the gesture action includes: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
  2. The method according to claim 1, characterized in that before the acquiring of the control instruction corresponding to the gesture action, the method further comprises:
    creating and storing a mapping relationship between gesture actions and control instructions, the gesture actions including right-hand gesture actions and left-hand gesture actions.
  3. The method according to claim 2, characterized in that after the step of acquiring the gesture action of the user and before the step of acquiring, according to the pre-stored mapping relationship between gesture actions and control instructions, the control instruction corresponding to the gesture action, the method further comprises:
    detecting whether the gesture action is a right-hand gesture action or a left-hand gesture action.
  4. The method according to claim 2 or 3, characterized in that the mapping relationship between right-hand gesture actions and control instructions comprises at least one of the following:
    when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”;
    when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “left mouse click”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “right mouse click”.
  5. The method according to claim 4, characterized in that the gesture action of the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist corresponding to the control instruction “confirm” or “negate” comprises:
    when the index finger draws the circle clockwise, the corresponding control instruction is “confirm”;
    when the index finger draws the circle counterclockwise, the corresponding control instruction is “negate”.
  6. The method according to claim 2 or 3, characterized in that the mapping relationship between left-hand gesture actions and control instructions comprises at least one of the following:
    when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”;
    when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “right mouse click”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “left mouse click”.
  7. The method according to claim 6, characterized in that the gesture action of the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist corresponding to the control instruction “confirm” or “negate” comprises:
    when the index finger draws the circle counterclockwise, the corresponding control instruction is “confirm”;
    when the index finger draws the circle clockwise, the corresponding control instruction is “negate”.
  8. The method according to any one of claims 1 to 7, characterized in that the acquiring of the gesture action of the user comprises:
    collecting a gesture image of the user through an image collector and extracting the gesture action from the gesture image;
    or acquiring the user's gesture action through a reflective posture tracker, an infrared posture tracker, or an ultrasonic posture tracker.
  9. A contactless gesture control apparatus, characterized in that the apparatus comprises:
    a gesture action acquisition unit, configured to acquire a gesture action of a user;
    a control instruction acquisition unit, configured to acquire, according to a pre-stored mapping relationship between gesture actions and control instructions, a control instruction corresponding to the gesture action acquired by the gesture action acquisition unit;
    a control instruction execution unit, configured to execute the control instruction acquired by the control instruction acquisition unit;
    wherein the gesture action includes: the index finger draws a circle while the remaining four fingers form a fist, and the circle is larger than the outline of the hand; or the five fingers are held together and the palm moves toward the device; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the index finger moves freely; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the thumb moves up and down; or the thumb, index finger, and middle finger are naturally extended, the remaining two fingers are held together, and the middle finger moves up and down.
  10. The apparatus according to claim 9, characterized in that the apparatus further comprises:
    a mapping relationship creation unit, configured to create and store a mapping relationship between gesture actions and control instructions before the control instruction acquisition unit acquires the control instruction corresponding to the gesture action acquired by the gesture action acquisition unit, the gesture actions including right-hand gesture actions and left-hand gesture actions;
    a detection unit, configured to detect whether the gesture action is a right-hand gesture action or a left-hand gesture action after the gesture action acquisition unit acquires the gesture action and before the control instruction acquisition unit acquires the corresponding control instruction.
  11. The apparatus according to claim 10, characterized in that the mapping relationship between right-hand gesture actions and control instructions comprises at least one of the following:
    when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”;
    when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “left mouse click”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “right mouse click”.
  12. The apparatus according to claim 11, characterized in that the gesture action of the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist corresponding to the control instruction “confirm” or “negate” comprises:
    when the index finger draws the circle clockwise, the corresponding control instruction is “confirm”;
    when the index finger draws the circle counterclockwise, the corresponding control instruction is “negate”.
  13. The apparatus according to claim 10, characterized in that the mapping relationship between left-hand gesture actions and control instructions comprises at least one of the following:
    when the gesture action is the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist, the corresponding control instruction is “confirm” or “negate”;
    when the gesture action is the five fingers held together with the palm moving toward the device, the corresponding control instruction is “return”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the index finger moving freely, the corresponding control instruction is “move the mouse cursor along the track of the moving finger”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the thumb moving up and down, the corresponding control instruction is “right mouse click”;
    when the gesture action is the thumb, index finger, and middle finger naturally extended, the remaining two fingers held together, and the middle finger moving up and down, the corresponding control instruction is “left mouse click”.
  14. The apparatus according to claim 13, characterized in that the gesture action of the index finger drawing a circle larger than the outline of the hand while the remaining four fingers form a fist corresponding to the control instruction “confirm” or “negate” comprises:
    when the index finger draws the circle counterclockwise, the corresponding control instruction is “confirm”;
    when the index finger draws the circle clockwise, the corresponding control instruction is “negate”.
  15. The apparatus according to any one of claims 9 to 14, characterized in that the gesture action acquisition unit is specifically configured to: collect a gesture image of the user through an image collector and extract the gesture action from the gesture image; or acquire the user's gesture action through a reflective posture tracker, an infrared posture tracker, or an ultrasonic posture tracker.
PCT/CN2012/075798 2012-05-21 2012-05-21 Contactless gesture control method and apparatus WO2012126426A2 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP12759930.6A EP2631739B1 (en) 2012-05-21 2012-05-21 Contactless gesture-based control method and apparatus
CN2012800011749A CN103229127A (zh) 2012-05-21 2012-05-21 Contactless gesture control method and apparatus
EP14190503.4A EP2853989A1 (en) 2012-05-21 2012-05-21 Contactless gesture-based control method and apparatus
PCT/CN2012/075798 WO2012126426A2 (zh) 2012-05-21 2012-05-21 Contactless gesture-based control method and apparatus
US13/875,476 US8866781B2 (en) 2012-05-21 2013-05-02 Contactless gesture-based control method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/075798 WO2012126426A2 (zh) 2012-05-21 2012-05-21 Contactless gesture-based control method and apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/875,476 Continuation US8866781B2 (en) 2012-05-21 2013-05-02 Contactless gesture-based control method and apparatus

Publications (2)

Publication Number Publication Date
WO2012126426A2 true WO2012126426A2 (zh) 2012-09-27
WO2012126426A3 WO2012126426A3 (zh) 2013-04-25

Family

ID=46879802

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/075798 WO2012126426A2 (zh) 2012-05-21 2012-05-21 Contactless gesture-based control method and apparatus

Country Status (4)

Country Link
US (1) US8866781B2 (zh)
EP (2) EP2631739B1 (zh)
CN (1) CN103229127A (zh)
WO (1) WO2012126426A2 (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970260A (zh) * 2013-01-31 2014-08-06 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
AT514528A1 (de) * 2013-06-21 2015-01-15 Engel Austria Gmbh Molding plant with gesture control
CN106843501A (zh) * 2017-03-03 2017-06-13 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Device operation control method and apparatus
CN108594995A (zh) * 2018-04-13 2018-09-28 Guangdong Genius Technology Co., Ltd. Electronic device operation method based on gesture recognition, and electronic device
CN110334561A (zh) * 2018-03-31 2019-10-15 Guangzhou Zhuoteng Technology Co., Ltd. Gesture control method for controlling rotation of an object
CN111469859A (zh) * 2020-03-27 2020-07-31 FAW Bestune Car Co., Ltd. Automotive gesture interaction system

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9014911B2 (en) * 2011-11-16 2015-04-21 Flextronics Ap, Llc Street side sensors
US10295338B2 (en) 2013-07-12 2019-05-21 Magic Leap, Inc. Method and system for generating map data from an image
KR102165818B1 (ko) 2013-09-10 2020-10-14 Samsung Electronics Co., Ltd. Method, apparatus, and recording medium for controlling a user interface by using an input image
KR101844390B1 (ko) * 2014-01-03 2018-04-03 Intel Corporation Systems and techniques for user interface control
US9430046B2 (en) 2014-01-16 2016-08-30 Denso International America, Inc. Gesture based image capturing system for vehicle
US11004139B2 (en) 2014-03-31 2021-05-11 Monticello Enterprises LLC System and method for providing simplified in store purchases and in-app purchases using a use-interface-based payment API
US11282131B2 (en) 2014-03-31 2022-03-22 Monticello Enterprises LLC User device enabling access to payment information in response to user input
US10726472B2 (en) * 2014-03-31 2020-07-28 Monticello Enterprises LLC System and method for providing simplified in-store, product-based and rental payment processes
US10511580B2 (en) * 2014-03-31 2019-12-17 Monticello Enterprises LLC System and method for providing a social media shopping experience
US10832310B2 (en) 2014-03-31 2020-11-10 Monticello Enterprises LLC System and method for providing a search entity-based payment process
US11080777B2 (en) 2014-03-31 2021-08-03 Monticello Enterprises LLC System and method for providing a social media shopping experience
US20240013283A1 (en) * 2014-03-31 2024-01-11 Monticello Enterprises LLC System and method for providing a social media shopping experience
US11250493B2 (en) 2014-03-31 2022-02-15 Monticello Enterprises LLC System and method for performing social media cryptocurrency transactions
US9965796B2 (en) * 2014-06-26 2018-05-08 Paypal, Inc. Social media buttons with payment capability
FR3026502A1 (fr) * 2014-09-30 2016-04-01 Valeo Comfort & Driving Assistance System and method for controlling a piece of equipment of a motor vehicle
DE102014224632A1 (de) * 2014-12-02 2016-06-02 Robert Bosch Gmbh Method for operating an input device; input device
TWI552892B (zh) * 2015-04-14 2016-10-11 Hon Hai Precision Industry Co., Ltd. Vehicle control system and operating method thereof
CN106125911B (zh) * 2016-06-16 2020-02-11 Beijing Horizon Robotics Technology R&D Co., Ltd. Human-machine interaction learning method for a machine, and machine
JP6208837B1 (ja) * 2016-10-12 2017-10-04 HI Corporation Method, program, and apparatus for controlling a user interface
US10099368B2 (en) 2016-10-25 2018-10-16 Brandon DelSpina System for controlling light and for tracking tools in a three-dimensional space
CN109991859B (zh) * 2017-12-29 2022-08-23 Qingdao Youwu Technology Co., Ltd. Gesture instruction control method and smart home control system
KR20210034843A (ko) * 2019-09-23 2021-03-31 Samsung Electronics Co., Ltd. Apparatus and method for controlling a vehicle
CN111123986A (zh) * 2019-12-25 2020-05-08 Sichuan Yundun Photoelectric Technology Co., Ltd. Control apparatus for controlling a two-degree-of-freedom turntable based on gestures
US11916900B2 (en) * 2020-04-13 2024-02-27 Ouraring, Inc. Authorized remote control device gesture control methods and apparatus
CN114115536A (zh) * 2021-11-22 2022-03-01 Beijing ByteDance Network Technology Co., Ltd. Interaction method and apparatus, electronic device, and storage medium

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5203704A (en) * 1990-12-21 1993-04-20 Mccloud Seth R Method of communication using pointing vector gestures and mnemonic devices to assist in learning point vector gestures
EP1717684A3 (en) * 1998-01-26 2008-01-23 Fingerworks, Inc. Method and apparatus for integrating manual input
DE19845030A1 (de) * 1998-09-30 2000-04-20 Siemens Ag Imaging system
US7109970B1 (en) * 2000-07-01 2006-09-19 Miller Stephen S Apparatus for remotely controlling computers and other electronic appliances/devices using a combination of voice commands and finger movements
US7030861B1 (en) * 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
JP2006172439A (ja) * 2004-11-26 2006-06-29 Oce Technologies Bv Desktop scanning using hand operation
KR100687737B1 (ko) * 2005-03-19 2007-02-27 Electronics and Telecommunications Research Institute Virtual mouse apparatus and method based on two-handed gestures
US8537112B2 (en) * 2006-02-08 2013-09-17 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
US9910497B2 (en) * 2006-02-08 2018-03-06 Oblong Industries, Inc. Gestural control of autonomous and semi-autonomous systems
US8665213B2 (en) * 2006-02-08 2014-03-04 Oblong Industries, Inc. Spatial, multi-modal control device for use with spatial operating system
WO2007097548A1 (en) * 2006-02-20 2007-08-30 Cheol Woo Kim Method and apparatus for user-interface using the hand trace
CN1881994A (zh) 2006-05-18 2006-12-20 Vimicro Corp. (Beijing) Method and apparatus for handwriting input and gesture recognition on a mobile device
US8086971B2 (en) 2006-06-28 2011-12-27 Nokia Corporation Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
US8589824B2 (en) * 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
US9696808B2 (en) * 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
US8972902B2 (en) * 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
US9317124B2 (en) 2006-09-28 2016-04-19 Nokia Technologies Oy Command input by hand gestures captured from camera
KR100783552B1 (ko) 2006-10-11 2007-12-07 Samsung Electronics Co., Ltd. Input control method and apparatus for a portable terminal
CN101369180A (zh) * 2007-08-15 2009-02-18 Lenovo (Beijing) Co., Ltd. Finger pointing apparatus
US9261979B2 (en) * 2007-08-20 2016-02-16 Qualcomm Incorporated Gesture-based mobile interaction
CN107102723B (zh) 2007-08-20 2019-12-06 Qualcomm Incorporated Methods, apparatuses, devices, and non-transitory computer-readable media for gesture-based mobile interaction
US20120204133A1 (en) * 2009-01-13 2012-08-09 Primesense Ltd. Gesture-Based User Interface
US8555207B2 (en) * 2008-02-27 2013-10-08 Qualcomm Incorporated Enhanced input using recognized gestures
CA2718680C (en) * 2008-03-18 2016-12-06 Elliptic Laboratories As Object and movement detection
WO2009128064A2 (en) * 2008-04-14 2009-10-22 Pointgrab Ltd. Vision based pointing device emulation
CN101605399A (zh) * 2008-06-13 2009-12-16 Inventec Appliances (Shanghai) Co., Ltd. Mobile terminal and method for implementing sign language recognition
EP2304527A4 (en) * 2008-06-18 2013-03-27 Oblong Ind Inc GESTURE-BASED CONTROL SYSTEM FOR VEHICLE INTERFACES
GB2477044B (en) * 2008-08-22 2012-04-04 Northrop Grumman Systems Corp Compound gesture recognition
US8516561B2 (en) * 2008-09-29 2013-08-20 At&T Intellectual Property I, L.P. Methods and apparatus for determining user authorization from motion of a gesture-based control unit
US9569001B2 (en) * 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
TW201101198A (en) * 2009-06-17 2011-01-01 Sonix Technology Co Ltd Command input method
JP2011028366A (ja) * 2009-07-22 2011-02-10 Sony Corp Operation control apparatus and operation control method
US9174123B2 (en) * 2009-11-09 2015-11-03 Invensense, Inc. Handheld computer systems and techniques for character and command recognition related to human movements
CN101807114B (zh) 2010-04-02 2011-12-07 Zhejiang University Natural interaction method based on three-dimensional gestures
TW201142465A (en) * 2010-05-17 2011-12-01 Hon Hai Prec Ind Co Ltd Front projection device and front projection controlling method
CN102339125A (zh) 2010-07-23 2012-02-01 Sharp Corporation Information device, and control method and system thereof
CN101916161B (zh) * 2010-08-04 2012-10-10 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Method for selecting an interface mode based on the shape of the finger-press area, and mobile terminal
CN102053702A (zh) * 2010-10-26 2011-05-11 Nanjing University of Aeronautics and Astronautics Dynamic gesture control system and method
KR101169583B1 (ko) * 2010-11-04 2012-07-31 Macron Co., Ltd. Virtual mouse driving method
US20130211843A1 (en) * 2012-02-13 2013-08-15 Qualcomm Incorporated Engagement-dependent gesture recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None
See also references of EP2631739A4

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970260A (zh) * 2013-01-31 2014-08-06 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
WO2014117647A1 (zh) * 2013-01-31 2014-08-07 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
CN103970260B (zh) * 2013-01-31 2017-06-06 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
KR101801073 2013-01-31 2017-11-24 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
US10671342B2 (en) 2013-01-31 2020-06-02 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
AT514528A1 (de) * 2013-06-21 2015-01-15 Engel Austria Gmbh Molding plant with gesture control
CN106843501A (zh) * 2017-03-03 2017-06-13 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Device operation control method and apparatus
CN110334561A (zh) * 2018-03-31 2019-10-15 Guangzhou Zhuoteng Technology Co., Ltd. Gesture control method for controlling rotation of an object
CN108594995A (zh) * 2018-04-13 2018-09-28 Guangdong Genius Technology Co., Ltd. Electronic device operation method based on gesture recognition, and electronic device
CN111469859A (zh) * 2020-03-27 2020-07-31 FAW Bestune Car Co., Ltd. Automotive gesture interaction system

Also Published As

Publication number Publication date
US20130307765A1 (en) 2013-11-21
US8866781B2 (en) 2014-10-21
EP2631739A2 (en) 2013-08-28
EP2631739A4 (en) 2013-09-11
CN103229127A (zh) 2013-07-31
WO2012126426A3 (zh) 2013-04-25
EP2631739B1 (en) 2016-02-03
EP2853989A1 (en) 2015-04-01

Similar Documents

Publication Publication Date Title
WO2012126426A2 (zh) Contactless gesture-based control method and apparatus
JP5900393B2 (ja) Information processing apparatus, operation control method, and program
WO2015099293A1 (en) Device and method for displaying user interface of virtual input device based on motion recognition
WO2013163920A1 (zh) Method and apparatus for inserting or deleting cells, rows, or columns in a spreadsheet
WO2017032187A1 (zh) Method and apparatus for automatically capturing a target object, and storage medium
TW201133319A (en) Touch sensing system, electronic touch apparatus, and touch sensing method
CN108616712B (zh) Camera-based interface operation method, apparatus, device, and storage medium
KR101095851B1 (ko) Touchscreen apparatus and control method thereof
CN104808788A (zh) Method for operating a user interface with contactless gestures
WO2021197487A1 (zh) Method and apparatus for controlling a terminal screen by a mouse, mouse, and storage medium
WO2014054861A1 (en) Terminal and method for processing multi-point input
CN108304116A (zh) Single-finger touch interaction method
WO2017211056A1 (zh) One-handed operation method and system for a mobile terminal
WO2016017931A1 (ko) Method and apparatus for providing an interface that interacts with a user through an NUI device
WO2013081413A1 (ko) Method for operating a screen on a touchscreen
TWI646526B (zh) Sub-picture layout control method and apparatus
WO2012092770A1 (zh) Touch drawing processing system and method
WO2010095783A1 (ko) Touchscreen control method and touchscreen apparatus using the same
Boruah et al. Development of a learning-aid tool using hand gesture based human computer interaction system
WO2023179694A1 (zh) Differential touch method based on texture recognition
CN109739422B (zh) Window control method, apparatus, and device
CN107092433A (zh) Touch control method and apparatus for a touch all-in-one machine
CN1991715A (zh) Television touchscreen control apparatus and method
WO2019056614A1 (zh) Method, apparatus, and device for separating touch data, and storage medium
WO2015152487A1 (ko) Method, device, system, and non-transitory computer-readable recording medium for providing a user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12759930

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 2012759930

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE