WO2023015886A1 - Gesture interaction method and system based on artificial reality - Google Patents

Gesture interaction method and system based on artificial reality

Info

Publication number
WO2023015886A1
WO2023015886A1 (PCT/CN2022/081494, CN2022081494W)
Authority
WO
WIPO (PCT)
Prior art keywords
index finger
thumb
hand
operation icon
artificial reality
Prior art date
Application number
PCT/CN2022/081494
Other languages
English (en)
French (fr)
Inventor
孙飞
余海桃
吴涛
李阿川
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司 filed Critical 青岛小鸟看看科技有限公司
Priority to EP22854891.3A priority Critical patent/EP4345583A1/en
Publication of WO2023015886A1 publication Critical patent/WO2023015886A1/zh
Priority to US18/400,624 priority patent/US20240134461A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • the present disclosure relates to the technical field of virtual reality, and more specifically, to a gesture interaction method and system based on artificial reality.
  • Due to technological progress and the diversified development of market demand, virtual reality systems are becoming increasingly common and are applied in many fields, such as computer games, health and safety, industry, and education and training. To give a few examples, mixed virtual reality systems are being integrated into mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university laboratories, student classrooms, hospital exercise rooms, and other corners of life.
  • the purpose of the present disclosure is to provide a gesture interaction method based on artificial reality, so as to solve the problem that the current interaction gestures of a one-handed index-finger "click", a thumb-and-index-finger "pinch", and a clenched fist for "confirmation" place high demands on gesture-recognition accuracy, so that the human and financial resources invested are necessarily large, and that, even though the gestures are recognized with high accuracy, the interaction accuracy and experience are relatively poor.
  • the present disclosure provides a gesture interaction method based on artificial reality, which includes:
  • a ray formed from a preset position of the hand to a finger joint while the four fingers are clenched into a fist is used as an operation indicator line; wherein, the endpoint of the ray is the preset position of the hand;
  • the operation icon is controlled to move accordingly, so as to move the operation icon to the target button on the display surface;
  • the preset position of the hand is the wrist joint
  • the finger joint is the index finger joint.
  • controlling the operation icon to move accordingly includes: controlling the operation icon to move according to the movement track when the hand moves with four fingers clenched.
  • determining that the operation icon clicks the target button to complete the interaction includes:
  • the distance between the distal finger pulp of the thumb and the index-finger joint plane is determined as the thumb-index finger distance
  • the action of the operation icon clicking the target button is completed and made to take effect, completing one interaction.
  • the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold; if it is detected that the hand stops moving, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, it is determined that an instruction to stop dragging or sliding the interface to be dragged has been received.
  • the index-finger joint plane includes all points covered by the surfaces formed between the distal fingertip of the index finger and an interphalangeal joint of the index finger, between the two interphalangeal joints of the index finger, and between an interphalangeal joint of the index finger and the metacarpophalangeal joint of the index finger.
  • the click distance threshold is equal to the effective distance threshold.
  • the present disclosure also provides an artificial reality-based gesture interaction system, which implements the aforementioned artificial reality-based gesture interaction method and includes a camera for capturing the motion posture of the hand, a display for displaying an interactive page, and a processor connected to the camera and the display; wherein, the processor includes:
  • the indication configuration unit is used to use the ray formed from the preset position of the hand to the finger joints in the state of making a fist with four fingers as an operation indication line; wherein, the endpoint of the ray is the preset position of the hand;
  • An icon configuration unit configured to determine the point where the operation indication line intersects with the display surface in the artificial reality as the position of the operation icon in the artificial reality;
  • a control operation unit, configured to control the operation icon to move accordingly if it is detected that the four fingers of the hand are clenched and the hand moves, so as to move the operation icon to the target button on the display surface, wherein the display is used to display the display surface;
  • a response interaction unit, configured to determine that the operation icon clicks the target button to complete the interaction if it is detected that the thumb of the hand touches the index finger.
  • the response interaction unit includes an action unit and a validation unit; when used to determine that the operation icon clicks the target button to complete the interaction upon detecting that the thumb of the hand touches the index finger, the response interaction unit is specifically configured to:
  • the distance between the distal finger pulp of the thumb and the index-finger joint plane is determined as the thumb-index finger distance
  • the action unit is triggered, and the action unit is used to control the operation icon to click the target button;
  • the validation unit is triggered, and the validation unit is used to complete the action of the operation icon clicking the target button and to make that action take effect, so as to complete one interaction.
  • a drag-slide unit is also included, and the drag-slide unit is used to determine that an instruction to drag or slide the interface to be dragged has been received if it is detected that the hand moves while the thumb-index finger distance is smaller than the click distance threshold, wherein the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold;
  • the artificial reality-based gesture interaction method and system use the ray formed from the preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as the operation indicator line, wherein the endpoint of the ray is the preset position of the hand; the point at which the operation indicator line intersects the display surface in the artificial reality is determined as the position of the operation icon in the artificial reality; if it is detected that the four fingers of the hand are clenched and the hand moves, the operation icon is controlled to move accordingly, so as to move the operation icon to the target button on the display surface; and if it is detected that the thumb of the hand touches the index finger, it is determined that the operation icon clicks the target button to complete the interaction.
  • any point at which the thumb touches the index finger can trigger the click operation, and the precision requirements are low, which reduces the required manpower and financial resources; the low precision requirements also make the click operation easy to perform, greatly improving the user's interactive experience.
  • FIG. 1 is a flowchart of a gesture interaction method based on artificial reality according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of the relationship between the hand and the display surface displayed on the display in the gesture interaction method based on artificial reality according to an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a thumb touching an index finger in a gesture interaction method based on artificial reality according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of a dragging or sliding interaction in an artificial reality-based gesture interaction method according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of an artificial reality-based gesture interaction system according to an embodiment of the present disclosure.
  • the interactive gestures of "clicking" with the index finger of one hand, "pinching" with the thumb and index finger, and "confirming" by clenching a fist place high demands on the accuracy of gesture recognition, so the human and financial resources invested are necessarily large; moreover, even though these gestures are recognized with high accuracy, the interaction accuracy and experience are relatively poor.
  • the present disclosure provides a gesture interaction method and system based on artificial reality. Specific embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
  • Figure 1, Figure 2, Figure 3, and Figure 4 exemplarily illustrate the gesture interaction method based on artificial reality according to the embodiment of the present disclosure, and Figure 5 exemplarily illustrates the artificial reality-based gesture interaction system according to the embodiment of the present disclosure.
  • the gesture interaction method based on artificial reality provided by the present disclosure includes:
  • S1 The ray formed from the preset position of the hand to the finger joints when the four fingers are clenched is used as an operation indication line; wherein, the endpoint of the ray is the preset position of the hand;
  • S2 Determine the point where the operation indication line intersects the display surface in the artificial reality as the position of the operation icon in the artificial reality;
  • step S1 is the process of taking the ray formed from the preset position of the hand to the finger joints when the four fingers are clenched into a fist as the operation indicating line, and the end point of the ray is the preset position of the hand.
  • the four fingers are clenched, and the thumb 11 can be in a vertical state or in a clenched state, which is not limited here, and the preset position of the hand and the position of the finger joints are not limited.
  • the preset position of the hand is the wrist joint 13, and the finger joint is the index finger joint 12, so that the operation indicator line 15 is formed with the wrist joint 13 as its endpoint; the operation indicator line may or may not be displayed in the artificial reality scene.
  • the operation indicator line is transparent and not displayed. Because the ray is determined by the wrist joint and the index finger joint, the operation indicator line is mostly horizontal, which makes it convenient for the user to manipulate and interact directly with the hand later.
  • step S2 uses the point where the operation indicator line intersects the display surface in the artificial reality as the position of the operation icon in the artificial reality; that is, the operation indicator line 15 with the wrist joint 13 as its endpoint is extended toward the display surface 16 until it intersects the display surface 16, and the intersection point is used as the position of the operation icon 14.
  • the operation icon can be transparent and colorless, or colored; if colored, it can be a mark of any appropriate size, and in this embodiment it is a roughly spherical icon, so that the user can clearly see and move the operation icon.
  • step S3 is the process of controlling the operation icon to move correspondingly if it is detected that the four fingers of the hand are clenched and moved, so as to move the operation icon to the target key on the display surface.
  • controlling the operation icon to move accordingly includes: controlling the operation icon to move according to the movement trajectory when the four fingers of the hand are clenched and moved.
  • the operation icon 14 moves correspondingly with the movement of the hand. For example, suppose the user faces the display surface and the hand is the left hand; when the left hand moves left, right, up, down, forward, or backward, from the user's point of view the icon on the display surface also moves left, right, up, down, forward, or backward with the hand, so that the position of the operation icon can be flexibly controlled by the hand.
  • step S4 is the process in which, if the action of the thumb of the hand touching the index finger is detected, it is determined that the operation icon clicks the target button to complete the interaction.
  • determining that the operation icon clicks the target button to complete the interaction includes:
  • S42 Determine the distance between the distal pulp of the thumb and the joint plane of the index finger as the thumb-index finger distance
  • if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than the preset click distance threshold, the operation icon clicks the target button
  • the specific value of the click distance threshold is not limited and can be adjusted at any time; after the click action occurs, the thumb is lifted, that is, the distal finger pulp moves away from the plane determined by the joints of the index finger, and when the thumb-index finger distance becomes greater than the preset effective distance threshold, the click action is terminated and the action of the operation icon clicking the target button takes effect. In this way, one-step interactive operations such as click, cancel, and confirm are completed. The effective distance threshold is likewise not limited and can be adjusted at any time.
  • the embodiment shown in Fig. 1 and Fig. 4 further includes:
  • the user may complete the interactive action by dragging and sliding, wherein, responding to the process of completing the interactive action by dragging and sliding may include:
  • S51 Determine the plane of the index finger joint according to the position of the index finger joint; wherein, the position of the index finger joint can be the position of each joint in the index finger;
  • S52 Determine the distance between the distal pulp of the thumb 41 and the joint plane of the index finger as the thumb-index finger distance;
  • if one wants to move the interface to be dragged, the distal finger pulp of the thumb 41 is first brought close to the plane determined by the joints of the index finger, that is, the thumb 41 presses against that plane, so that the thumb-index finger distance becomes smaller; when it is smaller than the preset click distance threshold, the interface to be dragged is pressed at position 411. The user can then move the hand so that the interface to be dragged follows the movement of the hand from position 411 along path 410 to position 412, at which point the interface has reached the target position 412.
  • the effective distance threshold and the click distance threshold are the same value, which makes it convenient to adjust both thresholds together according to the current situation; the plane determined by the joints of the index finger includes all points covered by the surfaces formed between the distal fingertip of the index finger and an interphalangeal joint of the index finger, between the two interphalangeal joints of the index finger, and between an interphalangeal joint of the index finger and the metacarpophalangeal joint of the index finger. That is, the "press" and "click" gestures can be formed by the distal finger pulp of the thumb together with the distal fingertip of the index finger, with an interphalangeal joint of the index finger, or with the metacarpophalangeal joint of the index finger, and any point on the surfaces formed between these index-finger joints can realize the "press" and "click" gestures, which lowers the precision requirements of gesture control and improves the user experience.
  • the click can target any button on the display as well as the interface to be dragged, and can thus support any interactive action whose control page can be found on the display; the specific operation details are not repeated here.
  • the ray formed from the preset position of the hand to a finger joint, while the four fingers are clenched into a fist, is used as the operation indicator line, wherein the endpoint of the ray is the preset position of the hand; the point where the operation indicator line intersects the display surface in the artificial reality is determined as the position of the operation icon in the artificial reality; if it is detected that the four fingers of the hand are clenched and the hand moves, the operation icon is controlled to move accordingly, so as to move the operation icon to the target button on the display surface; and if it is detected that the thumb of the hand touches the index finger, it is determined that the operation icon clicks the target button to complete the interaction. With this scheme, any point at which the thumb touches the index finger can trigger the click operation, and the precision requirements are low, which reduces manpower and financial resources; because the precision requirements are low, the click operation is easy to perform, greatly improving the user's interactive experience.
  • the present disclosure also provides an artificial reality-based gesture interaction system 100, which implements the aforementioned artificial reality-based gesture interaction method and includes a camera 110 for capturing the motion posture of the hand, a display 120 for displaying the interactive page, and a processor 130 connecting the camera 110 and the display 120; wherein, the processor 130 includes:
  • the indication configuration unit 131 is used to use the ray formed from the preset position of the hand to the finger joints when the four fingers are clenched as an operation indication line; wherein, the end point of the ray is the preset position of the hand;
  • the icon configuration unit 132 is configured to determine the point where the operation indication line intersects the display surface in the artificial reality as the position of the operation icon in the artificial reality;
  • a control operation unit 133, configured to control the operation icon to move accordingly if it is detected that the four fingers of the hand are clenched and the hand moves, so as to move the operation icon to the target button on the display surface, wherein the display 120 is used to display the display surface;
  • a response interaction unit 134, configured to determine that the operation icon clicks the target button to complete the interaction if it is detected that the thumb of the hand touches the index finger.
  • the response interaction unit 134 includes an action unit 134-1 and a validation unit 134-2; when used to determine that the operation icon clicks the target button to complete the interaction upon detecting that the thumb of the hand touches the index finger, the response interaction unit is specifically configured to:
  • the distance between the distal finger pulp of the thumb and the index-finger joint plane is determined as the thumb-index finger distance
  • the action unit 134-1 is triggered, and the action unit 134-1 is used to control the operation icon to click the target button;
  • the validation unit 134-2 is triggered, and the validation unit 134-2 is used to complete the action of the operation icon clicking the target button and to make that action take effect, so as to complete one interaction.
  • a drag-slide unit 135 is also included; the drag-slide unit 135 can be integrated in the processor 130, and the drag-slide unit 135 is used to:
  • the user can complete an interactive action by dragging and sliding, and when responding to an interaction completed by dragging and sliding, the drag-slide unit 135 can be used to:
  • the joint position of the index finger can be the position of each joint in the index finger
  • the distance between the distal finger pulp of the thumb and the index-finger joint plane is determined as the thumb-index finger distance
  • if it is detected that the thumb-index finger distance is smaller than the preset click distance threshold, the operation icon is controlled to press down on the interface to be dragged; if it is detected that the hand also moves while the thumb-index finger distance is smaller than the preset click distance threshold, the interface to be dragged is controlled to be dragged or slid along with the movement of the hand;
  • the gesture interaction system based on artificial reality first captures the motion posture of the hand through the camera 110; the indication configuration unit 131 in the processor 130 then uses the ray formed from the preset position of the hand to the index finger joint, while the four fingers are clenched into a fist, as the operation indicator line; the icon configuration unit 132 then determines the point where the operation indicator line intersects the display surface of the artificial reality shown on the display as the position of the operation icon in the artificial reality; the control operation unit 133 then, upon detecting that the four fingers of the hand are clenched and the hand moves, controls the operation icon to move accordingly, so as to move the operation icon to the target button on the display surface; and finally, upon detecting that the thumb of the hand touches the index finger, the response interaction unit 134 determines that the operation icon clicks the target button to complete the interaction. In this way, any point at which the thumb touches the index finger can trigger the click operation, the precision requirements are low, which reduces manpower and financial costs, and the low precision requirements make the click operation easy to perform, greatly improving the user's interactive experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are a gesture interaction method and system based on artificial reality. The method includes: using a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, where the endpoint of the ray is the preset position of the hand; determining the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality; if it is detected that the four fingers of the hand are clenched and the hand moves, controlling the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface; and if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction.

Description

Gesture interaction method and system based on artificial reality
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 202110926659.X, entitled "Gesture interaction method and system based on artificial reality" and filed on August 12, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the technical field of virtual reality, and more specifically to a gesture interaction method and system based on artificial reality.
Background
Owing to technological progress and the diversified development of market demand, virtual reality systems are becoming increasingly common and are applied in many fields, such as computer games, health and safety, industry, and education and training. To give a few examples, mixed virtual reality systems are being integrated into mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university laboratories, student classrooms, hospital exercise rooms, and other corners of life.
With the development of the artificial reality field, user interaction with content in VR, AR, and MR scenarios has become indispensable, and convenient "bare-hand" gesture interaction has become today's development trend. Most existing gesture interactions in these scenarios are a one-handed index-finger "click" gesture, a thumb-and-index-finger "pinch" gesture, a clenched-fist "confirm" gesture, and the like. However, interaction gestures that use a one-handed index-finger "click", a thumb-and-index-finger "pinch", and a clenched fist for "confirm" place high demands on gesture-recognition accuracy, so the human and financial resources invested are necessarily large; moreover, even though the gestures are recognized with high accuracy, the interaction accuracy and user experience are relatively poor.
Therefore, there is an urgent need for an artificial-reality-based gesture interaction method that can reduce the investment of manpower and financial resources, improve gesture-recognition precision, and improve the user experience.
Technical Solution
In view of the above problems, the purpose of the present disclosure is to provide a gesture interaction method based on artificial reality, so as to solve the problem that the current interaction gestures of a one-handed index-finger "click", a thumb-and-index-finger "pinch", and a clenched fist for "confirm" place high demands on gesture-recognition accuracy, so that the human and financial resources invested are necessarily large, and that, even though the gestures are recognized with high accuracy, the interaction accuracy and user experience are relatively poor.
The present disclosure provides a gesture interaction method based on artificial reality, which includes:
using a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, where the endpoint of the ray is the preset position of the hand;
determining the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality;
if it is detected that the four fingers of the hand are clenched and the hand moves, controlling the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface; and
if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction.
Preferably, the preset position of the hand is the wrist joint;
and the finger joint is an index finger joint.
Preferably, controlling the operation icon to move accordingly includes: controlling the operation icon to move according to the movement trajectory of the hand while the four fingers remain clenched.
Preferably, if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction includes:
determining an index-finger joint plane according to the positions of the index finger joints;
determining the distance between the distal finger pulp of the thumb and the index-finger joint plane as a thumb-index finger distance;
if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than a preset click distance threshold, controlling the operation icon to click the target button; and
if it is detected that the thumb is lifted, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than a preset effective distance threshold, completing the action of the operation icon clicking the target button and making that action take effect, thereby completing one interaction.
Preferably, the method further includes: if it is detected that the hand moves while the thumb-index finger distance is smaller than the click distance threshold, determining that an instruction to drag or slide an interface to be dragged has been received, where the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold; and if it is detected that the hand stops moving, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, determining that an instruction to stop dragging or sliding the interface to be dragged has been received.
Preferably, the index-finger joint plane includes all points covered by the surfaces formed between the distal fingertip of the index finger and an interphalangeal joint of the index finger, between the two interphalangeal joints of the index finger, and between an interphalangeal joint of the index finger and the metacarpophalangeal joint of the index finger.
Preferably, the click distance threshold is equal to the effective distance threshold.
The present disclosure further provides a gesture interaction system based on artificial reality that implements the aforementioned gesture interaction method based on artificial reality, including a camera for capturing the motion posture of the hand, a display for displaying an interactive page, and a processor connecting the camera and the display, where the processor includes:
an indication configuration unit, configured to use a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, where the endpoint of the ray is the preset position of the hand;
an icon configuration unit, configured to determine the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality;
a control operation unit, configured to, if it is detected that the four fingers of the hand are clenched and the hand moves, control the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface, where the display is used to display the display surface; and
a response interaction unit, configured to, if the action of the thumb of the hand touching the index finger is detected, determine that the operation icon clicks the target button to complete the interaction.
Preferably, the response interaction unit includes an action unit and a validation unit, where, when used to determine that the operation icon clicks the target button to complete the interaction upon detecting the action of the thumb of the hand touching the index finger, the response interaction unit is specifically configured to:
determine the index-finger joint plane according to the positions of the index finger joints;
determine the distance between the distal finger pulp of the thumb and the index-finger joint plane as the thumb-index finger distance;
if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than the preset click distance threshold, trigger the action unit, where the action unit is used to control the operation icon to click the target button; and
if it is detected that the thumb is lifted, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, trigger the validation unit, where the validation unit is used to complete the action of the operation icon clicking the target button and to make that action take effect, so as to complete one interaction.
Preferably, the system further includes a drag-slide unit, configured to, if it is detected that the hand moves while the thumb-index finger distance is smaller than the click distance threshold, determine that an instruction to drag or slide an interface to be dragged has been received, where the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold; and
if it is detected that the hand stops moving, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, determine that an instruction to stop dragging or sliding the interface to be dragged has been received.
It can be seen from the above technical solutions that the gesture interaction method and system based on artificial reality provided by the present disclosure use a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, where the endpoint of the ray is the preset position of the hand; determine the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality; if it is detected that the four fingers of the hand are clenched and the hand moves, control the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface; and if the action of the thumb of the hand touching the index finger is detected, determine that the operation icon clicks the target button to complete the interaction. In this process, any point at which the thumb touches the index finger can trigger the click operation, the precision requirements are low, which reduces manpower and financial costs, and the low precision requirements make the click operation easy to perform, greatly improving the user's interactive experience.
Brief Description of the Drawings
Other objects and results of the present disclosure will become more apparent and more readily appreciated by reference to the following description taken in conjunction with the accompanying drawings, and as the present disclosure becomes more fully understood. In the drawings:
FIG. 1 is a flowchart of a gesture interaction method based on artificial reality according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of the relationship between the hand and the display surface shown on the display in the gesture interaction method based on artificial reality according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of the thumb touching the index finger in the gesture interaction method based on artificial reality according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a drag or slide interaction in the gesture interaction method based on artificial reality according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a gesture interaction system based on artificial reality according to an embodiment of the present disclosure.
Detailed Description of Embodiments
At present, the interaction gestures of a one-handed index-finger "click", a thumb-and-index-finger "pinch", and a clenched fist for "confirm" place high demands on gesture-recognition accuracy, so the human and financial resources invested are necessarily large; moreover, even though the gestures are recognized with high accuracy, the interaction accuracy and user experience are relatively poor.
In view of the above problems, the present disclosure provides a gesture interaction method and system based on artificial reality. Specific embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
To illustrate the gesture interaction method and system based on artificial reality provided by the present disclosure, FIG. 1, FIG. 2, FIG. 3, and FIG. 4 exemplarily illustrate the gesture interaction method based on artificial reality according to the embodiment of the present disclosure, and FIG. 5 exemplarily illustrates the gesture interaction system based on artificial reality according to the embodiment of the present disclosure.
The following description of exemplary embodiments is merely illustrative in nature and is in no way intended to limit the present disclosure, its application, or uses. Techniques and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but, where appropriate, such techniques and devices should be regarded as part of the description.
As shown in FIG. 1, the gesture interaction method based on artificial reality according to the embodiment of the present disclosure includes:
S1: using a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, where the endpoint of the ray is the preset position of the hand;
S2: determining the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality;
S3: if it is detected that the four fingers of the hand are clenched and the hand moves, controlling the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface;
S4: if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction.
As shown in FIG. 1 and FIG. 2, step S1 is the process of using the ray formed from the preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as the operation indicator line, where the endpoint of the ray is the preset position of the hand. In this embodiment the four fingers are clenched, and the thumb 11 may be upright or clenched, which is not limited here; neither the preset position of the hand nor the position of the finger joint is limited. In this embodiment, the preset position of the hand is the wrist joint 13 and the finger joint is the index finger joint 12, so that the operation indicator line 15 is formed with the wrist joint 13 as its endpoint. The operation indicator line may or may not be displayed in the artificial reality scene; in this embodiment it is transparent and not displayed. Because the ray is determined by the wrist joint and the index finger joint, the operation indicator line is mostly horizontal, which makes it convenient for the user to manipulate and interact directly with the hand later.
As shown in FIG. 1 and FIG. 2, step S2 uses the point where the operation indicator line intersects the display surface in the artificial reality as the position of the operation icon in the artificial reality; that is, the operation indicator line 15 with the wrist joint 13 as its endpoint is extended toward the display surface 16 until it intersects the display surface 16, and the intersection point is used as the position of the operation icon 14. The operation icon may be transparent and colorless, or colored; if colored, it may be a mark of any appropriate size, and in this embodiment it is a roughly spherical icon, so that the user can clearly see and move the operation icon.
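To make the geometry of steps S1 and S2 concrete, the following is a minimal illustrative sketch, not part of the patent disclosure, of how an implementation might form the operation indicator line from 3D hand keypoints and project it onto the display surface. The NumPy-based helper, its names, and the plane representation (a point on the display surface plus its normal) are assumptions made for illustration only.

    import numpy as np

    def icon_position(wrist, index_joint, plane_point, plane_normal):
        # Step S1: the ray starts at the preset hand position (wrist joint 13)
        # and passes through the index finger joint 12.
        origin = np.asarray(wrist, dtype=float)
        direction = np.asarray(index_joint, dtype=float) - origin
        direction = direction / np.linalg.norm(direction)

        # Step S2: intersect the ray with the display surface 16, modeled here
        # as a plane given by a point on it and its unit normal.
        normal = np.asarray(plane_normal, dtype=float)
        denom = float(np.dot(normal, direction))
        if abs(denom) < 1e-9:
            return None  # ray parallel to the display surface, no intersection
        t = float(np.dot(normal, np.asarray(plane_point, dtype=float) - origin)) / denom
        if t < 0:
            return None  # intersection would lie behind the hand
        return origin + t * direction  # position of the operation icon 14

For example, with the wrist at (0, 0, 0), the index finger joint at (0, 0, 1), and a display plane through (0, 0, 2) with normal (0, 0, 1), this sketch returns (0, 0, 2) as the icon position.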
As shown in FIG. 1 and FIG. 2, step S3 is the process of controlling the operation icon to move accordingly if it is detected that the four fingers of the hand are clenched and the hand moves, so as to move the operation icon to the target button on the display surface. In this process, controlling the operation icon to move accordingly includes: controlling the operation icon to move according to the movement trajectory of the hand while the four fingers remain clenched. The operation icon 14 moves correspondingly with the movement of the hand. For example, suppose the user faces the display surface and the hand is the left hand; when the left hand moves left, right, up, down, forward, or backward, from the user's point of view the icon on the display surface also moves left, right, up, down, forward, or backward with the hand, so that the position of the operation icon can be flexibly controlled by the hand.
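Under the same assumptions, step S3 can be read as re-running the projection every frame while the fist is held, so the icon simply follows the hand's trajectory. The tracker object and its fields below are hypothetical placeholders for any hand-tracking source; icon_position() refers to the sketch above.

    def track_icon(tracker, plane_point, plane_normal):
        # Yields successive positions of the operation icon while the four
        # fingers remain clenched into a fist.
        for frame in tracker:  # hypothetical source of per-frame hand keypoints
            if not frame.four_fingers_clenched:
                continue  # the icon only follows a clenched hand
            icon = icon_position(frame.wrist, frame.index_joint,
                                 plane_point, plane_normal)
            if icon is not None:
                yield icon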
As shown in FIG. 1 and FIG. 3, step S4 is the process in which, if the action of the thumb of the hand touching the index finger is detected, it is determined that the operation icon clicks the target button to complete the interaction. In the process in which the user makes the operation icon click the target button through the action of the thumb touching the index finger, that is, if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction includes:
S41: determining the index-finger joint plane according to the positions of the index finger joints;
S42: determining the distance between the distal finger pulp of the thumb and the index-finger joint plane as the thumb-index finger distance;
S43: if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than the preset click distance threshold, controlling the operation icon to click the target button;
S44: if it is detected that the thumb is lifted, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, completing the action of the operation icon clicking the target button and making that action take effect, thereby completing one interaction. Specifically, the positions of the index finger joints refer to the positions of the individual joints of the index finger.
Specifically, to perform a click-to-confirm or to click another target button, the thumb is positioned as shown in FIG. 3: the distal finger pulp of the thumb 21 (the pulp at the fingertip) approaches the index-finger joint plane 22 so that the thumb-index finger distance becomes small, even zero. When the thumb-index finger distance is smaller than the preset click distance threshold, the operation icon clicks the target button; in other words, when the thumb 21 taps the index-finger joint plane 22, the operation icon clicks the target button. The specific value of the click distance threshold is not limited and can be adjusted at any time. After the click action occurs, the thumb is lifted, that is, the distal finger pulp moves away from the plane determined by the joints of the index finger; when the thumb-index finger distance becomes greater than the preset effective distance threshold, the click action is terminated and the action of the operation icon clicking the target button takes effect. In this way, a one-step interactive operation such as click, cancel, or confirm is completed. The effective distance threshold is likewise not limited and can be adjusted at any time.
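A hedged sketch of this click logic is given below: a click is registered when the thumb-index finger distance falls below the click distance threshold, and the click takes effect once the thumb lifts and the distance exceeds the effective distance threshold. The class, its method names, and the default threshold values are illustrative assumptions, not values taken from the disclosure.

    class ClickDetector:
        def __init__(self, click_threshold=0.01, effective_threshold=0.01):
            # The disclosure allows both thresholds to be equal and to be
            # adjusted at any time; 0.01 m is only a placeholder value.
            self.click_threshold = click_threshold
            self.effective_threshold = effective_threshold
            self.pressed = False

        def update(self, thumb_index_distance):
            # Called once per frame with the current thumb-index finger
            # distance. Returns "click" when the operation icon should click
            # the target button, "effect" when that click should take effect
            # (one completed interaction), or None otherwise.
            if not self.pressed and thumb_index_distance < self.click_threshold:
                self.pressed = True
                return "click"
            if self.pressed and thumb_index_distance > self.effective_threshold:
                self.pressed = False
                return "effect"
            return None

A frame loop would then call update() once per frame and forward the returned events to the user interface.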
The embodiment shown in FIG. 1 and FIG. 4 further includes:
S5: if it is detected that the hand moves while the thumb-index finger distance is smaller than the click distance threshold, determining that an instruction to drag or slide an interface to be dragged has been received, where the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold;
S6: if it is detected that the hand stops moving, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, determining that an instruction to stop dragging or sliding the interface to be dragged has been received.
Optionally, the user may complete an interactive action by dragging and sliding, and the process of responding to an interaction completed by dragging and sliding may include:
S51: determining the index-finger joint plane according to the positions of the index finger joints, where the positions of the index finger joints may be the positions of the individual joints of the index finger;
S52: determining the distance between the distal finger pulp of the thumb 41 and the index-finger joint plane as the thumb-index finger distance;
S53: if it is detected that the thumb-index finger distance is smaller than the preset click distance threshold, controlling the operation icon to press down on the interface to be dragged; and if it is detected that the hand also moves while the thumb-index finger distance is smaller than the preset click distance threshold, controlling the interface to be dragged to be dragged or slid along with the movement of the hand;
S54: if it is detected that the distal finger pulp moves away from the plane determined by the joints of the index finger, then when the thumb-index finger distance is greater than the preset effective distance threshold, determining that the interface to be dragged has moved to the target position, making the action of moving the interface to the target position take effect, and completing one drag or slide interaction.
Specifically, as shown in FIG. 4, in this embodiment, to move the interface to be dragged, the distal finger pulp of the thumb 41 is first brought close to the plane determined by the joints of the index finger, that is, the thumb 41 presses against that plane, so that the thumb-index finger distance becomes smaller. When it is smaller than the preset click distance threshold, the interface to be dragged is pressed at position 411; the user can then move the hand so that the interface to be dragged follows the movement of the hand from position 411 along path 410 to position 412. At this point the interface has reached the target position 412, and the user only needs to lift the thumb so that the distal finger pulp moves away from the plane determined by the joints of the index finger; when the thumb-index finger distance becomes greater than the preset effective distance threshold, the move of the interface to position 412 immediately takes effect, completing one drag or slide interaction.
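The drag-or-slide interaction can be sketched in the same illustrative style: while the thumb-index finger distance stays below the click distance threshold, hand movement drags the pressed interface, and lifting the thumb past the effective distance threshold makes the move take effect. The interface object and its move_by() method are hypothetical placeholders, and icon positions are assumed to be NumPy vectors.

    class DragController:
        def __init__(self, click_threshold=0.01, effective_threshold=0.01):
            self.click_threshold = click_threshold
            self.effective_threshold = effective_threshold
            self.dragging = False
            self.last_icon_pos = None

        def update(self, thumb_index_distance, icon_pos, interface):
            # Press: the distance falls below the click threshold (position 411).
            if not self.dragging and thumb_index_distance < self.click_threshold:
                self.dragging = True
                self.last_icon_pos = icon_pos
                return
            if self.dragging:
                # Release: the thumb lifts beyond the effective threshold, and
                # the move to the current position (e.g. 412) takes effect.
                if thumb_index_distance > self.effective_threshold:
                    self.dragging = False
                    return
                # Drag: the interface follows the icon along the hand's path 410.
                interface.move_by(icon_pos - self.last_icon_pos)
                self.last_icon_pos = icon_pos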
In the embodiments shown in FIG. 1, FIG. 2, FIG. 3, and FIG. 4, the effective distance threshold and the click distance threshold are the same value, which makes it convenient to adjust both thresholds together according to the current situation. The plane determined by the joints of the index finger includes all points covered by the surfaces formed between the distal fingertip of the index finger and an interphalangeal joint of the index finger, between the two interphalangeal joints of the index finger, and between an interphalangeal joint of the index finger and the metacarpophalangeal joint of the index finger. That is, the "press" and "click" gestures can be formed by the distal finger pulp of the thumb together with the distal fingertip of the index finger, with an interphalangeal joint of the index finger, or with the metacarpophalangeal joint of the index finger, and any point on the surfaces formed between these index-finger joints can realize the "press" and "click" gestures. This lowers the precision requirements of gesture control and improves the user experience.
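One way to evaluate the thumb-index finger distance against this whole joint region, rather than against a single point, is sketched below. It assumes the index finger is represented by four ordered 3D keypoints (distal fingertip, two interphalangeal joints, metacarpophalangeal joint) and approximates the region by the segments joining consecutive keypoints; the helper names are assumptions made for illustration.

    import numpy as np

    def point_to_segment_distance(p, a, b):
        # Distance from point p to the segment joining keypoints a and b.
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        return float(np.linalg.norm(p - (a + t * ab)))

    def thumb_index_distance(thumb_pulp, index_keypoints):
        # index_keypoints: shape (4, 3), ordered fingertip, interphalangeal
        # joint, interphalangeal joint, metacarpophalangeal joint.
        p = np.asarray(thumb_pulp, dtype=float)
        pts = np.asarray(index_keypoints, dtype=float)
        return min(point_to_segment_distance(p, pts[i], pts[i + 1])
                   for i in range(len(pts) - 1))

The resulting distance can be fed to the click and drag sketches above, so that a touch anywhere along the index finger counts as a press, which is exactly what lowers the precision requirement.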
It should be noted that when a "click" or "press" operation is performed by gesture, the click can target any button on the display as well as the interface to be dragged, and can thus support any interactive action whose control page can be found on the display; the specific operation details are not repeated here.
As described above, the gesture interaction method based on artificial reality provided by the present disclosure uses a ray formed from the preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as the operation indicator line, where the endpoint of the ray is the preset position of the hand; determines the point where the operation indicator line intersects the display surface in the artificial reality as the position of the operation icon in the artificial reality; if it is detected that the four fingers of the hand are clenched and the hand moves, controls the operation icon to move accordingly, so as to move the operation icon to the target button on the display surface; and if the action of the thumb of the hand touching the index finger is detected, determines that the operation icon clicks the target button to complete the interaction. As a result, any point at which the thumb touches the index finger can trigger the click operation, the precision requirements are low, which reduces manpower and financial costs, and the low precision requirements make the click operation easy to perform, greatly improving the user's interactive experience.
As shown in FIG. 5, the present disclosure further provides a gesture interaction system 100 based on artificial reality that implements the aforementioned gesture interaction method based on artificial reality, including a camera 110 for capturing the motion posture of the hand, a display 120 for displaying the interactive page, and a processor 130 connecting the camera 110 and the display 120, where the processor 130 includes:
an indication configuration unit 131, configured to use the ray formed from the preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as the operation indicator line, where the endpoint of the ray is the preset position of the hand;
an icon configuration unit 132, configured to determine the point where the operation indicator line intersects the display surface in the artificial reality as the position of the operation icon in the artificial reality;
a control operation unit 133, configured to, if it is detected that the four fingers of the hand are clenched and the hand moves, control the operation icon to move accordingly, so as to move the operation icon to the target button on the display surface, where the display 120 is used to display the display surface; and
a response interaction unit 134, configured to, if the action of the thumb of the hand touching the index finger is detected, determine that the operation icon clicks the target button to complete the interaction.
In the embodiment shown in FIG. 5, the response interaction unit 134 includes an action unit 134-1 and a validation unit 134-2, where, when used to determine that the operation icon clicks the target button to complete the interaction upon detecting the action of the thumb of the hand touching the index finger, the response interaction unit is specifically configured to:
determine the index-finger joint plane according to the positions of the index finger joints;
determine the distance between the distal finger pulp of the thumb and the index-finger joint plane as the thumb-index finger distance;
if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than the preset click distance threshold, trigger the action unit 134-1, where the action unit 134-1 is used to control the operation icon to click the target button; and
if it is detected that the thumb is lifted, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, trigger the validation unit 134-2, where the validation unit 134-2 is used to complete the action of the operation icon clicking the target button and to make that action take effect, so as to complete one interaction.
In addition, the system further includes a drag-slide unit 135, which can be integrated in the processor 130 and is configured to:
if it is detected that the hand moves while the thumb-index finger distance is smaller than the click distance threshold, determine that an instruction to drag or slide an interface to be dragged has been received, where the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold; and
if it is detected that the hand stops moving, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, determine that an instruction to stop dragging or sliding the interface to be dragged has been received.
Optionally, the user may complete an interactive action by dragging and sliding, and when responding to an interaction completed by dragging and sliding, the drag-slide unit 135 may be configured to:
determine the index-finger joint plane according to the positions of the index finger joints, where the positions of the index finger joints may be the positions of the individual joints of the index finger;
determine the distance between the distal finger pulp of the thumb and the index-finger joint plane as the thumb-index finger distance;
if it is detected that the thumb-index finger distance is smaller than the preset click distance threshold, control the operation icon to press down on the interface to be dragged; and if it is detected that the hand also moves while the thumb-index finger distance is smaller than the preset click distance threshold, control the interface to be dragged to be dragged or slid along with the movement of the hand; and
if it is detected that the distal finger pulp moves away from the plane determined by the joints of the index finger, then when the thumb-index finger distance is greater than the preset effective distance threshold, determine that the interface to be dragged has moved to the target position, make the action of moving the interface to the target position take effect, and complete one drag or slide interaction.
It can be seen from the above embodiments that, in the gesture interaction system based on artificial reality provided by the present disclosure, the camera 110 first captures the motion posture of the hand; the indication configuration unit 131 in the processor 130 then uses the ray formed from the preset position of the hand to the index finger joint, while the four fingers are clenched into a fist, as the operation indicator line; the icon configuration unit 132 then determines the point where the operation indicator line intersects the display surface of the artificial reality shown on the display as the position of the operation icon in the artificial reality; the control operation unit 133 then, upon detecting that the four fingers of the hand are clenched and the hand moves, controls the operation icon to move accordingly, so as to move the operation icon to the target button on the display surface; and finally, upon detecting the action of the thumb of the hand touching the index finger, the response interaction unit 134 determines that the operation icon clicks the target button to complete the interaction. As a result, any point at which the thumb touches the index finger can trigger the click operation, the precision requirements are low, which reduces manpower and financial costs, and the low precision requirements make the click operation easy to perform, greatly improving the user's interactive experience.
The gesture interaction method and system based on artificial reality proposed by the present disclosure have been described above by way of example with reference to the accompanying drawings. However, those skilled in the art should understand that various improvements can also be made to the gesture interaction method and system based on artificial reality proposed by the present disclosure without departing from the content of the present disclosure. Therefore, the protection scope of the present disclosure should be determined by the content of the appended claims.

Claims (10)

  1. A gesture interaction method based on artificial reality, comprising:
    using a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, wherein the endpoint of the ray is the preset position of the hand;
    determining the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality;
    if it is detected that the four fingers of the hand are clenched and the hand moves, controlling the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface; and
    if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction.
  2. The gesture interaction method based on artificial reality according to claim 1, wherein
    the preset position of the hand is the wrist joint; and
    the finger joint is an index finger joint.
  3. The gesture interaction method based on artificial reality according to claim 1, wherein controlling the operation icon to move accordingly comprises: controlling the operation icon to move according to the movement trajectory of the hand while the four fingers remain clenched.
  4. The gesture interaction method based on artificial reality according to claim 1, wherein, if the action of the thumb of the hand touching the index finger is detected, determining that the operation icon clicks the target button to complete the interaction comprises:
    determining an index-finger joint plane according to the positions of the index finger joints;
    determining the distance between the distal finger pulp of the thumb and the index-finger joint plane as a thumb-index finger distance;
    if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than a preset click distance threshold, controlling the operation icon to click the target button; and
    if it is detected that the thumb is lifted, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than a preset effective distance threshold, completing the action of the operation icon clicking the target button and making that action take effect, thereby completing one interaction.
  5. The gesture interaction method based on artificial reality according to claim 4, wherein the method further comprises:
    if it is detected that the hand moves while the thumb-index finger distance is smaller than the click distance threshold, determining that an instruction to drag or slide an interface to be dragged has been received, wherein the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold; and
    if it is detected that the hand stops moving, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than the preset effective distance threshold, determining that an instruction to stop dragging or sliding the interface to be dragged has been received.
  6. The gesture interaction method based on artificial reality according to claim 5, wherein
    the index-finger joint plane includes all points covered by the surfaces formed between the distal fingertip of the index finger and an interphalangeal joint of the index finger, between the two interphalangeal joints of the index finger, and between an interphalangeal joint of the index finger and the metacarpophalangeal joint of the index finger.
  7. The gesture interaction method based on artificial reality according to claim 5, wherein
    the click distance threshold is equal to the effective distance threshold.
  8. A gesture interaction system based on artificial reality, implementing the gesture interaction method based on artificial reality according to any one of claims 1 to 7, comprising a camera for capturing the motion posture of the hand, a display for displaying an interactive page, and a processor connecting the camera and the display, wherein the processor comprises:
    an indication configuration unit, configured to use a ray formed from a preset position of the hand to a finger joint, while the four fingers are clenched into a fist, as an operation indicator line, wherein the endpoint of the ray is the preset position of the hand;
    an icon configuration unit, configured to determine the point where the operation indicator line intersects a display surface in the artificial reality as the position of an operation icon in the artificial reality, wherein the display is used to display the display surface;
    a control operation unit, configured to, if it is detected that the four fingers of the hand are clenched and the hand moves, control the operation icon to move accordingly, so as to move the operation icon to a target button on the display surface; and
    a response interaction unit, configured to, if the action of the thumb of the hand touching the index finger is detected, determine that the operation icon clicks the target button to complete the interaction.
  9. The gesture interaction system based on artificial reality according to claim 8, wherein the response interaction unit comprises an action unit and a validation unit, and wherein, when used to determine that the operation icon clicks the target button to complete the interaction upon detecting the action of the thumb of the hand touching the index finger, the response interaction unit is specifically configured to:
    determine an index-finger joint plane according to the positions of the index finger joints;
    determine the distance between the distal finger pulp of the thumb and the index-finger joint plane as a thumb-index finger distance;
    if it is detected that the distal finger pulp of the thumb approaches the index-finger joint plane and the thumb-index finger distance is smaller than a preset click distance threshold, trigger the action unit, wherein the action unit is used to control the operation icon to click the target button; and
    if it is detected that the thumb is lifted, the distal finger pulp moves away from the index-finger joint plane, and the thumb-index finger distance is greater than a preset effective distance threshold, trigger the validation unit, wherein the validation unit is used to complete the action of the operation icon clicking the target button and to make that action take effect, so as to complete one interaction.
  10. The gesture interaction system based on artificial reality according to claim 8, further comprising a drag-slide unit, wherein the drag-slide unit is configured to:
    if it is detected that the hand moves while the thumb-index finger distance is smaller than a preset click distance threshold, determine that an instruction to drag or slide an interface to be dragged has been received, wherein the interface to be dragged is determined by the position of the hand when the thumb-index finger distance is smaller than the click distance threshold; and
    if it is detected that the hand stops moving, the distal finger pulp of the thumb moves away from the index-finger joint plane, and the thumb-index finger distance is greater than a preset effective distance threshold, determine that an instruction to stop dragging or sliding the interface to be dragged has been received.
PCT/CN2022/081494 2021-08-11 2022-03-17 Gesture interaction method and system based on artificial reality WO2023015886A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22854891.3A EP4345583A1 (en) 2021-08-12 2022-03-17 Gesture interaction method and system based on artificial reality
US18/400,624 US20240134461A1 (en) 2021-08-11 2023-12-29 Gesture interaction method and system based on artificial reality

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110926659.X 2021-08-11
CN202110926659.XA CN113885695A (zh) 2021-08-12 2021-08-12 基于人工现实的手势交互方法、系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/400,624 Continuation US20240134461A1 (en) 2021-08-11 2023-12-29 Gesture interaction method and system based on artificial reality

Publications (1)

Publication Number Publication Date
WO2023015886A1 true WO2023015886A1 (zh) 2023-02-16

Family

ID=79011028

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/081494 WO2023015886A1 (zh) 2021-08-11 2022-03-17 基于人工现实的手势交互方法、系统

Country Status (4)

Country Link
US (1) US20240134461A1 (zh)
EP (1) EP4345583A1 (zh)
CN (1) CN113885695A (zh)
WO (1) WO2023015886A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113885695A (zh) * 2021-08-12 2022-01-04 青岛小鸟看看科技有限公司 基于人工现实的手势交互方法、系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150234467A1 (en) * 2014-02-18 2015-08-20 Sony Corporation Method and apparatus for gesture detection and display control
CN108052202A (zh) * 2017-12-11 2018-05-18 深圳市星野信息技术有限公司 3D interaction method and apparatus, computer device, and storage medium
CN112000224A (zh) * 2020-08-24 2020-11-27 北京华捷艾米科技有限公司 Gesture interaction method and system
CN113190109A (zh) * 2021-03-30 2021-07-30 青岛小鸟看看科技有限公司 Input control method and apparatus for head-mounted display device, and head-mounted display device
CN113885695A (zh) * 2021-08-12 2022-01-04 青岛小鸟看看科技有限公司 Gesture interaction method and system based on artificial reality

Also Published As

Publication number Publication date
CN113885695A (zh) 2022-01-04
EP4345583A1 (en) 2024-04-03
US20240134461A1 (en) 2024-04-25

Similar Documents

Publication Publication Date Title
Wang et al. Detecting and leveraging finger orientation for interaction with direct-touch surfaces
JP6660309B2 (ja) Sensor correlation for pen and touch-sensitive computing device interaction
Nacenta et al. Separability of spatial manipulations in multi-touch interfaces.
Weiss et al. SLAP widgets: bridging the gap between virtual and physical controls on tabletops
JP2017518572A (ja) Multi-device multi-user sensor correlation for pen and computing device interaction
JPH07175587A (ja) Information processing device
US20240134461A1 (en) Gesture interaction method and system based on artificial reality
CN113608619A (zh) Bare-hand operation method and system in augmented reality
Dhawale et al. Bare-hand 3D gesture input to interactive systems
Wolf et al. PinchPad: performance of touch-based gestures while grasping devices
Tsai et al. Segtouch: Enhancing touch input while providing touch gestures on screens using thumb-to-index-finger gestures
TWI227445B (en) A method capable of promptly switching operation mode of touch device and device thereof
TW200807284A (en) Programmable touch system
US8327294B2 (en) Method and system to reduce workload and skills required in usage of mouse or other pointing devices
Fujinawa et al. Occlusion-aware hand posture based interaction on tabletop projector
Zand et al. TiltWalker: operating a telepresence robot with one-hand by tilt controls on a smartphone
TWI547862B (zh) Multi - point handwriting input control system and method
Yamada et al. A reactive presentation support system based on a slide object manipulation method
Kudale et al. Human computer interaction model based virtual whiteboard: A review
WO2012114791A1 (ja) Gesture operation system
Schlattmann et al. Efficient bimanual symmetric 3D manipulation for markerless hand-tracking
CN112799580A (zh) Display control method and electronic device
Hisamatsu et al. A novel click-free interaction technique for large-screen interfaces
TWI780663B (zh) 互動式觸控系統的操作判斷方法
Schlattmann et al. Efficient bimanual symmetric 3d manipulation for bare-handed interaction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22854891

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022854891

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022854891

Country of ref document: EP

Effective date: 20231229

NENP Non-entry into the national phase

Ref country code: DE