WO2023016174A1 - Gesture operation method, apparatus, device and medium - Google Patents

Gesture operation method, apparatus, device and medium

Info

Publication number
WO2023016174A1
Authority
WO
WIPO (PCT)
Prior art keywords
ball
virtual
hand
fingertip
palm
Prior art date
Application number
PCT/CN2022/105375
Other languages
English (en)
French (fr)
Inventor
程文浩
王慧谱
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司
Priority to US18/088,213 priority Critical patent/US11803248B2/en
Publication of WO2023016174A1 publication Critical patent/WO2023016174A1/zh


Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03549 Trackballs
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T7/50 Depth or shape recovery
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G06V20/20 Scene-specific elements in augmented reality scenes
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G06T2219/2004 Aligning objects, relative positioning of parts

Definitions

  • the present invention relates to the technical field of virtual reality, and more specifically, to a gesture operation method, device, equipment and medium.
  • Due to technological advances and the diversified development of market demand, virtual reality systems are becoming increasingly common and are applied in many fields, such as computer games, health and safety, industry, and education and training. To give a few examples, mixed virtual reality systems are being integrated into mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university laboratories, student classrooms, hospital exercise gyms and other corners of life.
  • the purpose of the present invention is to provide a gesture operation method to address problems with the current interactive gestures of "clicking" with the index finger of one hand, "pinching" with the thumb and index finger, and "confirming" by making a fist:
  • these gestures place high precision requirements on hand tracking, so the human and financial resources invested are necessarily large; moreover, when the virtual coordinates of the hand are collected, the high precision requirements usually make the positioning of the important hand joints unstable, which reduces interaction accuracy and results in a relatively poor experience.
  • a gesture operation method provided by the present invention includes:
  • determining the spatial coordinates of the virtual hand corresponding to the hand in the virtual space includes:
  • the palm ball is bound at the palm position of the virtual hand
  • the fingertip ball is bound at the fingertip position of the virtual hand, including:
  • a palm ball is set on the virtual position of the palm, and a fingertip ball is set on the virtual position of the fingertip;
  • the palm ball always moves along with the movement of the virtual position of the palm;
  • the fingertip ball always moves along with the virtual position of the fingertip.
  • the fingertip balls include little finger balls, ring finger balls and middle finger balls.
  • performing corresponding operations in the virtual space according to the straight-line distance between the fingertip ball and the palm ball including:
  • the straight-line distance represents the straight-line distance formed between the fingertip ball and the palm ball when fingers other than the thumb move closer to the palm to make a fist;
  • it also includes: binding a thumb ball at the position of the thumb of the virtual hand, and binding a cuboid of the index finger at the position of the index finger of the virtual hand; wherein,
  • Binding the thumb ball on the position of the thumb of the virtual hand, and binding the index finger cuboid on the position of the index finger of the virtual hand including:
  • a thumb ball is set on the virtual position of the thumb, and a cuboid of the index finger is set on the virtual position of the index finger;
  • the thumb ball always moves along with the movement of the virtual position of the thumb
  • the cuboid of the index finger always moves along with the movement of the virtual position of the index finger.
  • it also includes:
  • thumb ball-index finger cuboid distance represents the distance formed between the thumb ball and the index finger cuboid when the thumb moves closer to the index finger for pinching
  • an operation cursor at a corresponding position of the virtual hand is triggered to perform a corresponding operation in the virtual space.
  • the present invention also provides a gesture operation device to implement the above-mentioned gesture operation method, including:
  • An information acquisition module configured to acquire depth information of the user's hand
  • a coordinate corresponding module configured to determine the spatial coordinates of the virtual hand corresponding to the hand in the virtual space according to the depth information of the hand;
  • a tracking binding module configured to bind tracking balls on the virtual hand according to the spatial coordinates; wherein the palm ball is bound at the palm position of the virtual hand and the fingertip ball is bound at the fingertip position of the virtual hand, and the volume of the palm ball is larger than that of the fingertip ball;
  • the interactive execution module is configured to execute corresponding operations in the virtual space according to the straight-line distance between the fingertip ball and the palm ball.
  • the coordinate correspondence module includes:
  • a real position calculation unit configured to obtain the relative distance between the hand and the sensor; obtain the real position of the wrist of the hand according to the position of the sensor and the relative distance;
  • the virtual coordinate corresponding unit is used to use the virtual coordinates of the sensor as a reference to map the real position of the wrist into the virtual space to form wrist space coordinates; perform calculation and filling according to the wrist space coordinates and hand joint information to form a virtual hand, and obtain the space coordinates of the virtual hand.
  • it also includes: a pinching operation unit;
  • the pinching operation unit is used to obtain the thumb ball-index finger cuboid distance, wherein the thumb ball-index finger cuboid distance represents the distance between the thumb ball and the index finger cuboid when the thumb moves closer to the index finger for pinching. According to the distance between the thumb ball and the index finger cuboid, the operation cursor at the corresponding position of the virtual hand is triggered to perform the corresponding operation in the virtual space.
  • the present invention also provides an electronic device, comprising:
  • a processor and a memory; the memory is used to store a computer program, and the processor is used to call and run the computer program stored in the memory to execute the gesture operation method described in the foregoing embodiments.
  • the present invention also provides a computer-readable storage medium for storing a computer program, and the computer program causes a computer to execute the gesture operation method as described in the foregoing embodiments.
  • the present invention also provides a computer program product containing program instructions.
  • when the program instructions are run on the electronic device, the electronic device is made to execute the gesture operation method described in the foregoing embodiments.
  • the gesture operation method, device, equipment and medium provided by the present invention obtain the depth information of the user's hand and, according to that depth information, determine the spatial coordinates of the virtual hand corresponding to the hand in the virtual space; tracking balls are then bound on the virtual hand according to the spatial coordinates, with the palm ball bound at the palm position of the virtual hand and the fingertip balls bound at the fingertip positions of the virtual hand.
  • the volume of the palm ball is larger than that of the fingertip balls; the corresponding operation is then performed in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
  • This gesture operation method introduces small balls that move with the hand to achieve bare-hand operation. It not only offers higher stability but also places lower requirements on tracking precision, thereby reducing the manpower and financial resources invested; and because the precision requirements are low, click operations are easy to perform, which greatly improves the user's interactive experience.
  • FIG. 1 is a flowchart of a gesture operation method according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of binding a trackball in a gesture operation method according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a thumb ball and an index finger cuboid in a gesture operation method according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a virtual ray in a gesture operation method according to an embodiment of the present invention.
  • FIG. 5 is a schematic block diagram of a gesture operation device according to an embodiment of the present invention.
  • Fig. 6 is a schematic block diagram of an electronic device according to an embodiment of the present invention.
  • the present invention provides a gesture operation method, device, equipment and medium. Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
  • FIG. 1 shows the gesture operation method of the embodiment of the present invention.
  • FIG. 6 is an exemplary representation of an electronic device according to an embodiment of the present invention.
  • the gesture operation method provided by the present invention includes:
  • S1 Obtain the depth information of the user's hand;
  • S2 Determine the spatial coordinates of the virtual hand corresponding to the hand in the virtual space according to the depth information of the hand;
  • S3 Bind tracking balls on the virtual hand according to the spatial coordinates; wherein the palm ball is bound at the palm position of the virtual hand and the fingertip ball is bound at the fingertip position of the virtual hand, and the volume of the palm ball is larger than that of the fingertip ball;
  • S4 Perform the corresponding operation in the virtual space according to the straight-line distance between the fingertip ball and the palm ball.
  • when determining that the palm ball is larger than the fingertip ball, the comparison can be based not only on volume but also on the diameter or radius of the balls; that is, the diameter of the palm ball is greater than the diameter of the fingertip ball, or the radius of the palm ball is greater than the radius of the fingertip ball, and so on, which is not specifically limited in the present application.
  • step S1 obtains the depth information of the user's hand, which can be obtained through a depth camera or any other camera capable of obtaining depth information, which is not specifically limited in this application.
  • step S2 determines the spatial coordinates of the virtual hand corresponding to the hand in the virtual space according to the depth information of the hand, which can be realized by the following steps:
  • S11 Obtain the relative distance between the hand and the sensor;
  • S12 Obtain the real position of the wrist of the hand according to the position of the sensor and the relative distance;
  • S13 Using the virtual coordinates of the sensor as a reference, map the real position of the wrist into the virtual space to form wrist space coordinates;
  • S14 Perform calculation and filling according to the wrist space coordinates and hand joint information to form a virtual hand, and obtain the space coordinates of the virtual hand.
  • the sensor refers to the sensor in the VR system.
  • the present application may first obtain the positional relationship between the hand and the sensor through a depth camera or another type of camera, so as to determine the relative distance between the hand and the sensor from that positional relationship. Because both the real position of the sensor and the virtual coordinates of the sensor are known, the real position of the wrist of the user's hand can be calculated from the sensor's real position and the relative distance; then, using the known virtual coordinates of the sensor as a reference, the real position of the wrist is mapped into the virtual space to obtain the wrist space coordinates.
  • the size of the hand and the positional relationship between the hand joints are known, so the coordinates of the hand joints in the virtual space can be calculated from the wrist space coordinates and filled in to form a virtual hand; the spatial coordinates of the entire virtual hand in the VR system are thereby obtained.
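  • As a minimal illustrative sketch (not the patent's implementation), the mapping described above can be expressed as follows: the sensor is the common reference between real and virtual space, the wrist keeps its sensor-relative offset when mapped, and the joints are filled in from a known hand model. All function names and the fixed joint offsets below are assumptions for illustration.

```python
import numpy as np

def wrist_to_virtual(sensor_real_pos, sensor_virtual_pos, hand_to_sensor_offset):
    """Map the wrist's real position into virtual space, using the sensor
    (whose real position and virtual coordinates are both known) as reference."""
    sensor_real = np.asarray(sensor_real_pos, dtype=float)
    wrist_real = sensor_real + np.asarray(hand_to_sensor_offset, dtype=float)
    # The wrist keeps the same offset from the sensor in virtual space.
    return np.asarray(sensor_virtual_pos, dtype=float) + (wrist_real - sensor_real)

def fill_virtual_hand(wrist_virtual, joint_offsets):
    """Fill in the virtual hand: each joint's virtual coordinate is the wrist
    coordinate plus a known offset taken from the hand-joint model."""
    return {name: wrist_virtual + np.asarray(off, dtype=float)
            for name, off in joint_offsets.items()}

# Hypothetical usage: sensor at the real origin, mapped to (10, 0, 0) virtually.
wrist = wrist_to_virtual([0, 0, 0], [10, 0, 0], [0.2, 0.1, 0.5])
hand = fill_virtual_hand(wrist, {"palm": [0.0, 0.0, 0.1]})
```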
  • step S3 is the process of binding the trackball on the virtual hand.
  • the process of binding the palm ball (big ball) at the palm position of the virtual hand as shown in Figure 2 and binding the fingertip ball (small ball) at the fingertip position of the virtual hand includes:
  • the palm ball always moves with the movement of the palm virtual position coordinates
  • the fingertip ball always moves with the movement of the fingertip virtual position coordinates
  • the fingertip balls include the little finger fingertip ball, the ring finger fingertip ball and the middle finger fingertip ball; that is, the positional relationship between the little finger, ring finger and middle finger and the palm is judged through the fingertip balls, thereby judging whether the user is making a fist.
  • step S4 is to perform corresponding operations in the virtual space according to the straight-line distance between the fingertip ball and the palm ball.
  • the implementation process of S4 in this embodiment includes:
  • S31 Acquire the straight-line distance between the fingertip ball and the palm ball;
  • S32 Perform the corresponding operation in the virtual space according to the straight-line distance between the fingertip ball and the palm ball.
  • the fingertip ball and the palm ball are virtual balls, which may be colored or colorless.
  • for ease of illustration, the palm ball and fingertip balls are shown in accompanying Figure 2, but in this embodiment the palm ball and fingertip balls are colorless, transparent, invisible virtual spheres that move with the part of the hand to which they are bound, increasing the stability of the coordinate information of the virtual hand and thereby ensuring the accuracy of determining the posture of the virtual hand.
  • the present application acquires the straight-line distance between the fingertip ball and the palm ball in real time and compares it with a preset trigger threshold. If the straight-line distance between the fingertip ball and the palm ball is less than the preset trigger threshold, the user intends to perform an interaction, and the first trigger condition is satisfied. When the first trigger condition is met, the application can automatically and immediately execute the operation corresponding to the first trigger condition, that is, perform the corresponding operation in the virtual space. In this way, the response speed of bare-hand interaction is improved.
  • the preset trigger threshold can be flexibly set according to actual application needs, and there is no specific limitation here.
  • the VR system automatically executes the interaction corresponding to the first trigger condition.
  • the correspondence between interactive operations and the first trigger condition is preset in advance, and the specific preset process is not described here; the first trigger condition can correspond to any interactive operation, such as entering or exiting a page, clicking any cursor or icon on the display interface in the VR system, or even opening and closing operations.
  • the opening and closing includes, but is not limited to, turning the display on or off; that is, when the user makes a fist, the VR system can automatically perform any of the above interactive operations, such as entering or exiting a page, clicking any cursor or icon on the display interface in the VR system, or opening and closing operations.
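  • The first trigger condition described above — every non-thumb fingertip ball coming within a preset straight-line distance of the palm ball — can be sketched as follows. This is an illustrative sketch only; the function name and the threshold value are assumptions, not values from the patent.

```python
import numpy as np

# Preset trigger threshold (metres); the patent leaves this flexible,
# so 0.05 here is purely an assumed example value.
TRIGGER_THRESHOLD = 0.05

def is_fist(palm_ball_center, fingertip_ball_centers, threshold=TRIGGER_THRESHOLD):
    """First trigger condition: all fingertip balls (little, ring and middle
    finger) are within the threshold straight-line distance of the palm ball."""
    palm = np.asarray(palm_ball_center, dtype=float)
    return all(
        np.linalg.norm(np.asarray(tip, dtype=float) - palm) < threshold
        for tip in fingertip_ball_centers
    )
```

  • In use, the ball centers would be read from the tracked virtual hand each frame, and a `True` result would fire the preset interactive operation (enter/exit page, click, open/close, and so on).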
  • the gesture operation method of the embodiment of the present invention further includes: S5: bind the thumb ball at the position of the thumb of the virtual hand, and bind the cuboid of the index finger at the position of the index finger of the virtual hand;
  • the application binds the thumb ball (the oblate sphere on the thumb in Fig. 3) at the position of the thumb of the virtual hand, and binds the index finger cuboid at the position of the index finger of the virtual hand, including:
  • S512 setting the thumb ball on the virtual position of the thumb, and setting the cuboid of the index finger on the virtual position of the index finger;
  • the thumb ball always moves with the coordinates of the virtual position of the thumb;
  • the cuboid of the index finger always moves with the coordinates of the virtual position of the index finger;
  • the index finger cuboid is not a cuboid in the traditional sense, but a cuboid-shaped marker that wraps the index finger, as shown in Figure 3; it may be slightly flatter or more irregular than a true cuboid, which is not specifically limited in this application.
  • the gesture operation method of the present application further includes:
  • the operation cursor at the corresponding position of the virtual hand is triggered to perform corresponding operations in the virtual space.
  • the specific implementation process is: if the thumb ball-index finger cuboid distance is smaller than the preset pinching threshold, the second trigger condition is established; then, according to the second trigger condition, the operation cursor at the corresponding position of the virtual hand on the display in the VR system is triggered to perform the VR interactive operation. If the distance between the thumb ball and the index finger cuboid is greater than or equal to the preset pinching threshold, the second trigger condition is not satisfied, and S521 continues to be executed.
  • the preset pinching threshold can be flexibly set according to actual application needs, and there is no specific limitation here.
  • the interactive operation corresponding to the second trigger condition is directly and automatically started.
  • the interactive operation corresponding to the second trigger condition may be any operation with properties such as "click”, "press”, etc., which is not specifically limited here.
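  • The second trigger condition — the thumb ball approaching the index finger cuboid within a preset pinching threshold — can be sketched as below. As an assumption for illustration, the index finger cuboid is modeled here as an axis-aligned box, and the gap is measured from the thumb ball's surface to that box; the patent does not prescribe this particular geometry or the threshold value.

```python
import numpy as np

def point_to_box_distance(point, box_min, box_max):
    """Distance from a point to the nearest surface of an axis-aligned cuboid
    (0.0 if the point lies inside the box)."""
    p = np.asarray(point, dtype=float)
    clamped = np.clip(p, np.asarray(box_min, dtype=float),
                      np.asarray(box_max, dtype=float))  # nearest point on/in box
    return float(np.linalg.norm(p - clamped))

def is_pinch(thumb_center, thumb_radius, box_min, box_max, pinch_threshold=0.01):
    """Second trigger condition: gap between the thumb ball's surface and the
    index finger cuboid is below the preset pinching threshold."""
    gap = point_to_box_distance(thumb_center, box_min, box_max) - thumb_radius
    return max(gap, 0.0) < pinch_threshold
```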
  • a virtual ray is determined by the specific position of the user's joints and hands, and the virtual ray can be colored or colorless in practical applications.
  • for ease of illustration, the virtual ray is shown in Figure 4 as a shaped, colored line starting from the hand and ending at the display; in this embodiment, the virtual ray intersects the display in the VR system, and the intersection is the location of the virtual cursor on the display.
  • the virtual ray also moves with the hand, so that the user moves the virtual cursor in the display by moving the hand, that is, selects which position in the display to click by moving the hand.
  • the virtual cursor can click any clickable page button on the display, such as an APP icon on the page, an OK button or a Cancel button; the operations performed after the virtual cursor clicks are not repeated here.
  • the second trigger condition can also correspond to interactive operations such as dragging and sliding content in the display interface; that is, the user performs a pinch action, and the virtual cursor clicks on the content in the display.
  • the user can move the entire hand, so that the virtual cursor drives the clicked content to move with the movement of the hand.
  • the user lifts the thumb to terminate the pinching action, and the dragging or sliding operation can take effect.
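  • The virtual-ray behaviour described above amounts to a ray-plane intersection: a ray cast from the hand meets the plane of the virtual display, and the hit point is where the cursor sits. The sketch below is an assumed formulation (the patent does not give equations); the function name and plane parameterisation are illustrative.

```python
import numpy as np

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the point where the hand's ray meets the display plane,
    or None if the ray is parallel to the plane or points away from it."""
    o = np.asarray(ray_origin, dtype=float)
    d = np.asarray(ray_dir, dtype=float)
    p0 = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None          # ray parallel to the display plane
    t = ((p0 - o) @ n) / denom
    if t < 0:
        return None          # display is behind the hand
    return o + t * d         # cursor position on the display
```

  • Recomputing this intersection each frame as the hand moves is what makes the cursor follow the hand, so a pinch-hold-move-release sequence becomes a drag or slide of the clicked content.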
  • the interactive operations corresponding to the first trigger condition and the second trigger condition can be set in advance, and the object of an interactive operation can be any target button or interface that can be operated with bare hands in any VR system; more detailed operations are not repeated here.
  • the gesture operation method obtains the depth information of the user's hand, determines the spatial coordinates of the virtual hand corresponding to the hand in the virtual space according to the depth information, and then binds tracking balls on the virtual hand according to the spatial coordinates, where the palm ball is bound at the palm of the virtual hand and the fingertip balls are bound at the fingertips of the virtual hand. The volume of the palm ball is larger than that of the fingertip balls, and the corresponding operation is then performed in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
  • This gesture operation method introduces small balls that move with the hand to achieve bare-hand operation. It not only offers higher stability but also requires less precision, thereby reducing manpower and financial resources; and because the precision requirements are low, click operations are easy to perform, which greatly improves the user's interactive experience.
  • the present invention also provides a gesture operation device 100 to implement the above-mentioned gesture operation method, including:
  • An information acquisition module 101 configured to acquire depth information of the user's hand
  • a coordinate corresponding module 102 configured to determine the spatial coordinates of the virtual hand corresponding to the hand in the virtual space according to the depth information of the hand;
  • the tracking binding module 103 is used to bind tracking balls on the virtual hand according to the space coordinates; wherein the palm ball is bound at the palm position of the virtual hand and the fingertip ball is bound at the fingertip position of the virtual hand, and the volume of the palm ball is larger than that of the fingertip ball;
  • the interaction execution module 104 is configured to execute corresponding operations in the virtual space according to the straight-line distance between the fingertip ball and the palm ball.
  • the coordinate correspondence module 102 includes:
  • the real position calculation unit 102-1 is used to obtain the relative distance between the hand and the sensor; according to the position of the sensor and the relative distance, obtain the real position of the wrist of the hand;
  • the virtual coordinate corresponding unit 102-2 is configured to use the virtual coordinates of the sensor as a reference to map the real position of the wrist into the virtual space to form wrist space coordinates; according to the wrist space coordinates and hand joint information Computational filling is performed to form a virtual hand, and spatial coordinates of the virtual hand are obtained.
  • the gesture operation device 100 provided by the present invention further includes: a pinch operation unit 105 (not shown in the figure);
  • the pinching operation unit 105 is used to obtain the thumb ball-index finger cuboid distance, wherein, the thumb ball-index finger cuboid distance represents the distance between the thumb ball and the index finger cuboid when the thumb moves closer to the index finger for pinching. According to the distance between the thumb ball and the index finger cuboid, the operation cursor at the corresponding position of the virtual hand is triggered to perform the corresponding operation in the virtual space.
  • the gesture operation device obtains the depth information of the user's hand and determines, according to that depth information, the spatial coordinates of the virtual hand corresponding to the hand in the virtual space; tracking balls are then bound on the virtual hand according to the spatial coordinates, with the palm ball bound at the palm position of the virtual hand and the fingertip balls bound at the fingertip positions of the virtual hand.
  • the volume of the palm ball is larger than that of the fingertip balls; corresponding operations are then performed in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
  • This gesture operation device introduces small balls that move with the hand to achieve bare-hand operation. It not only offers higher stability but also requires less precision, reducing manpower and financial resources; and because the precision requirements are low, click operations are easy to perform, which greatly improves the user's interactive experience.
  • Fig. 6 is a schematic block diagram of an electronic device according to an embodiment of the present invention. As shown in FIG. 6, the electronic device 200 may include:
  • a memory 210 and a processor 220; the memory 210 is used to store computer programs and transmit the program codes to the processor 220.
  • the processor 220 can invoke and run a computer program from the memory 210 to implement the gesture operation method in the embodiment of the present application.
  • the processor 220 may be configured to execute the above embodiment of the gesture operation method according to the instructions in the computer program.
  • the processor 220 may include but not limited to:
  • a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), and so on.
  • the memory 210 includes but is not limited to:
  • the non-volatile memory can be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
  • the volatile memory can be Random Access Memory (RAM), which acts as external cache memory.
  • By way of illustration rather than limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM) and direct Rambus RAM (DR RAM).
  • the computer program can be divided into one or more modules, and the one or more modules are stored in the memory 210 and executed by the processor 220 to complete the gesture operation method provided by this application.
  • the one or more modules may be a series of computer program instruction segments capable of accomplishing specific functions, and the instruction segments are used to describe the execution process of the computer program in the electronic device.
  • the electronic device may also include:
  • the transceiver 230 can be connected to the processor 220 or the memory 210 .
  • the processor 220 can control the transceiver 230 to communicate with other devices; specifically, it can send information or data to other devices, or receive information or data sent by other devices.
  • Transceiver 230 may include a transmitter and a receiver.
  • the transceiver 230 may further include antennas, and the number of antennas may be one or more.
  • the components of the electronic device are connected through a bus system, which includes a power bus, a control bus, and a status signal bus in addition to a data bus.
  • the present application also provides a computer storage medium, on which a computer program is stored, and when the computer program is executed by a computer, the computer can execute the methods of the above method embodiments.
  • the embodiment of the present application also provides a computer program product including program instructions, which, when the program instructions are run on the electronic device, cause the electronic device to execute the method of the above method embodiment.
  • the computer program product includes one or more computer instructions.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a digital video disc (digital video disc, DVD)), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.
  • the modules and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division into modules is only a division by logical function; in actual implementation there may be other ways of division, e.g., multiple modules or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or modules may be in electrical, mechanical or other forms.
  • a module described as a separate component may or may not be physically separate, and a component displayed as a module may or may not be a physical module; that is, it may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist physically alone, or two or more modules may be integrated into one module.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A gesture operation method, apparatus, device, and medium, relating to the technical field of virtual reality. The method includes: acquiring depth information of a user's hand (S1); determining, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space (S2); binding tracking balls to the virtual hand according to the spatial coordinates, where a palm ball is bound at the palm position of the virtual hand and fingertip balls are bound at the fingertip positions, the palm ball being larger in volume than the fingertip balls (S3); and executing a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball (S4). By introducing small balls that move with the hand, the gesture operation method achieves bare-hand operation with higher stability and lower precision requirements, reducing labor and financial cost; moreover, the lower precision requirement makes click operations easier to perform, greatly improving the user's interactive experience.

Description

Gesture operation method, apparatus, device, and medium
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese Patent Application No. 202110926646.2, entitled "Bare-hand operation method and system in augmented reality" and filed on August 12, 2021, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present invention relates to the technical field of virtual reality, and more particularly to a gesture operation method, apparatus, device, and medium.
BACKGROUND
With advances in technology and the diversified development of market demand, virtual reality systems are becoming increasingly common and are applied in many fields, such as computer games, health and safety, industry, and education and training. To give a few examples, mixed virtual reality systems are being integrated into every corner of life: mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university laboratories, student classrooms, hospital exercise and fitness rooms, and more.
With the development of the artificial reality field, user interaction with content in VR, AR, and MR scenarios is indispensable, and conveniently operated "bare-hand" gesture interaction has become a development trend. Most existing gesture interactions in such scenarios are the single-finger index "click" gesture, the thumb-and-index "pinch" gesture, the fist "confirm" gesture, and so on. However, interaction that relies purely on an index-finger "click", a thumb-and-index "pinch", or a fist "confirm" places high precision requirements on gesture recognition, which inevitably entails a large investment of labor and money; moreover, when collecting the hand's virtual coordinates, the high precision requirement often makes the localization of key hand joints unstable, resulting in relatively poor interaction accuracy and user experience.
Therefore, there is an urgent need for a solution that reduces the investment of labor and money, improves gesture recognition accuracy, and improves the stability of gesture operation.
SUMMARY
In view of the above problems, an object of the present invention is to provide a gesture operation method, so as to solve the problem that interaction gestures relying purely on an index-finger "click", a thumb-and-index "pinch", or a fist "confirm" place high precision requirements on gesture recognition, inevitably entailing a large investment of labor and money, and that, when collecting the hand's virtual coordinates, the high precision requirement often makes the localization of key hand joints unstable, resulting in relatively poor interaction accuracy and user experience.
The present invention provides a gesture operation method, including:
acquiring depth information of a user's hand;
determining, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space;
binding tracking balls to the virtual hand according to the spatial coordinates, where a palm ball is bound at the palm position of the virtual hand and fingertip balls are bound at the fingertip positions of the virtual hand, the volume of the palm ball being larger than that of the fingertip balls; and
executing a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
Preferably, determining, according to the depth information of the hand, the spatial coordinates of the virtual hand corresponding to the hand in the virtual space includes:
acquiring the relative distance between the hand and a sensor;
acquiring the real wrist position of the hand according to the position of the sensor and the relative distance;
mapping the real wrist position into the virtual space with reference to the virtual coordinates of the sensor to form wrist spatial coordinates; and
performing calculation and filling according to the wrist spatial coordinates and hand joint information to form the virtual hand, and acquiring the spatial coordinates of the virtual hand.
Preferably, binding the palm ball at the palm position of the virtual hand and the fingertip balls at the fingertip positions of the virtual hand includes:
acquiring the virtual palm position and the virtual fingertip positions of the virtual hand; and
placing the palm ball at the virtual palm position and the fingertip balls at the virtual fingertip positions,
where the palm ball always moves with the virtual palm position, and
the fingertip balls always move with the virtual fingertip positions.
Preferably, the fingertip balls include a little-finger fingertip ball, a ring-finger fingertip ball, and a middle-finger fingertip ball.
Preferably, executing the corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball includes:
acquiring the fingertip-ball-to-palm-ball straight-line distance, where the straight-line distance represents the distance formed between a fingertip ball and the palm ball when the fingers other than the thumb close toward the palm in a fist-making motion; and
executing the corresponding operation in the virtual space according to the fingertip-ball-to-palm-ball straight-line distance.
Preferably, the method further includes: binding a thumb ball at the thumb position of the virtual hand and binding an index-finger cuboid at the index-finger position of the virtual hand, where
binding the thumb ball at the thumb position of the virtual hand and the index-finger cuboid at the index-finger position of the virtual hand includes:
acquiring the virtual thumb position and the virtual index-finger position; and
placing the thumb ball at the virtual thumb position and the index-finger cuboid at the virtual index-finger position,
where the thumb ball always moves with the virtual thumb position, and
the index-finger cuboid always moves with the virtual index-finger position.
Preferably, the method further includes:
acquiring the thumb-ball-to-index-cuboid spacing, where the thumb-ball-to-index-cuboid spacing represents the gap formed between the thumb ball and the index-finger cuboid when the thumb closes toward the index finger in a pinching motion; and
triggering, according to the thumb-ball-to-index-cuboid spacing, an operation cursor at the position corresponding to the virtual hand so as to execute a corresponding operation in the virtual space.
The present invention further provides a gesture operation apparatus implementing the gesture operation method described above, including:
an information acquisition module, configured to acquire depth information of a user's hand;
a coordinate mapping module, configured to determine, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space;
a tracking and binding module, configured to bind tracking balls to the virtual hand according to the spatial coordinates, where a palm ball is bound at the palm position of the virtual hand and fingertip balls are bound at the fingertip positions of the virtual hand, the volume of the palm ball being larger than that of the fingertip balls; and
an interaction execution module, configured to execute a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
Preferably, the coordinate mapping module includes:
a real-position calculation unit, configured to acquire the relative distance between the hand and a sensor, and to acquire the real wrist position of the hand according to the position of the sensor and the relative distance; and
a virtual-coordinate mapping unit, configured to map the real wrist position into the virtual space with reference to the virtual coordinates of the sensor to form wrist spatial coordinates, to perform calculation and filling according to the wrist spatial coordinates and hand joint information to form the virtual hand, and to acquire the spatial coordinates of the virtual hand.
Preferably, the apparatus further includes a pinch operation unit,
configured to acquire the thumb-ball-to-index-cuboid spacing, where the spacing represents the gap formed between the thumb ball and the index-finger cuboid when the thumb closes toward the index finger in a pinching motion, and to trigger, according to the thumb-ball-to-index-cuboid spacing, an operation cursor at the position corresponding to the virtual hand so as to execute a corresponding operation in the virtual space.
The present invention further provides an electronic device, including:
a processor and a memory, the memory being configured to store a computer program, and the processor being configured to invoke and run the computer program stored in the memory to execute the gesture operation method described in the foregoing embodiments.
The present invention further provides a computer-readable storage medium for storing a computer program that causes a computer to execute the gesture operation method described in the foregoing embodiments.
The present invention further provides a computer program product containing program instructions that, when run on an electronic device, cause the electronic device to execute the gesture operation method described in the foregoing embodiments.
As can be seen from the above technical solutions, the gesture operation method, apparatus, device, and medium provided by the present invention acquire depth information of a user's hand; determine, according to the depth information, the spatial coordinates of a virtual hand corresponding to the hand in a virtual space; bind tracking balls to the virtual hand according to the spatial coordinates, with a palm ball bound at the palm position and fingertip balls bound at the fingertip positions, the palm ball being larger in volume than the fingertip balls; and then execute a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball. Because this manner of gesture operation introduces small balls that move with the hand to achieve bare-hand operation, it offers higher stability and lower precision requirements, thereby reducing labor and financial cost; moreover, the lower precision requirement makes click operations easier to perform, greatly improving the user's interactive experience.
BRIEF DESCRIPTION OF THE DRAWINGS
Other objects and results of the present invention will become more apparent and easier to understand from the following description taken in conjunction with the accompanying drawings, and with a fuller understanding of the present invention. In the drawings:
FIG. 1 is a flowchart of a gesture operation method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of binding tracking balls in the gesture operation method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the thumb ball and the index-finger cuboid in the gesture operation method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the virtual ray in the gesture operation method according to an embodiment of the present invention;
FIG. 5 is a schematic block diagram of a gesture operation apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of an electronic device according to an embodiment of the present invention.
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "include" and "have", and any variations thereof, are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or server that includes a series of steps or units is not necessarily limited to those steps or units expressly listed, but may include other steps or units not expressly listed or inherent to the process, method, product, or device.
Most existing gesture interactions in such scenarios are the single-finger index "click" gesture, the thumb-and-index "pinch" gesture, the fist "confirm" gesture, and so on. However, interaction that relies purely on an index-finger "click", a thumb-and-index "pinch", or a fist "confirm" places high precision requirements on gesture recognition, which inevitably entails a large investment of labor and money; moreover, when collecting the hand's virtual coordinates, the high precision requirement often makes the localization of key hand joints unstable, resulting in relatively poor interaction accuracy and user experience.
In view of the above problems, the present invention provides a gesture operation method, apparatus, device, and medium. Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
To illustrate the gesture operation method, apparatus, device, and medium provided by the present invention, FIG. 1, FIG. 2, FIG. 3, and FIG. 4 exemplarily show the gesture operation method of an embodiment of the present invention; FIG. 5 exemplarily shows the gesture operation apparatus of an embodiment of the present invention; and FIG. 6 exemplarily shows the electronic device of an embodiment of the present invention.
The following description of the exemplary embodiments is merely illustrative and is in no way intended to limit the present invention or its application or use. Techniques and devices known to persons of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques and devices should be regarded as part of the specification.
As shown in FIG. 1, the gesture operation method provided by the present invention includes:
S1: acquiring depth information of a user's hand;
S2: determining, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space;
S3: binding tracking balls to the virtual hand according to the spatial coordinates, where a palm ball is bound at the palm position of the virtual hand and fingertip balls are bound at the fingertip positions, the volume of the palm ball being larger than that of the fingertip balls;
S4: executing a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
It should be noted that, in the embodiments of the present application, determining that the palm ball is larger than a fingertip ball can be based not only on volume but also on the diameter or radius of the balls, i.e., the diameter of the palm ball is larger than that of the fingertip ball, or the radius of the palm ball is larger than that of the fingertip ball; the present application places no specific limitation on this.
As shown in FIG. 1, in step S1 the depth information of the user's hand may be acquired through a depth camera or any other camera capable of acquiring depth information; the present application places no specific limitation on this.
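The application does not specify how depth samples become 3D hand positions. For illustration only, a common depth-camera pipeline back-projects each pixel through pinhole intrinsics; the intrinsics `fx, fy, cx, cy` below are hypothetical values that would come from the actual camera's calibration:

```python
def depth_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth `depth_m` (metres) into a
    3D point in the camera frame, using hypothetical pinhole intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```

For example, a hand pixel at the principal point maps straight onto the optical axis: `depth_to_point(320, 240, 1.0, 600.0, 600.0, 320.0, 240.0)` gives `(0.0, 0.0, 1.0)`.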
In this embodiment, step S2 of determining, according to the depth information of the hand, the spatial coordinates of the virtual hand corresponding to the hand in the virtual space may be implemented through the following steps:
S11: acquiring the relative distance between the user's hand and a sensor;
S12: acquiring the real wrist position of the hand according to the position of the sensor and the relative distance;
S13: mapping the real wrist position into the virtual space with reference to the virtual coordinates of the sensor to form wrist spatial coordinates;
S14: performing calculation and filling according to the wrist spatial coordinates and hand joint information to form the virtual hand, and acquiring the spatial coordinates of the virtual hand.
Here, the sensor refers to a sensor in the VR system.
Specifically, the present application may first acquire, through a depth camera or another type of camera, the positional relationship between the hand and the sensor, so as to determine the relative distance between the two. Because both the real position and the virtual coordinates of the sensor are known, the real wrist position of the user's hand can be deduced from the sensor's real position and the relative distance; then, based on the known virtual coordinates of the sensor, the real wrist position can be mapped into the virtual space, thereby obtaining the wrist spatial coordinates. Moreover, since the size of the hand and the positional relationships among its joints (the hand joint information) are known, the coordinates of each hand joint in the virtual space can be calculated and derived from the wrist spatial coordinates, so that the virtual hand can be filled in and formed; the spatial coordinates of the entire virtual hand within the VR system are then acquired.
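Steps S11 to S14 amount to a change of reference frame anchored at the sensor. A minimal sketch, assuming (as the text implies but does not state) that the real and virtual frames share scale and orientation; the joint-offset table stands in for the "hand joint information" and its values are hypothetical:

```python
def wrist_to_virtual(sensor_real, sensor_virtual, wrist_real):
    """Map the real wrist position into virtual space using the sensor's
    known real and virtual coordinates as the shared reference point."""
    return tuple(sv + (wr - sr)
                 for sv, sr, wr in zip(sensor_virtual, sensor_real, wrist_real))

def fill_virtual_hand(wrist_virtual, joint_offsets):
    """Fill in the virtual hand: each joint is placed at a known
    wrist-relative offset (step S14's calculation-and-filling)."""
    return {name: tuple(w + o for w, o in zip(wrist_virtual, off))
            for name, off in joint_offsets.items()}
```

With the sensor at the real origin and at `(1, 2, 3)` in virtual space, a wrist 0.5 m to its right lands at `(1.5, 2.0, 3.0)`, and every joint is then derived from that anchor.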
As shown together in FIG. 1 and FIG. 2, step S3 is the process of binding tracking balls to the virtual hand. Binding the palm ball (the large ball) at the palm position of the virtual hand shown in FIG. 2 and the fingertip balls (the small balls) at the fingertip positions includes:
S21: acquiring the virtual palm position and the virtual fingertip positions of the virtual hand;
S22: placing the palm ball at the virtual palm position and the fingertip balls at the virtual fingertip positions,
where the palm ball always moves with the coordinates of the virtual palm position, and each fingertip ball always moves with the coordinates of its virtual fingertip position.
In this way, by judging the distance between the palm ball and a fingertip ball, the distance between the fingertip and the palm can be judged, and thus the degree of finger bending, in order to identify whether the whole hand has made a fist. In this embodiment, the fingertip balls include a little-finger fingertip ball, a ring-finger fingertip ball, and a middle-finger fingertip ball; that is, the positional relationships of the little, ring, and middle fingers relative to the palm are judged through the fingertip balls, so as to determine whether the user has made a fist. This manner of judging the state of the user's hand by binding small balls solves the high precision requirement of traditional bare-hand operation, improves the stability of virtual-hand localization, and lowers the accuracy requirement, thereby improving the user's immersive experience.
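The fist test described above reduces to three point-to-point distance checks. A sketch, where the 0.04 m default threshold is a made-up value (the application leaves the trigger threshold to be preset per system):

```python
import math

def is_fist(palm_ball, fingertip_balls, threshold=0.04):
    """True when every tracked fingertip ball (little, ring, and middle
    finger) lies within `threshold` of the palm ball centre, i.e. the
    fingers have closed toward the palm in a fist-making motion."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return all(dist(tip, palm_ball) < threshold for tip in fingertip_balls)
```

Because the balls follow their bound joints every frame, this check can simply be re-evaluated per frame without any per-joint precision tuning.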
As shown together in FIG. 1 and FIG. 2, step S4 executes the corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball. Exemplarily, in this embodiment the implementation of S4 includes:
S31: acquiring the fingertip-ball-to-palm-ball straight-line distance, where this distance represents the straight-line distance formed between a fingertip ball and the palm ball when the fingers other than the thumb close toward the palm in a fist-making motion;
S32: executing the corresponding operation in the virtual space according to the fingertip-ball-to-palm-ball straight-line distance.
It should be noted that the fingertip balls and the palm ball are virtual balls, which may be colored or colorless. For ease of illustration, the palm ball and fingertip balls are shown in FIG. 2, but in this embodiment they are colorless, transparent virtual spheres invisible to the naked eye, which move with the bound part of the hand, so as to increase the stability of the virtual hand's coordinate information and thereby ensure the accuracy of judging the virtual hand's pose.
Optionally, the present application acquires the fingertip-ball-to-palm-ball straight-line distance in real time and compares it with a preset trigger threshold. If the distance is smaller than the preset trigger threshold, the user needs to perform an interactive operation, and the first trigger condition is satisfied. When the first trigger condition is satisfied, the present application can automatically and immediately execute any operation corresponding to the first trigger condition, i.e., execute the corresponding operation in the virtual space. In this way, the response speed of bare-hand interaction can be improved.
The preset trigger threshold can be set flexibly according to actual application needs and is not specifically limited here.
As shown together in FIG. 1 and FIG. 3, if the user makes a fist once, the first trigger condition is met once and is accordingly satisfied; at this point the VR system automatically executes the interactive operation corresponding to the first trigger condition. The correspondence between the interactive operation and the first trigger condition is preset in advance, and the specific presetting process is not described in detail here. The first trigger condition can correspond to any operation of an interactive nature, for example entering or exiting a page, clicking any cursor or icon on the interface of the display in the VR system, or even performing power-on and power-off operations, where power-on and power-off include, but are not limited to, turning the display on or off. That is, when the user makes a fist, the VR system can automatically perform any of the above interactive operations, such as entering or exiting a page, clicking any cursor or icon on the display interface, or turning the display on or off.
In addition, as shown together in FIG. 1 and FIG. 3, the gesture operation method of the embodiment of the present invention further includes: S5: binding a thumb ball at the thumb position of the virtual hand and binding an index-finger cuboid at the index-finger position of the virtual hand.
Optionally, the process in which the present application binds the thumb ball (the oblate shape on the thumb in FIG. 3) at the thumb position of the virtual hand and the index-finger cuboid at the index-finger position includes:
S511: acquiring the virtual thumb position and the virtual index-finger position;
S512: placing the thumb ball at the virtual thumb position and the index-finger cuboid at the virtual index-finger position,
where the thumb ball always moves with the coordinates of the virtual thumb position, and the index-finger cuboid always moves with the coordinates of the virtual index-finger position.
It should be noted that the index-finger cuboid is not a traditional cuboid in the strict sense, but rather, as shown in FIG. 3, a roughly cuboid marker wrapping the index finger, which may be slightly flatter or slightly more irregular than a cuboid; the present application places no specific limitation on this.
In the embodiment shown together in FIG. 1 and FIG. 3, the gesture operation method of the present application further includes:
S521: acquiring the thumb-ball-to-index-cuboid spacing, where this spacing represents the gap formed between the thumb ball and the index-finger cuboid when the thumb closes toward the index finger in a pinching motion;
S522: triggering, according to the thumb-ball-to-index-cuboid spacing, an operation cursor at the position corresponding to the virtual hand so as to execute a corresponding operation in the virtual space.
Specifically, triggering the operation cursor at the position corresponding to the virtual hand according to the thumb-ball-to-index-cuboid spacing is implemented as follows: if the spacing is smaller than a preset pinch threshold, the second trigger condition is satisfied; then, according to the second trigger condition, the operation cursor at the position corresponding to the virtual hand on the display of the VR system is triggered to perform a VR interactive operation. If the spacing is greater than or equal to the preset pinch threshold, the second trigger condition is not satisfied, and S521 continues to be executed. The preset pinch threshold can be set flexibly according to actual application needs and is not specifically limited here.
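One way to realize the spacing test of S521/S522 is to model the index-finger cuboid as an axis-aligned box and take the shortest distance from the thumb-ball centre to it. Both the axis-aligned box model and the 0.015 m default threshold are illustrative assumptions, not values taken from the application:

```python
import math

def point_to_box_distance(p, box_min, box_max):
    """Shortest distance from point `p` to an axis-aligned box;
    0 when the point lies inside the box."""
    clamped = [min(max(pc, lo), hi) for pc, lo, hi in zip(p, box_min, box_max)]
    return math.sqrt(sum((pc - cc) ** 2 for pc, cc in zip(p, clamped)))

def second_trigger(thumb_ball, box_min, box_max, pinch_threshold=0.015):
    """Second trigger condition: the thumb ball has come within the preset
    pinch threshold of the index-finger cuboid."""
    return point_to_box_distance(thumb_ball, box_min, box_max) < pinch_threshold
```

When the condition fails, the system simply keeps polling the spacing, which matches the "continue executing S521" loop in the text.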
In the embodiment shown together in FIG. 1, FIG. 3, and FIG. 4, the thumb ball is bound at the thumb position of the virtual hand and the index-finger cuboid at the index-finger position, so that the thumb-ball-to-index-cuboid spacing can be acquired; when the user has a need, the second trigger condition is triggered by pinching the index finger with the thumb. That is, the user performs a thumb-and-index pinching motion so that the thumb closes toward the index finger to form the thumb-ball-to-index-cuboid spacing; if this spacing is smaller than the preset pinch threshold, the second trigger condition is satisfied, and once satisfied, the interactive operation corresponding to the second trigger condition is started directly and automatically. In this embodiment, the interactive operation corresponding to the second trigger condition may be any operation of a "click" or "press" nature, which is not specifically limited here.
In the embodiment shown together in FIG. 1, FIG. 3, and FIG. 4, a virtual ray is determined by the user's joints and a specific position of the hand; in practical applications the ray may be colored or colorless. For ease of illustration, the virtual ray is shown in FIG. 4 as a shaped, colored line starting from the hand and ending at the display, but in this embodiment the virtual ray intersects the display in the VR system, the intersection being where the display's virtual cursor is located. When the user moves the hand, the virtual ray moves with it, so the user moves the virtual cursor on the display by moving the hand, i.e., selects which position on the display to click by moving the hand. If the virtual cursor is moved to the position the user wants to click, and the second trigger condition is triggered by pinching the index finger with the thumb so that it is satisfied, the virtual cursor clicks any clickable page button on the display, such as an APP icon on the page, a "confirm" button, or a "cancel" button; what follows the click is not described in detail here. In this embodiment, after the second trigger condition is satisfied and content on the display interface is clicked, the click takes effect the moment the user releases the thumb. Therefore, the second trigger condition can also correspond to the interactive operation of dragging or sliding content on the display interface: the user performs a pinching motion and the virtual cursor clicks content on the display; the user can then move the whole hand so that the virtual cursor drags the clicked content along with the hand's movement; after the clicked content has been moved to the target position, the user lifts the thumb to end the pinching motion, whereupon the drag or slide operation takes effect.
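The cursor placement described here is a standard ray-plane intersection. A sketch that treats the display as an infinite plane; the ray origin and direction would in practice be derived from the tracked joint and hand positions, which the application does not specify in detail:

```python
def ray_plane_cursor(origin, direction, plane_point, plane_normal):
    """Intersect the hand's virtual ray with the display plane and return
    the cursor position, or None when the ray is parallel to the display
    or points away from it."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the display plane
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:
        return None  # display is behind the hand
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Moving the hand changes `origin` and `direction`, so the returned intersection point, i.e. the virtual cursor, follows the hand, and a pinch (the second trigger condition) clicks whatever lies under it.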
It should be noted that the interactive operations corresponding to the first trigger condition and the second trigger condition can both be set in advance, and the object of the interactive operation can be any target button or interface in the VR system that supports bare-hand operation; more detailed operation details are not described here.
As described above, the gesture operation method provided by the present invention acquires depth information of a user's hand; determines, according to the depth information, the spatial coordinates of a virtual hand corresponding to the hand in a virtual space; binds tracking balls to the virtual hand according to the spatial coordinates, with a palm ball bound at the palm position and fingertip balls bound at the fingertip positions, the palm ball being larger in volume than the fingertip balls; and then executes a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball. Because this manner of gesture operation introduces small balls that move with the hand to achieve bare-hand operation, it offers higher stability and lower precision requirements, thereby reducing labor and financial cost; moreover, the lower precision requirement makes click operations easier to perform, greatly improving the user's interactive experience.
As shown in FIG. 5, the present invention further provides a gesture operation apparatus 100 implementing the gesture operation method described above, including:
an information acquisition module 101, configured to acquire depth information of a user's hand;
a coordinate mapping module 102, configured to determine, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space;
a tracking and binding module 103, configured to bind tracking balls to the virtual hand according to the spatial coordinates, where a palm ball is bound at the palm position of the virtual hand and fingertip balls are bound at the fingertip positions of the virtual hand, the volume of the palm ball being larger than that of the fingertip balls; and
an interaction execution module 104, configured to execute a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball.
In the embodiment shown in FIG. 5, the coordinate mapping module 102 includes:
a real-position calculation unit 102-1, configured to acquire the relative distance between the hand and a sensor, and to acquire the real wrist position of the hand according to the position of the sensor and the relative distance; and
a virtual-coordinate mapping unit 102-2, configured to map the real wrist position into the virtual space with reference to the virtual coordinates of the sensor to form wrist spatial coordinates, to perform calculation and filling according to the wrist spatial coordinates and hand joint information to form the virtual hand, and to acquire the spatial coordinates of the virtual hand.
In addition, the gesture operation apparatus 100 provided by the present invention further includes a pinch operation unit 105 (not shown in the figure),
configured to acquire the thumb-ball-to-index-cuboid spacing, where the spacing represents the gap formed between the thumb ball and the index-finger cuboid when the thumb closes toward the index finger in a pinching motion, and to trigger, according to the thumb-ball-to-index-cuboid spacing, an operation cursor at the position corresponding to the virtual hand so as to execute a corresponding operation in the virtual space.
In this way, different functions are set respectively according to the fingertip-ball-to-palm-ball straight-line distance and the thumb-ball-to-index-cuboid spacing, so that the user can achieve two types of interaction with a single hand. The specific interactive operations are not mapped specifically here and can be determined according to user needs, so as to improve the enjoyment and stability of the user's bare-hand operation.
As can be seen from the above embodiments, the gesture operation apparatus provided by the present invention acquires depth information of a user's hand; determines, according to the depth information, the spatial coordinates of a virtual hand corresponding to the hand in a virtual space; binds tracking balls to the virtual hand according to the spatial coordinates, with a palm ball bound at the palm position and fingertip balls bound at the fingertip positions, the palm ball being larger in volume than the fingertip balls; and then executes a corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball. Because this manner of gesture operation introduces small balls that move with the hand to achieve bare-hand operation, it offers higher stability and lower precision requirements, thereby reducing labor and financial cost; moreover, the lower precision requirement makes click operations easier to perform, greatly improving the user's interactive experience.
It should be understood that the apparatus embodiments may correspond to the foregoing method embodiments, and similar descriptions may refer to the method embodiments. To avoid repetition, details are not repeated here.
FIG. 6 is a schematic block diagram of an electronic device according to an embodiment of the present invention. As shown in FIG. 6, the electronic device 200 may include:
a memory 210 and a processor 220, where the memory 210 is configured to store a computer program and transmit the program code to the processor 220. In other words, the processor 220 can invoke and run the computer program from the memory 210 to implement the gesture operation method in the embodiments of the present application.
For example, the processor 220 may be configured to execute the above gesture operation method embodiments according to instructions in the computer program.
In some embodiments of the present application, the processor 220 may include, but is not limited to:
a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on.
In some embodiments of the present application, the memory 210 includes, but is not limited to:
volatile memory and/or non-volatile memory. The non-volatile memory may be read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), or flash memory. The volatile memory may be random access memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
In some embodiments of the present application, the computer program can be divided into one or more modules, which are stored in the memory 210 and executed by the processor 220 to complete the gesture operation method provided by the present application. The one or more modules may be a series of computer program instruction segments capable of accomplishing specific functions, and the instruction segments are used to describe the execution process of the computer program in the electronic device.
As shown in FIG. 6, the electronic device may further include:
a transceiver 230, which can be connected to the processor 220 or the memory 210.
The processor 220 can control the transceiver 230 to communicate with other devices; specifically, it can send information or data to other devices, or receive information or data sent by other devices. The transceiver 230 may include a transmitter and a receiver, and may further include one or more antennas.
It should be understood that the components of the electronic device are connected through a bus system, which includes a power bus, a control bus, and a status signal bus in addition to a data bus.
The present application further provides a computer storage medium on which a computer program is stored; when the computer program is executed by a computer, the computer can perform the methods of the above method embodiments.
An embodiment of the present application further provides a computer program product containing program instructions that, when run on an electronic device, cause the electronic device to perform the methods of the above method embodiments.
When implemented in software, the methods can be implemented in whole or in part in the form of a computer program product, which includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.
Persons of ordinary skill in the art will appreciate that the modules and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementation should not be regarded as exceeding the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division into modules is only a division by logical function, and in actual implementation there may be other ways of division, e.g., multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection between apparatuses or modules may be electrical, mechanical, or in other forms.
Modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist physically alone, or two or more modules may be integrated into one module.
The gesture operation method, apparatus, device, and medium according to the present invention have been described above by way of example with reference to the accompanying drawings. However, those skilled in the art should understand that various improvements can still be made to the gesture operation method, apparatus, device, and medium proposed above without departing from the content of the present invention. Therefore, the protection scope of the present invention should be determined by the content of the appended claims.

Claims (11)

  1. A gesture operation method, characterized by comprising:
    acquiring depth information of a user's hand;
    determining, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space;
    binding tracking balls to the virtual hand according to the spatial coordinates, wherein a palm ball is bound at a palm position of the virtual hand and fingertip balls are bound at fingertip positions of the virtual hand, a volume of the palm ball being larger than that of the fingertip balls; and
    executing a corresponding operation in the virtual space according to a straight-line distance between the fingertip balls and the palm ball.
  2. The gesture operation method according to claim 1, characterized in that determining, according to the depth information of the hand, the spatial coordinates of the virtual hand corresponding to the hand in the virtual space comprises:
    acquiring a relative distance between the hand and a sensor;
    acquiring a real wrist position of the hand according to a position of the sensor and the relative distance;
    mapping the real wrist position into the virtual space with reference to virtual coordinates of the sensor to form wrist spatial coordinates; and
    performing calculation and filling according to the wrist spatial coordinates and hand joint information to form the virtual hand, and acquiring the spatial coordinates of the virtual hand.
  3. The gesture operation method according to claim 1, characterized in that binding the palm ball at the palm position of the virtual hand and the fingertip balls at the fingertip positions of the virtual hand comprises:
    acquiring a virtual palm position and virtual fingertip positions of the virtual hand; and
    placing the palm ball at the virtual palm position and the fingertip balls at the virtual fingertip positions,
    wherein the palm ball always moves with the virtual palm position, and
    the fingertip balls always move with the virtual fingertip positions.
  4. The gesture operation method according to claim 1, characterized in that
    the fingertip balls comprise a little-finger fingertip ball, a ring-finger fingertip ball, and a middle-finger fingertip ball.
  5. The gesture operation method according to claim 4, characterized in that executing the corresponding operation in the virtual space according to the straight-line distance between the fingertip balls and the palm ball comprises:
    acquiring a fingertip-ball-to-palm-ball straight-line distance, wherein the straight-line distance represents the distance formed between a fingertip ball and the palm ball when the fingers other than the thumb close toward the palm in a fist-making motion; and
    executing the corresponding operation in the virtual space according to the fingertip-ball-to-palm-ball straight-line distance.
  6. The gesture operation method according to claim 1, characterized by further comprising: binding a thumb ball at a thumb position of the virtual hand and binding an index-finger cuboid at an index-finger position of the virtual hand, wherein
    binding the thumb ball at the thumb position of the virtual hand and the index-finger cuboid at the index-finger position of the virtual hand comprises:
    acquiring a virtual thumb position and a virtual index-finger position; and
    placing the thumb ball at the virtual thumb position and the index-finger cuboid at the virtual index-finger position,
    wherein the thumb ball always moves with the virtual thumb position, and
    the index-finger cuboid always moves with the virtual index-finger position.
  7. The gesture operation method according to claim 6, characterized by further comprising:
    acquiring a thumb-ball-to-index-cuboid spacing, wherein the thumb-ball-to-index-cuboid spacing represents the gap formed between the thumb ball and the index-finger cuboid when the thumb closes toward the index finger in a pinching motion; and
    triggering, according to the thumb-ball-to-index-cuboid spacing, an operation cursor at a position corresponding to the virtual hand so as to execute a corresponding operation in the virtual space.
  8. A gesture operation apparatus implementing the gesture operation method according to any one of claims 1-7, comprising:
    an information acquisition module, configured to acquire depth information of a user's hand;
    a coordinate mapping module, configured to determine, according to the depth information of the hand, spatial coordinates of a virtual hand corresponding to the hand in a virtual space;
    a tracking and binding module, configured to bind tracking balls to the virtual hand according to the spatial coordinates, wherein a palm ball is bound at a palm position of the virtual hand and fingertip balls are bound at fingertip positions of the virtual hand, a volume of the palm ball being larger than that of the fingertip balls; and
    an interaction execution module, configured to execute a corresponding operation in the virtual space according to a straight-line distance between the fingertip balls and the palm ball.
  9. An electronic device, characterized by comprising:
    a processor and a memory, wherein the memory is configured to store a computer program, and the processor is configured to invoke and run the computer program stored in the memory to execute the gesture operation method according to any one of claims 1-7.
  10. A computer-readable storage medium, characterized by being configured to store a computer program that causes a computer to execute the gesture operation method according to any one of claims 1-7.
  11. A computer program product containing program instructions, characterized in that, when the program instructions are run on an electronic device, they cause the electronic device to execute the gesture operation method according to any one of claims 1-7.
PCT/CN2022/105375 2021-08-12 2022-07-13 手势操作方法、装置、设备和介质 WO2023016174A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/088,213 US11803248B2 (en) 2021-08-12 2022-12-23 Gesture operation method, apparatus, device and medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110926646.2 2021-08-12
CN202110926646.2A CN113608619A (zh) 2021-08-12 2021-08-12 增强现实中的裸手操作方法、系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/088,213 Continuation US11803248B2 (en) 2021-08-12 2022-12-23 Gesture operation method, apparatus, device and medium

Publications (1)

Publication Number Publication Date
WO2023016174A1 true WO2023016174A1 (zh) 2023-02-16

Family

ID=78340552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/105375 WO2023016174A1 (zh) 2021-08-12 2022-07-13 手势操作方法、装置、设备和介质

Country Status (3)

Country Link
US (1) US11803248B2 (zh)
CN (1) CN113608619A (zh)
WO (1) WO2023016174A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117021117A (zh) * 2023-10-08 2023-11-10 电子科技大学 一种基于混合现实的移动机器人人机交互与定位方法

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113608619A (zh) 2021-08-12 2021-11-05 青岛小鸟看看科技有限公司 增强现实中的裸手操作方法、系统
US11995780B2 (en) 2022-09-09 2024-05-28 Snap Inc. Shooting interaction using augmented reality content in a messaging system
US20240087246A1 (en) * 2022-09-09 2024-03-14 Snap Inc. Trigger gesture for selection of augmented reality content in messaging systems

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170140552A1 (en) * 2014-06-25 2017-05-18 Korea Advanced Institute Of Science And Technology Apparatus and method for estimating hand position utilizing head mounted color depth camera, and bare hand interaction system using same
CN108334198A (zh) * 2018-02-09 2018-07-27 华南理工大学 基于增强现实的虚拟雕塑方法
CN109976519A (zh) * 2019-03-14 2019-07-05 浙江工业大学 一种基于增强现实的交互显示装置及其交互显示方法
CN112000224A (zh) * 2020-08-24 2020-11-27 北京华捷艾米科技有限公司 一种手势交互方法及系统
CN113608619A (zh) * 2021-08-12 2021-11-05 青岛小鸟看看科技有限公司 增强现实中的裸手操作方法、系统

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8232990B2 (en) * 2010-01-05 2012-07-31 Apple Inc. Working with 3D objects
US9182596B2 (en) * 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US20120194549A1 (en) * 2010-02-28 2012-08-02 Osterhout Group, Inc. Ar glasses specific user interface based on a connected external device type
US9152306B2 (en) * 2011-03-29 2015-10-06 Intel Corporation Techniques for touch and non-touch user interaction input
KR102471447B1 (ko) * 2016-02-03 2022-11-28 엘지전자 주식회사 미러형 디스플레이 장치 및 그 제어방법
KR101826911B1 (ko) * 2017-05-31 2018-02-07 주식회사 네비웍스 햅틱 인터랙션 기반 가상현실시뮬레이터 및 그 동작 방법
US10586434B1 (en) * 2017-10-25 2020-03-10 Amazon Technologies, Inc. Preventing unauthorized access to audio/video recording and communication devices
KR102269414B1 (ko) * 2019-03-07 2021-06-24 재단법인 실감교류인체감응솔루션연구단 핸드 모션 캡쳐 장치를 기반으로 가상/증강 현실에서의 객체 조작 방법 및 장치
US11992934B2 (en) * 2021-01-13 2024-05-28 MediVis, Inc. Stereo video in augmented reality

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170140552A1 (en) * 2014-06-25 2017-05-18 Korea Advanced Institute Of Science And Technology Apparatus and method for estimating hand position utilizing head mounted color depth camera, and bare hand interaction system using same
CN108334198A (zh) * 2018-02-09 2018-07-27 华南理工大学 基于增强现实的虚拟雕塑方法
CN109976519A (zh) * 2019-03-14 2019-07-05 浙江工业大学 一种基于增强现实的交互显示装置及其交互显示方法
CN112000224A (zh) * 2020-08-24 2020-11-27 北京华捷艾米科技有限公司 一种手势交互方法及系统
CN113608619A (zh) * 2021-08-12 2021-11-05 青岛小鸟看看科技有限公司 增强现实中的裸手操作方法、系统

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117021117A (zh) * 2023-10-08 2023-11-10 电子科技大学 一种基于混合现实的移动机器人人机交互与定位方法
CN117021117B (zh) * 2023-10-08 2023-12-15 电子科技大学 一种基于混合现实的移动机器人人机交互与定位方法

Also Published As

Publication number Publication date
US20230125393A1 (en) 2023-04-27
US11803248B2 (en) 2023-10-31
CN113608619A (zh) 2021-11-05

Similar Documents

Publication Publication Date Title
WO2023016174A1 (zh) 手势操作方法、装置、设备和介质
JP5698733B2 (ja) 三空間入力の検出、表現、および解釈:自由空間、近接、および表面接触モードを組み込むジェスチャ連続体
EP3129871B1 (en) Generating a screenshot
US10592050B2 (en) Systems and methods for using hover information to predict touch locations and reduce or eliminate touchdown latency
Kim et al. Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality
US11483376B2 (en) Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace
Qian et al. Portal-ble: Intuitive free-hand manipulation in unbounded smartphone-based augmented reality
US7427980B1 (en) Game controller spatial detection
CN108616712B (zh) 一种基于摄像头的界面操作方法、装置、设备及存储介质
US20210051374A1 (en) Video file playing method and apparatus, and storage medium
US10528145B1 (en) Systems and methods involving gesture based user interaction, user interface and/or other features
KR102021851B1 (ko) 가상현실 환경에서의 사용자와 객체 간 상호 작용 처리 방법
US11630633B1 (en) Collaborative system between a streamer and a remote collaborator
US20170131785A1 (en) Method and apparatus for providing interface interacting with user by means of nui device
CN110493125A (zh) 即时通信方法、设备及计算机可读存储介质
US20240134461A1 (en) Gesture interaction method and system based on artificial reality
CN108874141B (zh) 一种体感浏览方法和装置
JP2021517302A (ja) ネットワーク化された共同ワークスペースにおけるウェブ・ソケット接続を介したファイルの送信のための方法、装置、及びコンピュータ可読媒体
WO2023174097A1 (zh) 交互方法、装置、设备及计算机可读存储介质
Chen Immersive Analytics Interaction: User Preferences and Agreements by Task Type
Nguyen Integrating in-hand physical objects in mixed reality interactions
Dzhoroev et al. Comparison of Face Tracking and Eye Tracking for Scrolling a Web Browser on Mobile Devices
CN117170488A (zh) 交互方法、装置、设备、存储介质和程序产品
US20190325657A1 (en) Operating method and device applicable to space system, and storage medium
CN117170489A (zh) 交互方法、装置、设备、存储介质和程序产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22855157

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE