CN115657844A - Method and device for interacting with head-mounted display equipment and interaction system

Method and device for interacting with head-mounted display equipment and interaction system

Info

Publication number
CN115657844A
CN115657844A (Application No. CN202211242490.7A)
Authority
CN
China
Prior art keywords
value
virtual
coordinate system
determining
joystick
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211242490.7A
Other languages
Chinese (zh)
Inventor
吴维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shining Reality Wuxi Technology Co Ltd
Original Assignee
Shining Reality Wuxi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shining Reality Wuxi Technology Co Ltd filed Critical Shining Reality Wuxi Technology Co Ltd
Priority to CN202211242490.7A
Publication of CN115657844A
Legal status: Pending

Landscapes

  • Position Input By Displaying (AREA)

Abstract

Embodiments of the present disclosure disclose a method, an apparatus, and an interaction system for interacting with a head-mounted display device. The method includes: activating a joystick mode in response to detecting that a user wearing the head-mounted display device has performed a preset joystick mode gesture action; in a state where the joystick mode is activated, determining an origin position of a virtual joystick coordinate system based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device, and determining the virtual joystick coordinate system based on the origin position; and determining an input key value of the virtual joystick based on a projection of a target hand of the user on the virtual joystick coordinate system. Embodiments of the present disclosure enable joystick operation of a head-mounted display device without a real joystick being equipped, improving user experience.

Description

Method and device for interacting with head-mounted display equipment and interaction system
Technical Field
The present disclosure relates to the field of virtual reality technologies, and in particular, to a method, an apparatus, and an interaction system for interacting with a head-mounted display device.
Background
In scenarios such as Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), or Extended Reality (XR), a terminal provides an interactive, immersive experience to the user by constructing a virtual environment.
In some scenarios involving interaction with a head-mounted display device, it is often desirable to interact through a joystick, but some head-mounted display devices are not equipped with one. How to realize joystick operation of a head-mounted display device when no joystick is equipped is a problem to be solved urgently.
Disclosure of Invention
The present disclosure is proposed to solve the above technical problems. Embodiments of the present disclosure provide a method, apparatus, and interaction system for interacting with a head-mounted display device.
According to a first aspect of embodiments of the present disclosure, there is provided a method for interacting with a head-mounted display device, comprising:
activating a joystick mode in response to detecting that a user wearing the head-mounted display device has performed a preset joystick mode gesture action;
determining an origin position of the virtual joystick coordinate system based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device in a state where the joystick mode is activated, and determining the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system;
the input key value of the virtual joystick is determined based on the projection of the target hand of the user on the virtual joystick coordinate system.
According to a second aspect of embodiments of the present disclosure, there is provided an apparatus for interacting with a head-mounted display device, comprising:
the joystick mode activation module is used for activating the joystick mode when it is detected that the user has performed a preset joystick mode gesture action;
a virtual joystick coordinate system determination module for determining an origin position of a virtual joystick coordinate system based on at least one of a hand position of a user and a position of a display screen of the head-mounted display device in a state where the joystick mode is activated, and determining the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system;
and the input key value determining module is used for determining the input key value of the virtual joystick based on the projection of the target hand of the user on the coordinate system of the virtual joystick.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for executing the method for interacting with a head mounted display device of the first aspect.
According to a fourth aspect of an embodiment of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
a processor for reading executable instructions from the memory and executing the instructions to implement the method for interacting with a head mounted display device of the first aspect described above.
The technical solution of the present disclosure is further described in detail by the accompanying drawings and embodiments.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description of the embodiments of the present disclosure when taken in conjunction with the accompanying drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure and not to limit the disclosure. In the drawings, like reference numbers generally indicate like parts or steps.
FIG. 1 is an exemplary system architecture diagram of an embodiment of a method or apparatus for interacting with a head mounted display device to which the present disclosure may be applied;
FIG. 2 is a flow diagram of a method for interacting with a head mounted display device in one embodiment of the present disclosure;
FIG. 3 is a schematic flow chart of step S4 in one embodiment of the present disclosure;
FIG. 4 is a schematic flow chart of step S4 in another embodiment of the present disclosure;
FIG. 5 is a schematic flow chart of step S6 in one embodiment of the present disclosure;
FIG. 6 is a schematic illustration of normalization processing based on input values in one example of the disclosure;
FIG. 7 is a schematic diagram of gestural actions controlling a virtual handle or virtual joystick in one example of the present disclosure;
FIG. 8 is a block diagram of an apparatus for interacting with a head mounted display device in one embodiment of the present disclosure;
fig. 9 is a block diagram of an electronic device in one embodiment of the present disclosure.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of the embodiments of the present disclosure and not all embodiments of the present disclosure, with the understanding that the present disclosure is not limited to the example embodiments described herein.
It should be noted that: the relative arrangement of parts and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
It will be understood by those of skill in the art that the terms "first," "second," and the like in the embodiments of the present disclosure are used merely to distinguish one element from another, and are not intended to imply any particular technical meaning, nor is the necessary logical order between them.
It is also understood that in embodiments of the present disclosure, "a plurality" may refer to two or more than two, and "at least one" may refer to one, two or more than two.
It is also to be understood that any reference to any component, data, or structure in the embodiments of the present disclosure may be generally understood as one or more, unless explicitly defined otherwise or indicated to the contrary hereinafter.
In addition, the term "and/or" in the present disclosure is only one kind of association relationship describing the association object, and indicates that three relationships may exist, for example, a and/or B, may indicate: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the former and latter associated objects are in an "or" relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and the same or similar parts may be referred to each other, so that the descriptions thereof are omitted for brevity.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
The disclosed embodiments may be applied to electronic devices such as terminal devices, computer systems, and servers, which are operational with numerous other general-purpose or special-purpose computing system environments or configurations. Examples of well-known terminal devices, computing systems, environments, and/or configurations suitable for use with electronic devices such as terminal devices, computer systems, and servers include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
FIG. 1 is an exemplary system architecture diagram of an embodiment of a method or apparatus for interacting with a head mounted display device to which the present disclosure may be applied.
As shown in fig. 1, the system architecture may include a head mounted display device 1, a network 2, and a server 3. The network 2 may be the medium between the head mounted display device 1 and the server 3 for the communication link. The network 2 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The head-mounted display device 1 may be an electronic device with an image display function, including but not limited to AR smart glasses, VR smart glasses, XR smart glasses, and the like. Because head-mounted display devices such as AR smart glasses, VR smart glasses, and XR smart glasses move together with the user, the image display process generally involves computations related to positioning the head-mounted display device. The head-mounted display device 1 may be an all-in-one machine; that is, it may independently perform computing functions, may have various client applications installed, and may display images, video data, and the like that it provides itself. Alternatively, the head-mounted display device 1 may be used in cooperation with a terminal device to implement a split device with an image display function; in this case, the terminal device bears the computing functions of the head-mounted display device, may have various client applications installed, and may provide the images and video data to be displayed to the head-mounted display device. Here, for convenience of description, the all-in-one machine and the split device may be collectively referred to as a head-mounted display device.
The head-mounted display device 1 described above may have a sensor that collects self-movement data and environmental data. The self-motion data may be, for example, angular velocity, acceleration, magnetic field data, and the like of the head-mounted display device 1. Accordingly, the sensors may be IMUs and magnetometers, etc. The environment data may be image data of the environment in which the head mounted display device is located, distance data to surrounding objects, magnetic field data of the surrounding environment, etc., and the corresponding sensor may be a camera, a distance sensor, a magnetometer, etc. The data measured by these sensors may be used for positioning calculations of the head mounted display device 1 itself, the body part of the user, and the surrounding environment, etc.
The head-mounted display device 1 described above can provide various services, for example, after the head-mounted display device is started, a six-degree-of-freedom operation can be performed on a display screen of the head-mounted display device according to a gesture motion of a user wearing the head-mounted display device.
It should be noted that the method for interacting with the head-mounted display device provided by the embodiments of the present disclosure is generally performed by the head-mounted display device 1, and accordingly, the apparatus for interacting with the head-mounted display device is generally disposed in the head-mounted display device 1.
Optionally, the method for interacting with the head-mounted display device provided by the embodiments of the present disclosure may also be performed by a terminal device connected with the head-mounted display device 1, and accordingly, the apparatus for interacting with the head-mounted display device may also be disposed in the terminal device.
Optionally, the method for interacting with the head-mounted display device provided by the embodiments of the present disclosure may be performed jointly by the head-mounted display device 1 and a terminal device connected to it. For example, the head-mounted display device may undertake acquiring data with its own sensors for subsequent calculations, such as the data used for detecting and recognizing gesture motions, while the terminal device may undertake the calculations themselves, such as calculating hand positions and the projection of the target hand. Correspondingly, the apparatus for interacting with the head-mounted display device may be partially disposed in the head-mounted display device and partially disposed in the terminal device.
The method for interacting with the head-mounted display device provided by the embodiment of the disclosure can establish a virtual handle and a virtual joystick for a user wearing the head-mounted display device. The user can operate the virtual handle and the virtual operating lever through gesture actions, and therefore interaction of a display picture of the head-mounted display device is achieved. Wherein the virtual joystick may be arranged on the virtual handle such that the virtual handle and the virtual joystick form one whole. The virtual joystick and the virtual handle may be provided separately.
The joystick mode may be activated when the user makes a preset joystick mode gesture motion. In a state where the joystick mode is activated, an input key value of the joystick may be determined by detecting the hand position of the user. Optionally, the virtual joystick coordinate system may be established based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device. Optionally, the input key values of the virtual joystick may be determined based on a projection of the target hand of the user on the virtual joystick coordinate system.
The handle mode may be activated when the user makes a preset handle mode gesture motion. In a state where the handle mode is activated, an input key value of the virtual handle may be determined by detecting a handle mode gesture motion of the user.
It is to be appreciated that in embodiments of the present disclosure, at least one of a joystick mode and a grip mode may be implemented.
Optionally, the position of the virtual handle in the world coordinate system can also be determined based on a designated hand joint position of the user, and a rotation matrix of the virtual handle coordinate system relative to the world coordinate system can be determined. It will be appreciated that in embodiments of the present disclosure, the position and rotation of the virtual handle may be determined by capturing images of the user's hand without performing the above-described joystick mode and handle mode activation operations.
The "hand position" in the above description may refer to a position of a hand of a user performing a virtual joystick operation, and may include a palm position and a finger position of the hand. The "target hand" in the above may refer to a hand that performs a virtual joystick operation or a handle operation. The "virtual joystick" in the above may refer to a virtual object formed in a virtual world for a user to perform a joystick operation on a display screen of a head-mounted display device. The "virtual handle" in the above may refer to a virtual object formed in the virtual world for a user to handle a display screen of the head-mounted display device. The "virtual joystick" and the "virtual handle" may be displayed on the display screen or may not be displayed, as long as the corresponding operation can be triggered in response to the user's action.
It should also be noted that, although the solution of the present disclosure may be applied to a head-mounted display device, it is not excluded that the solution may also be applied to the server 3, and the server 3 may be a background server. In case the solution of the present disclosure is applied to the server 3, the server activates the joystick mode in response to detecting that the user wearing the head-mounted display device has performed a preset joystick mode gesture action; determines, in a state where the joystick mode is activated, an origin position of the virtual joystick coordinate system based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device, and determines the virtual joystick coordinate system based on the origin position; and determines the input key value of the virtual joystick based on the projection of the target hand of the user on the virtual joystick coordinate system. In this case, the method for interacting with the head-mounted display device may be performed by the server 3, and accordingly, the means for interacting with the head-mounted display device may also be provided in the server 3.
Exemplary method
FIG. 2 is a flow diagram of a method for interacting with a head mounted display device in one embodiment of the present disclosure. The embodiment can be applied to an electronic device, as shown in fig. 2, and includes the following steps:
S0: A virtual handle and a virtual joystick capable of six-degree-of-freedom control of the head-mounted display device are created, and operations of the virtual handle or the virtual joystick are mapped to designated gesture actions of the user. Any operation of the virtual handle or the virtual joystick, such as a key operation, a trigger operation, or a joystick operation, is mapped to a corresponding gesture motion.
S2: the joystick mode is activated in response to detecting that a user wearing the head mounted display device has performed a preset joystick mode gesture action.
The head-mounted display device may be provided with a camera that captures images including the user's hand, and the joystick mode may be activated when it is recognized from the images that the user has performed the preset joystick mode gesture motion for activating the joystick mode. Illustratively, the preset gesture motion for activating the joystick mode may be a single fixed gesture, such as a gesture representing "six" or a gesture representing "making a phone call". The preset gesture motion for activating the joystick mode may also be a set of consecutive motions, such as the user's left thumb rotating counterclockwise or clockwise.
S4: In a state where the joystick mode is activated, an origin position of the virtual joystick coordinate system is determined based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device, and the virtual joystick coordinate system is determined based on the origin position of the virtual joystick coordinate system.
The origin position of the virtual joystick coordinate system may be determined based on a designated position of the user's hand; for example, the joint position of one designated finger of the user may be taken as the origin position, or the average position of the joint positions of a plurality of fingers may be taken as the origin position. The origin position may also be determined based on the position of the display screen of the head-mounted display device. For example, when the head-mounted display device provides a two-dimensional display screen, the position of the center point of the display screen, or a preset position at the lower left or lower right corner of the display screen, may be used as the origin position of the virtual joystick coordinate system. When the head-mounted display device provides a three-dimensional display screen, a two-dimensional display plane may be selected from the three-dimensional display screen, and the position of the center point of that display plane, or a preset position at its lower left or lower right corner, may be used as the origin position of the virtual joystick coordinate system.
After the origin position of the virtual joystick coordinate system is determined, the virtual joystick coordinate system may be established based on it. For example, an X-axis of the virtual joystick coordinate system, and a Y-axis perpendicular to the X-axis, may be defined from the determined origin position.
S6: and determining the input key value of the virtual joystick based on the projection of the target hand of the user on the virtual joystick coordinate system. The "target hand" in the present embodiment may refer to a hand that manipulates the virtual joystick.
The direction of the projection can be used as the manipulation direction of the virtual joystick, an operation key value of the joystick can be determined based on the end position of the projection, and the input key value of the virtual joystick can then be determined from the manipulation direction and the operation key value.
In this embodiment, after a user wearing the head-mounted display device activates the joystick mode through a preset gesture, the virtual joystick coordinate system is determined based on at least one of the user's hand position and the position of the display screen of the head-mounted display device, and the input key value of the virtual joystick is then determined from the projection of the user's target hand on the virtual joystick coordinate system. Joystick operation of the head-mounted display device is thus possible without a real joystick, improving user experience.
Fig. 3 is a schematic flow chart of step S4 in an embodiment of the present disclosure. As shown in fig. 3, step S4 may include:
S4-A-2: the origin position of the virtual joystick coordinate system is determined based on the finger joint position of the user when the joystick mode is activated.
The position of the root joint of the user's left index finger at the moment when the joystick mode is activated may be determined as the origin position of the virtual joystick coordinate system. The position of the root joint of the left thumb of the user at the moment when the joystick mode is activated can also be determined as the origin position of the virtual joystick coordinate system. One position associated with the user's designated finger joint position at the moment of activating the joystick mode may also be determined as the origin position of the virtual joystick coordinate system. The finger joint position of the user and the origin position of the virtual joystick coordinate system are mapped in advance, and the mapping relation can be adjusted. For example, for a user accustomed to controlling the joystick with the right hand, the position of the root joint of the left index finger may be replaced by the position of the root joint of the right index finger, or the position of the root joint of the left thumb may be replaced by the position of the root joint of the right thumb.
S4-A-4: and determining a direction which passes through the origin position of the virtual joystick coordinate system and is parallel to a first coordinate axis of the coordinate system of the head-mounted display device as a first coordinate axis direction of the virtual joystick coordinate system, and determining a direction which passes through the origin position of the virtual joystick coordinate system and is parallel to a second coordinate axis of the coordinate system of the head-mounted display device as a second coordinate axis direction of the virtual joystick coordinate system.
A direction passing through the origin position of the virtual joystick coordinate system and parallel to the X-axis of the coordinate system of the head mounted display device may be taken as the X-axis direction of the virtual joystick coordinate system, and a direction passing through the origin position of the virtual joystick coordinate system and parallel to the Z-axis of the coordinate system of the head mounted display device may be taken as the Y-axis direction of the virtual joystick coordinate system.
In this embodiment, a mapping relationship may be established between the finger joint position of the user and the virtual joystick, so as to determine a virtual joystick coordinate system, so that the user may implement control of the virtual joystick through finger joint movement, and the user experience is good. By using the finger joint position at the activation moment as the origin of the virtual joystick coordinate system, the key value of the joystick can be directly mapped by the user hand motion after the joystick mode is activated, the operation process is continuous, and the response speed is high.
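As a non-authoritative illustration of this embodiment, the sketch below (Python/NumPy) builds a virtual joystick frame from the finger-joint origin and the head-mounted display device's axes; the function and parameter names are hypothetical, not from the disclosure.

```python
import numpy as np

def build_joystick_frame(origin_pos, hmd_rotation):
    """Sketch of S4-A: virtual joystick coordinate frame.

    origin_pos   : (3,) world position of the designated finger joint at
                   the moment the joystick mode is activated (S4-A-2).
    hmd_rotation : (3, 3) matrix whose columns are the HMD coordinate
                   system's X, Y, Z axes expressed in world coordinates.

    Per S4-A-4, the joystick X-axis is taken parallel to the HMD X-axis
    and the joystick Y-axis parallel to the HMD Z-axis (one named option).
    """
    x_axis = hmd_rotation[:, 0] / np.linalg.norm(hmd_rotation[:, 0])
    y_axis = hmd_rotation[:, 2] / np.linalg.norm(hmd_rotation[:, 2])
    return origin_pos, x_axis, y_axis
```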
Fig. 4 is a schematic flowchart of step S4 in another embodiment of the present disclosure. As shown in fig. 4, step S4 may include:
S4-B-2: the origin position of the virtual joystick coordinate system is determined based on a preset position in a preset display plane of the display screen.
When the display screen of the head-mounted display device is a three-dimensional display screen, the preset display plane may be a two-dimensional display plane within the three-dimensional display screen. When the display screen of the head-mounted display device is a two-dimensional display screen, the preset display plane is that two-dimensional display screen.
The preset display plane may be a rectangular picture, a circular picture, or another regularly shaped picture, such as a centrally symmetric picture. The positions of all pixel points included in the preset display plane can be determined from the data of the preset display plane, and the position of any one pixel point in the preset display plane can be determined as the origin position of the virtual joystick coordinate system. For example, the center position of the preset display plane may be determined as the origin position of the virtual joystick coordinate system, or the position of one corner point of the preset display plane may be determined as the origin position.
S4-B-4: and determining a first preset direction of a preset display plane as a first coordinate axis direction of the virtual joystick coordinate system, and determining a second preset direction of the preset display plane as a second coordinate axis direction of the virtual joystick coordinate system.
When the predetermined display plane is a rectangular screen, a direction passing through the origin position of the virtual joystick coordinate system and parallel to one side of the rectangular screen may be taken as a first predetermined direction, i.e., an X-axis direction of the virtual joystick coordinate system. And taking the direction which passes through the origin position of the virtual joystick coordinate system and is parallel to the other side of the rectangular picture as a second preset direction, namely the Y-axis direction of the virtual joystick coordinate system. One side of the rectangular picture is perpendicular to the other side of the rectangular picture.
When the display screen is a circular screen, a diameter direction of the circular screen passing through the origin position may be used as a first preset direction, that is, an X-axis direction of the virtual joystick coordinate system. And taking a diameter direction passing through the origin position and perpendicular to the X axis of the virtual joystick coordinate system as a second preset direction, namely the Y axis direction of the virtual joystick coordinate system.
When the display screen is a trapezoid, a direction passing through the origin position and parallel to the bottom edge of the trapezoid screen may be taken as a first preset direction, i.e., the X-axis direction of the virtual joystick coordinate system. And taking the direction which passes through the original position and is perpendicular to the bottom edge of the trapezoid screen as a second preset direction, namely the Y-axis direction of the virtual joystick coordinate system.
In this embodiment, the virtual joystick coordinate system is determined based on the display screen of the head-mounted display device, so that when the user operates the virtual joystick, the feeling similar to touch operation can be brought, and the user experience is good.
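A minimal sketch of this embodiment under the same hypothetical naming: deriving the joystick frame from a rectangular display plane, taking its center as the origin and its two perpendicular edges as the axes.

```python
import numpy as np

def frame_from_rect_plane(center, edge_u, edge_v):
    """Sketch of S4-B: joystick frame from a rectangular display plane.

    center : (3,) world position of the plane's center point (the origin).
    edge_u : (3,) direction along one side of the rectangle (X-axis).
    edge_v : (3,) direction along the perpendicular side (Y-axis).
    """
    x_axis = edge_u / np.linalg.norm(edge_u)
    y_axis = edge_v / np.linalg.norm(edge_v)
    return center, x_axis, y_axis
```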
Fig. 5 is a schematic flow chart of step S6 in an embodiment of the present disclosure. As shown in fig. 5, step S6 may include:
s6-2: and acquiring orthogonal projection coordinates of the target finger of the target hand in a virtual joystick coordinate system. The orthographic projection coordinates of the target finger on the virtual joystick coordinate system may include orthographic projection coordinates of the fingertip of the target finger on the virtual joystick coordinate system.
S6-4: and carrying out normalization processing on the orthogonal projection coordinates, and determining the input key value of the virtual joystick.
The manipulation direction of and manipulation distance to the virtual joystick are determined based on the orthogonal projection coordinates of the fingertip of the target finger on the virtual joystick coordinate system. The manipulation direction may be represented by an angle, such as 90° when the user dials the virtual joystick upward, 270° when the user dials it downward, and so on. After the manipulation distance is normalized, the input key value of the virtual joystick can be determined according to the manipulation direction and the normalized manipulation distance.
In this embodiment, the input key value of the virtual joystick can be obtained quickly and accurately by computing the orthogonal projection coordinates of the user's target finger in the virtual joystick coordinate system and normalizing them.
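The sketch below shows one way step S6-2 could compute the fingertip's orthographic projection onto the joystick plane and derive a manipulation direction and distance; it is an illustrative assumption, not the disclosure's mandated implementation.

```python
import numpy as np

def project_fingertip(fingertip, origin, x_axis, y_axis):
    """Sketch of S6-2: orthographic projection of the fingertip onto the
    virtual joystick plane, expressed in joystick coordinates."""
    d = fingertip - origin
    u = np.dot(d, x_axis)   # coordinate on the joystick X-axis
    v = np.dot(d, y_axis)   # coordinate on the joystick Y-axis
    direction = np.degrees(np.arctan2(v, u)) % 360.0  # e.g. 90 deg = up
    distance = np.hypot(u, v)   # manipulation distance before normalization
    return u, v, direction, distance
```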
In one embodiment of the present disclosure, step S6-4 may include: normalizing the orthogonal projection coordinates based on a preset minimum coordinate absolute value, a preset numerical processing width, and a preset overshoot coefficient of the virtual joystick coordinate system, and determining the input key value of the virtual joystick.
The preset minimum coordinate absolute value may define the minimum absolute value that the orthogonal projection coordinates must exceed to trigger a response. For example, when the minimum coordinate absolute value is V and the coordinate value on one coordinate axis (for example, the x-axis or the y-axis) of the orthogonal projection coordinates lies within [-V, V], the movement is likely caused by a slight shake of the user's finger, which the user may not want the head-mounted display device to respond to; the preset minimum coordinate absolute value thus reduces the probability of misoperation.
The preset numerical processing width may define the value range of the input key value; for example, when the preset numerical processing width is 1, the value range of the input key value may be [-1, 1].
The preset overshoot coefficient allows the coordinate value of the orthogonal projection coordinates to slightly exceed the standard coordinate value range; the input key value corresponding to such a coordinate value is still determined as the input key value corresponding to the boundary of the standard range. For example, let the standard coordinate value range be [-K, K]. When the coordinate value on a coordinate axis (for example, the x-axis or the y-axis) of the orthogonal projection coordinates lies within [-K-L, -K), the corresponding input key value equals the input key value corresponding to the coordinate value -K; when it lies within (K, K+L], the corresponding input key value equals the input key value corresponding to the coordinate value K. Here K and L may be numbers greater than 0.
In this embodiment, the input key value of the virtual joystick is determined based on the preset minimum coordinate absolute value, the preset numerical processing width, and the preset overshoot coefficient of the virtual joystick coordinate system, which reduces the probability of user misoperation and improves the probability of responding to the user's intended operation, giving a good user experience.
In an embodiment of the present disclosure, normalizing the orthogonal projection coordinates based on a preset minimum coordinate absolute value, a preset numerical processing width, and a preset overshoot coefficient of the virtual joystick coordinate system, and determining an input key value of the virtual joystick, may include:
S6-4-2: Input values are determined based on the orthogonal projection coordinates. The orthogonal projection coordinate of the fingertip on the x-axis is denoted x_i, and x_i may be taken as the input value.
S6-4-4: and performing normalization processing based on the input value, and determining the input key value of the virtual joystick.
In one embodiment of the present disclosure, step S6-4-4 may include:
determining a first numerical processing range based on the preset minimum coordinate absolute value; determining a second numerical processing range based on the preset minimum coordinate absolute value and the preset numerical processing width; and determining a third numerical processing range based on the preset minimum coordinate absolute value, the preset numerical processing width, and the preset overshoot coefficient. The maximum boundary of the first numerical range is less than that of the second, and the minimum boundary of the first is greater than that of the second; likewise, the maximum boundary of the second numerical range is less than that of the third, and the minimum boundary of the second is greater than that of the third.
And if the input value is within the first numerical value processing range, determining the input key value to be 0.
And if the input value is positioned outside the first numerical value processing range and within the second numerical value processing range, determining an input key value based on the input value, a preset minimum coordinate absolute value and the preset numerical value processing width.
And if the input value is positioned outside the second numerical value processing range and within the third numerical value processing range, determining the input key value to be one of 1 and-1 based on the size relation between the input value and 0.
And if the input value is outside the third numerical value processing range, determining that the input key value is 0.
Fig. 6 is a schematic diagram of normalization processing based on input values in one example of the present disclosure. As shown in fig. 6, in this example the preset minimum coordinate absolute value is 0.2, the preset numerical processing width is 1, and the preset overshoot coefficient is 1.2.
When x_i is within the rectangle filled with vertical bars (the first numerical processing range), i.e., -0.2 < x_i ≤ 0.2, the input key value = 0.
When -1.2 < x_i ≤ -0.2 or 0.2 < x_i ≤ 1.2 (the input value is outside the first numerical processing range and within the second; x_i is outside the vertical-bar rectangle and inside the horizontal-bar rectangle), the input key value is determined as follows: when -1.2 < x_i ≤ -0.2, the input key value = x_i + 0.2; when 0.2 < x_i ≤ 1.2, the input key value = x_i - 0.2. This keeps the input key value within [-1, 1].
When -1.4 < x_i ≤ -1.2 or 1.2 < x_i ≤ 1.4 (the input value is outside the second numerical processing range and within the third; x_i is outside the horizontal-bar rectangle and inside the diagonally filled rectangle), the input key value is determined as follows: when -1.4 < x_i ≤ -1.2, the input key value = -1; when 1.2 < x_i ≤ 1.4, the input key value = 1.
When x_i < -1.4 or x_i > 1.4 (the input value is outside the third numerical processing range; x_i is outside the diagonally filled rectangle), the input key value = 0.
Note that the processing of the orthogonal projection coordinate y_i is similar to that of x_i.
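As a sketch only, the piecewise normalization above can be written as follows. The parameter values are taken from the Fig. 6 example; the function name, and the assumption that the second and third range boundaries derive as v_min + width (1.2) and v_min + width × overshoot (1.4), are illustrative, consistent with the example's boundaries but not stated by the disclosure.

```python
def normalize_axis(x, v_min=0.2, width=1.0, overshoot=1.2):
    """Piecewise normalization of one projection coordinate (Fig. 6 example)."""
    second = v_min + width              # 1.2: boundary of the second range
    third = v_min + width * overshoot   # 1.4: boundary of the third range
    a = abs(x)
    if a <= v_min:            # first range: slight finger shake -> no input
        return 0.0
    if a <= second:           # second range: basic manipulation
        return x - v_min if x > 0 else x + v_min
    if a <= third:            # third range: small overshoot clamps to +/-1
        return 1.0 if x > 0 else -1.0
    return 0.0                # beyond the third range: treated as no input
```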
In this embodiment, the first, second, and third numerical processing ranges are determined based on the preset minimum coordinate absolute value, the preset numerical processing width, and the preset overshoot coefficient of the virtual joystick coordinate system. Through the first numerical processing range, a slight shake of the user's finger does not manipulate the virtual joystick, which reduces the probability of misoperation. The second numerical processing range allows a response when the user performs a basic manipulation of the virtual joystick. Through the third numerical processing range, a slightly out-of-range manipulation by the user can be treated as a normal manipulation.
Thus, when the user manipulates the virtual joystick with a large amplitude, a response can still be given within the largest manipulation range the virtual joystick allows. When the input value exceeds the third numerical processing range, it can be taken to mean that the user is not operating the virtual joystick at that moment, and no response is given, so misoperation can be effectively avoided.
The method for interacting with a head-mounted display device of the embodiments of the present disclosure may further include:
S8: In response to detecting that the user has performed a preset handle mode gesture action, the handle mode is activated.
S10: In a state where the handle mode is activated, an input key value of the virtual handle is determined based on the handle mode gesture motion. The handle mode gesture action includes at least one of a key action and a gripping action, and the input key values of the virtual handle include input key values for key actions and input key values for gripping actions.
Before the handle mode is activated, the mapping relationship between gesture motions and control of the virtual handle or the virtual joystick can be preset. The gesture actions in the mapping relationship may include handle mode gesture actions and joystick mode gesture actions. Key operations and trigger operations of the virtual handle can be realized through handle mode gesture actions, and joystick operations of the virtual joystick can be realized through joystick mode gesture actions.
FIG. 7 is a schematic diagram of gestural actions controlling a virtual handle or a virtual joystick in one example of the present disclosure. As shown in fig. 7, the gesture motion in (a) in fig. 7 may be mapped with a first key operation of the virtual handle (e.g., a key operation of a key a); the gesture motion of fig. 7 (B) may be mapped to a grip key that grips the virtual handle; the gesture motion of (C) in fig. 7 may be mapped to a trigger pressing the virtual handle; the gesture action in (D) in fig. 7 may be mapped with a second key operation of the virtual handle (e.g., a key operation of the Home key); the gesture motion of (E) in fig. 7 may be mapped to activating a joystick mode; the gesture motion in fig. 7 (F) may be mapped to a third key operation of the virtual handle (e.g., a key operation of the B key).
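Purely for illustration, the Fig. 7 mapping could be represented as a lookup table; the gesture identifiers and operation names below are assumptions, not the disclosure's.

```python
# Hypothetical gesture-to-operation mapping following Fig. 7
GESTURE_MAP = {
    "gesture_a": "key_A",        # Fig. 7 (A): first key operation (A key)
    "gesture_b": "grip_key",     # Fig. 7 (B): grip key of the virtual handle
    "gesture_c": "trigger",      # Fig. 7 (C): trigger press
    "gesture_d": "key_Home",     # Fig. 7 (D): second key operation (Home key)
    "gesture_e": "joystick_on",  # Fig. 7 (E): activate joystick mode
    "gesture_f": "key_B",        # Fig. 7 (F): third key operation (B key)
}
```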
After the display screen is provided through the head-mounted display device, if it is detected that the user performs a preset handle mode gesture motion, the handle mode may be activated.
In the state of the handle mode being turned on, if it is detected that the user performs one of handle mode gesture actions as in (a), (B), (C), (D), and (F) of fig. 7, an input key value of the virtual handle may be determined based on the handle mode gesture action.
In this embodiment, when the user uses the wearable device, the handle mode may be activated upon detecting that the user has performed a preset handle mode gesture action. In the state where the handle mode is activated, the input key values of the virtual handle can be determined according to the user's gesture actions, so that the wearable device can be operated by handle without being equipped with a real handle.
In an embodiment of the present disclosure, determining an input key value of a virtual handle based on a handle mode gesture motion may specifically include:
s10-2: an input distance value to the virtual handle is determined based on the handle mode gesture motion.
The distance value between two finger joints when the user performs the handle mode gesture motion can be recognized by means of image recognition and used as the input distance value; for example, the distance between two finger joints when the trigger of the virtual handle is operated, or the distance between two hand positions when a gripping operation is performed.
S10-4: and based on the input distance value, the preset critical coefficient, the maximum processing distance threshold value and the minimum processing distance threshold value, carrying out normalization processing on the input distance value, and determining an input key value of the virtual handle.
A critical coefficient, denoted threshold, may be defined as a fraction between 0 and 1; a maximum processing distance threshold is denoted max, and a minimum processing distance threshold is denoted min. The difference between max and min may be used as the processing distance, denoted length, where length must not equal 0.
If length is not greater than 0, 0 is output (no operation is triggered), representing that at least one of the maximum and minimum processing distance thresholds is set incorrectly, and processing stops. If the input distance value is less than or equal to max − threshold × length, representing that the user has pressed the key of the virtual handle to the maximum key depth, 1 is output and processing stops. If the input distance value is less than max, (max − input distance value) / (threshold × length) is output, representing that the user has pressed the key of the virtual handle but not to the maximum key depth, and processing stops. If the input distance value is greater than or equal to max, 0 is output, representing that the maximum processing distance threshold is set incorrectly, and processing stops. In one example of the present disclosure, min = 5 mm, max = 15 mm, and threshold = 0.5.
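A runnable sketch of this normalization, using the example values min = 5 mm, max = 15 mm, threshold = 0.5 (the function name and error-handling style are assumptions):

```python
def normalize_trigger(distance_mm, min_mm=5.0, max_mm=15.0, threshold=0.5):
    """Normalize a finger-to-finger distance into a virtual trigger key
    value in [0, 1], following the stepwise rules above."""
    length = max_mm - min_mm
    if length <= 0:
        return 0.0                    # threshold configuration error
    if distance_mm <= max_mm - threshold * length:
        return 1.0                    # pressed to maximum key depth
    if distance_mm < max_mm:
        # pressed, but not to maximum depth, e.g. 12 mm -> 0.6
        return (max_mm - distance_mm) / (threshold * length)
    return 0.0                        # at or beyond max: no press registered
```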
In this embodiment, based on the handle mode gesture motion, the input distance value of the virtual handle can be accurately normalized using the input distance value, the preset critical coefficient, the maximum processing distance threshold, and the minimum processing distance threshold; the input distance value is thus fuzzily matched so that the input key value of the virtual handle falls within a reasonable range, improving the operation success rate of the virtual handle and the user experience.
The method for interacting with a head-mounted display device of the embodiments of the present disclosure may further include:
s12: the position of the virtual handle is determined based on a designated hand joint position of a user wearing the head mounted display device.
The palm center position, the index finger root joint position, or the middle finger root joint position can be determined as the position of the virtual handle. The position of the virtual handle may also be determined as the average of the joint positions of a plurality of fingers.
S14: a virtual handle coordinate system is established for the user, and a rotation matrix of the virtual handle coordinate system relative to a world coordinate system is determined.
First, a forward vector and a secondary vector are determined in the world coordinate system; the forward vector and the secondary vector are non-parallel. Then, the cross product of the secondary vector and the forward vector is calculated, yielding a vector orthogonal to both, denoted the left vector. Next, the cross product of the forward vector and the left vector is calculated, yielding the up vector, so that the forward, left, and up vectors are pairwise orthogonal. A 3 × 3 matrix is then created from these vectors: the first row is the forward vector, the second row the left vector, and the third row the up vector. This matrix can be used as the rotation matrix of the handle coordinate system relative to the world coordinate system.
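A minimal NumPy sketch of this construction (the vector sources, e.g. shoulder-to-palm, are given by the variants of step S14 below; normalizing the rows is an added assumption so that the result is a proper rotation matrix):

```python
import numpy as np

def handle_rotation(forward, secondary):
    """Rotation matrix of the virtual handle coordinate system relative
    to the world coordinate system, per the construction above.
    forward and secondary must be non-parallel 3-vectors."""
    forward = forward / np.linalg.norm(forward)
    left = np.cross(secondary, forward)    # orthogonal to both inputs
    left = left / np.linalg.norm(left)
    up = np.cross(forward, left)           # completes the orthogonal triad
    return np.vstack([forward, left, up])  # rows: forward, left, up
```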
In this embodiment, the position of the virtual handle may be determined based on the designated hand joint position of the user, and the rotation matrix of the handle coordinate system of the virtual handle with respect to the world coordinate system (i.e., the posture of the virtual handle) may be determined, and further, the designated function of the virtual handle may be realized based on the position and posture of the virtual handle, for example, a function of issuing a ray pointing to the display screen of the head-mounted display device from a set ray-emitting position in the virtual handle based on the position of the virtual handle, and selecting or manipulating a target object in the display screen by the ray.
In one embodiment of the present disclosure, step S14 may include:
S14-A: a rotation matrix of the handle coordinate system relative to the world coordinate system is determined based on the virtual shoulder position of the user, the hand position, and the coordinate system of the head mounted display device.
The vector from the user's right shoulder (or left shoulder) to the palm center in the coordinate system of the head-mounted display device can be used as the forward vector, and the y-axis direction of the coordinate system of the head-mounted display device as the secondary vector; the rotation matrix of the handle coordinate system relative to the world coordinate system can then be determined.
S14-B: a rotation matrix of the handle coordinate system relative to the world coordinate system is determined based on the virtual shoulder position of the user, the finger joint position, and the coordinate system of the head mounted display device.
The vector from the user's right shoulder (or left shoulder) to the palm center in the coordinate system of the head-mounted display device can be used as the forward vector, and the vector from the root of the little finger to the root of the index finger as the secondary vector; the rotation matrix of the handle coordinate system relative to the world coordinate system can then be determined.
S14-C: a rotation matrix of the handle coordinate system relative to the world coordinate system is determined based on the hand position of the user.
The vector from the center of the user's wrist to the root of the middle finger can be used as the forward vector, and the vector from the root of the little finger to the root of the index finger as the secondary vector; the rotation matrix of the handle coordinate system relative to the world coordinate system is then determined.
S14-D: a rotation matrix of the handle coordinate system relative to the world coordinate system is determined based on the palm center and finger joint positions of the user.
The rotation of the user's palm with respect to the index finger root position (or middle finger root position) may be used to determine a rotation matrix of the handle coordinate system with respect to the world coordinate system.
In this embodiment, the handle coordinate system can be chosen individually according to the user's needs, and the rotation matrix of the handle coordinate system relative to the world coordinate system determined accordingly, which provides a high degree of freedom.
The method for interacting with a head-mounted display device of the embodiments of the present disclosure may further include:
S16: An operation input value of the user on the virtual handle is acquired.
The user's gesture actions can be recognized by means of image recognition, and when they include a pre-mapped gesture action that operates the virtual handle, an operation input value of the virtual handle is obtained based on the recognized gesture action.
S18: If the operation input value is smaller than the minimum boundary threshold of the operation input value, the minimum boundary threshold is updated to the operation input value.
S20: If the operation input value is larger than the maximum boundary threshold of the operation input value, the maximum boundary threshold is updated to the operation input value.
In this embodiment, the operation input value of the virtual handle may be compared with the minimum boundary threshold and the maximum boundary threshold, respectively, and whether to update the boundary thresholds is determined according to the comparison result, so that the setting of the minimum boundary threshold and the maximum boundary threshold may be more suitable for the operation habit of the user.
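For illustration only, this adaptive boundary update could look like the following (names assumed):

```python
def update_bounds(value, lo, hi):
    """Widen the per-user operation input boundaries when a new
    observation falls outside them (steps S18/S20)."""
    if value < lo:
        lo = value   # S18: update the minimum boundary threshold
    if value > hi:
        hi = value   # S20: update the maximum boundary threshold
    return lo, hi
```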
In one embodiment of the present disclosure, step S10 may include: in response to detecting that the user performs a trigger operation through the virtual handle, determining an input value of the trigger operation, such as a floating-point key value, based on the distance between the fingertip position of a first preset finger and the fingertip position of a second preset finger with which the user performs the trigger operation. The first preset finger may be the thumb of the hand performing the trigger operation, and the second preset finger may be the index finger or middle finger of that hand.
In this embodiment, when the user performs a trigger operation on the virtual handle, the distance between two fingers of the hand performing the trigger operation is determined as the operation input value of the trigger operation, which is a well-justified measure.
In another embodiment of the present disclosure, step S10 may include: in response to detecting that the user performs a gripping operation on the virtual handle, determining the sum of the distances from the fingertips of a plurality of preset fingers of the hand performing the gripping operation to the palm center as the input value of the gripping operation.
In this embodiment, when the user performs the grasping operation on the virtual handle, the sum of the distances from the fingertips of the plurality of preset fingers of the hand performing the grasping operation to the palm center can be used to reasonably determine the input value of the grasping operation.
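A sketch of this gripping-operation input value (the finger set and names are assumptions):

```python
import numpy as np

def grip_input_value(fingertips, palm_center):
    """Sum of distances from the preset fingertips to the palm center,
    used as the gripping operation's input value."""
    return float(sum(np.linalg.norm(tip - palm_center) for tip in fingertips))
```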
The method for interacting with a head-mounted display device of the embodiments of the present disclosure may further include: filtering the operation input value. For example, when an input distance value is determined based on a finger joint position, filtering may be applied to that finger joint position; when the input distance value is determined based on the average position of a plurality of finger joints, filtering may be applied to the plurality of finger joints.
In this embodiment, filtering the operation input value improves its accuracy, which in turn improves the accuracy of the input key value of the virtual handle or the virtual joystick, and thus the fidelity between the user's gesture actions and the response of the head-mounted display device.
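The disclosure does not name a particular filter; as one possibility, a simple exponential moving average over the tracked positions or input values could look like the following sketch (the class and parameter names are assumptions):

```python
class ExponentialSmoother:
    """Simple low-pass filter for joint positions or operation input values."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha  # smoothing factor in (0, 1]; smaller = smoother
        self.state = None

    def filter(self, sample):
        if self.state is None:
            self.state = sample  # initialize on the first sample
        else:
            self.state = self.alpha * sample + (1.0 - self.alpha) * self.state
        return self.state
```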
Any of the methods for interacting with a head-mounted display device provided by the embodiments of the present disclosure may be performed by any suitable device having data processing capabilities, including but not limited to terminal devices and servers. Alternatively, any of these methods may be executed by a processor, for example by the processor calling corresponding instructions stored in a memory to execute any of the methods mentioned in the embodiments of the present disclosure. This is not elaborated further below.
Exemplary devices
FIG. 8 is a block diagram of an apparatus for interacting with a head mounted display device in one embodiment of the present disclosure. As shown in fig. 8, an apparatus for interacting with a head-mounted display device includes:
a joystick mode activation module 100, configured to activate a joystick mode when it is detected that the user performs a preset joystick mode gesture action;
a virtual joystick coordinate system determination module 200, configured to determine an origin position of a virtual joystick coordinate system based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device in a state where the joystick mode is activated, and to determine the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system;
an input key value determining module 300, configured to determine an input key value of the virtual joystick based on a projection of the target hand of the user on the virtual joystick coordinate system.
In one embodiment of the present disclosure, the virtual joystick coordinate system determination module 200 is configured to determine an origin position of the virtual joystick coordinate system based on a finger joint position of the user when the joystick mode is activated; the virtual joystick coordinate system determining module 200 is further configured to determine a direction passing through the origin position and parallel to a first coordinate axis of the coordinate system of the head-mounted display device as a first coordinate axis direction of the virtual joystick coordinate system, and determine a direction passing through the origin position and parallel to a second coordinate axis of the coordinate system of the head-mounted display device as a second coordinate axis direction of the virtual joystick coordinate system.
In another embodiment of the present disclosure, the virtual joystick coordinate system determination module 200 is configured to determine an origin position of the virtual joystick coordinate system based on a preset position in a preset display plane of the display screen; the virtual joystick coordinate system determining module 200 is further configured to determine a first preset direction of the preset display plane as a first coordinate axis direction of the virtual joystick coordinate system, and determine a second preset direction of the preset display plane as a second coordinate axis direction of the virtual joystick coordinate system.
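The two origin/axis options handled by the virtual joystick coordinate system determination module 200 can be sketched as follows; all function and parameter names are assumed for illustration.

```python
import numpy as np

def frame_from_hand(joint_position, hmd_axis_1, hmd_axis_2):
    """Option 1: origin at a finger joint position; axes through the origin,
    parallel to the first and second axes of the HMD coordinate system."""
    return (np.asarray(joint_position),
            hmd_axis_1 / np.linalg.norm(hmd_axis_1),
            hmd_axis_2 / np.linalg.norm(hmd_axis_2))

def frame_from_display(preset_point, plane_dir_1, plane_dir_2):
    """Option 2: origin at a preset position in the preset display plane;
    axes along two preset directions of that plane."""
    return (np.asarray(preset_point),
            plane_dir_1 / np.linalg.norm(plane_dir_1),
            plane_dir_2 / np.linalg.norm(plane_dir_2))
```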
In an embodiment of the present disclosure, the input key value determination module 300 is configured to obtain an orthogonal projection coordinate of a target finger of the target hand in the virtual joystick coordinate system; the input key value determining module 300 is further configured to perform normalization processing on the orthogonal projection coordinates, and determine an input key value of the virtual joystick.
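The orthogonal projection step can be sketched as follows, assuming a frame (origin plus two unit axes) such as the one returned above:

```python
import numpy as np

def orthogonal_projection(finger_position, origin, x_axis, y_axis):
    """2D coordinates of the target finger in the virtual joystick plane,
    obtained by projecting its offset from the origin onto the two axes."""
    offset = np.asarray(finger_position) - np.asarray(origin)
    return float(np.dot(offset, x_axis)), float(np.dot(offset, y_axis))
```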
In an embodiment of the present disclosure, the input key value determining module 300 is configured to perform normalization processing on the orthogonal projection coordinate based on a preset minimum coordinate absolute value, a preset numerical value processing width, and a preset overshoot coefficient of the virtual joystick coordinate system, and determine the input key value of the virtual joystick, where the preset minimum coordinate absolute value is a coordinate value of the orthogonal projection coordinate.
In one embodiment of the present disclosure, the input key value determination module 300 is configured to determine an input value based on the orthogonal projection coordinates; the input key value determining module 300 is further configured to determine a first numerical processing range based on the preset minimum coordinate absolute value, determine a second numerical processing range based on the preset minimum coordinate absolute value and the preset numerical processing width, and determine a third numerical processing range based on the preset minimum coordinate absolute value, the preset numerical processing width, and the preset overshoot coefficient, where the maximum numerical boundary of the first numerical processing range is smaller than the maximum numerical boundary of the second numerical processing range, the minimum numerical boundary of the first numerical processing range is greater than the minimum numerical boundary of the second numerical processing range, the maximum numerical boundary of the second numerical processing range is smaller than the maximum numerical boundary of the third numerical processing range, and the minimum numerical boundary of the second numerical processing range is greater than the minimum numerical boundary of the third numerical processing range; the input key value determining module 300 is further configured to determine that the input key value is 0 if the input value is within the first numerical processing range; to determine the input key value based on the input value, the preset minimum coordinate absolute value, and the preset numerical processing width if the input value is outside the first numerical processing range and within the second numerical processing range; to determine that the input key value is one of 1 and -1, based on whether the input value is greater or less than 0, if the input value is outside the second numerical processing range and within the third numerical processing range; and to determine that the input key value is 0 if the input value is outside the third numerical processing range.
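For illustration, one reading of this piecewise normalization is sketched below. The text does not spell out the exact range boundaries, so the formulas — in particular composing the third-range boundary as (min_abs + width) * (1 + overshoot) — are assumptions.

```python
def normalize_axis(value, min_abs, width, overshoot):
    """Map a raw projection coordinate to a joystick input key value in [-1, 1].

    min_abs   -- preset minimum coordinate absolute value (dead-zone radius)
    width     -- preset numerical value processing width (linear band)
    overshoot -- preset overshoot coefficient (tolerated excess beyond the band)
    """
    magnitude = abs(value)
    linear_edge = min_abs + width
    if magnitude <= min_abs:                 # first range: dead zone
        return 0.0
    if magnitude <= linear_edge:             # second range: linear mapping
        t = (magnitude - min_abs) / width
        return t if value > 0 else -t
    if magnitude <= linear_edge * (1.0 + overshoot):  # third range: saturate
        return 1.0 if value > 0 else -1.0
    return 0.0                               # outside all ranges: ignored
```

With the illustrative values min_abs=0.01, width=0.05, and overshoot=0.2, a projection coordinate of 0.03 maps to (0.03 - 0.01) / 0.05 = 0.4.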
In one embodiment of the present disclosure, the apparatus for interacting with a head-mounted display device may further include a virtual handle processing module for activating a handle mode in response to detecting that the user has performed a preset handle mode gesture action; the virtual handle processing module is further configured to determine an input key value of the virtual handle based on the handle mode gesture action in a state where the handle mode is activated, where the handle mode gesture action includes at least one of a key action and a grip action, and the input key value of the virtual handle includes an input key value of the key action and an input key value of the grip action.
In one embodiment of the present disclosure, the virtual handle processing module is to determine an input distance value to the virtual handle based on the handle mode gesture action; the virtual handle processing module is further configured to perform normalization processing on the input distance value based on the input distance value, a preset critical coefficient, a maximum processing distance threshold value and a minimum processing distance threshold value, and determine an input key value of the virtual handle.
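A sketch of this distance normalization is given below. The role of the preset critical coefficient is interpreted here as a snapping margin near the range boundaries; this is an assumption, since the text does not give the formula.

```python
def normalize_handle_input(distance, d_min, d_max, critical=0.05):
    """Map a raw input distance to a handle input key value in [0, 1].

    d_min / d_max -- minimum / maximum processing distance thresholds
    critical      -- preset critical coefficient, assumed to snap values
                     near the boundaries to exactly 0 or 1
    """
    span = d_max - d_min
    if span <= 0:
        return 0.0
    t = (distance - d_min) / span
    t = min(max(t, 0.0), 1.0)  # clamp into [0, 1]
    if t < critical:
        return 0.0             # snap to fully released
    if t > 1.0 - critical:
        return 1.0             # snap to fully pressed/gripped
    return t
```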
In one embodiment of the present disclosure, the virtual handle processing module is configured to obtain an operation input value of the user on the virtual handle; the virtual handle processing module is further configured to update the minimum boundary threshold to the operation input value if the operation input value is smaller than the minimum boundary threshold of the operation input value, and to update the maximum boundary threshold to the operation input value if the operation input value is greater than the maximum boundary threshold of the operation input value.
In an embodiment of the disclosure, the virtual handle processing module is configured to, in response to detecting a trigger operation performed by the user through the virtual handle, determine an input value of the trigger operation based on the distance between the fingertip position of a first preset finger and the fingertip position of a second preset finger of the hand performing the trigger operation.
In another embodiment of the present disclosure, the virtual handle processing module is configured to, in response to detecting that the user performs a gripping operation on the virtual handle, determine the sum of the distances from the fingertips of a plurality of preset fingers of the hand performing the gripping operation to the palm center as the input value of the gripping operation.
It should be noted that the specific implementation of the apparatus for interacting with a head-mounted display device in the embodiments of the present disclosure is similar to that of the method for interacting with a head-mounted display device; for details, refer to the corresponding description of the method, which is not repeated here to reduce redundancy.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present disclosure is described with reference to fig. 9. As shown in fig. 9, the electronic device includes one or more processors 10 and a memory 20.
The processor 10 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device to perform desired functions.
Memory 20 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 10 to implement the methods for interacting with head mounted display devices of the various embodiments of the present disclosure described above and/or other desired functionality. Various contents such as an input signal, a signal component, a noise component, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device may further include: an input device 30 and an output device 40, which are interconnected by a bus system and/or other form of connection mechanism (not shown). The input device 30 may be, for example, a keyboard, a mouse, or the like. Output device 40 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for simplicity, only some of the components of the electronic device relevant to the present disclosure are shown in fig. 9, omitting components such as buses, input/output interfaces, and the like. In addition, the electronic device may include any other suitable components, depending on the particular application.
Exemplary computer readable storage medium
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In addition, the embodiment of the present disclosure further provides a head mounted display device, which includes the apparatus for interacting with the head mounted display device of the above-mentioned embodiment.
The foregoing describes the general principles of the present disclosure in conjunction with specific embodiments. However, the advantages, effects, and the like mentioned in the present disclosure are merely examples and are not limiting; they should not be considered essential to the various embodiments of the present disclosure. Furthermore, the specific details disclosed above are for the purpose of illustration and description only and are not intended to be limiting, since the disclosure is not limited to the specific details so described.
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts in the embodiments are referred to each other. For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The block diagrams of devices, apparatuses, and systems involved in the present disclosure are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including," "comprising," and "having" are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the term "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustration only, and the steps of the method of the present disclosure are not limited to the order specifically described above unless specifically stated otherwise. Further, in some embodiments, the present disclosure may also be embodied as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the apparatus, devices, and methods of the present disclosure, various components or steps may be broken down and/or re-combined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the disclosure to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (15)

1. A method for interacting with a head mounted display device, comprising:
activating a joystick mode in response to detecting that a user wearing the head mounted display device has performed a preset joystick mode gesture action;
determining an origin position of a virtual joystick coordinate system based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device in a state in which the joystick mode is activated, and determining the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system;
and determining an input key value of the virtual joystick based on the projection of the target hand of the user on the virtual joystick coordinate system.
2. The method of claim 1, wherein the determining an origin position of a virtual joystick coordinate system based on at least one of the user's hand position and the position of the display, and the determining the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system, comprises:
determining a position of an origin of the virtual joystick coordinate system based on a finger joint position of the user when the joystick mode is activated;
and determining a direction which passes through the origin position and is parallel to a first coordinate axis of a coordinate system of the head-mounted display device as a first coordinate axis direction of the virtual joystick coordinate system, and determining a direction which passes through the origin position and is parallel to a second coordinate axis of the coordinate system of the head-mounted display device as a second coordinate axis direction of the virtual joystick coordinate system.
3. The method of claim 1, wherein the determining an origin position of a virtual joystick coordinate system based on at least one of the user's hand position and a position of a display, and the determining the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system, comprises:
determining the position of an origin of the virtual joystick coordinate system based on a preset position in a preset display plane of the display picture;
and determining a first preset direction of the preset display plane as a first coordinate axis direction of the virtual operating lever coordinate system, and determining a second preset direction of the preset display plane as a second coordinate axis direction of the virtual operating lever coordinate system.
4. The method of any one of claims 1-3, wherein the determining input key values for a virtual joystick based on a projection of the user's target hand on the virtual joystick coordinate system comprises:
acquiring orthogonal projection coordinates of a target finger of the target hand in the virtual control rod coordinate system;
and carrying out normalization processing on the orthogonal projection coordinates, and determining the input key value of the virtual control lever.
5. The method of claim 4, wherein the normalizing the orthogonal projection coordinates to determine the input key value of the virtual joystick comprises:
and based on a preset minimum coordinate absolute value, a preset numerical value processing width and a preset overshoot coefficient of the virtual control lever coordinate system, carrying out normalization processing on the orthogonal projection coordinate, and determining an input key value of the virtual control lever, wherein the preset minimum coordinate absolute value is a coordinate value of the orthogonal projection coordinate.
6. The method of claim 5, wherein the normalizing the orthogonal projection coordinates based on a preset minimum coordinate absolute value, a preset numerical processing width and a preset overshoot coefficient of the virtual joystick coordinate system to determine the input key value of the virtual joystick comprises:
determining an input value based on the orthogonal projection coordinates;
determining a first numerical processing range based on the preset minimum coordinate absolute value, determining a second numerical processing range based on the preset minimum coordinate absolute value and the preset numerical processing width, and determining a third numerical processing range based on the preset minimum coordinate absolute value, the preset numerical processing width and the preset overshoot coefficient, wherein a maximum numerical boundary of the first numerical processing range is smaller than a maximum numerical boundary of the second numerical processing range, a minimum numerical boundary of the first numerical processing range is larger than a minimum numerical boundary of the second numerical processing range, a maximum numerical boundary of the second numerical processing range is smaller than a maximum numerical boundary of the third numerical processing range, and a minimum numerical boundary of the second numerical processing range is larger than a minimum numerical boundary of the third numerical processing range;
if the input value is within the first numerical value processing range, determining that the input key value is 0;
if the input value is outside the first numerical value processing range and within the second numerical value processing range, determining the input key value based on the input value, the preset minimum coordinate absolute value and the preset numerical value processing width;
if the input value is located outside the second numerical value processing range and within the third numerical value processing range, determining that the input key value is one of 1 and-1 based on the magnitude relation between the input value and 0;
and if the input value is positioned outside the third numerical value processing range, determining that the input key value is 0.
7. The method of claim 1, further comprising:
activating a handle mode in response to detecting that the user has performed a preset handle mode gesture action;
determining an input key value of the virtual handle based on a handle mode gesture action in a state of activating the handle mode, wherein the handle mode gesture action comprises at least one of a key action and a gripping action, and the input key value of the virtual handle comprises an input key value of the key action and an input key value of the gripping action.
8. The method of claim 7, wherein the determining an input key value for the virtual handle based on the handle mode gesture action comprises:
determining an input distance value to the virtual handle based on the handle mode gesture action;
and based on the input distance value, a preset critical coefficient, a maximum processing distance threshold value and a minimum processing distance threshold value, carrying out normalization processing on the input distance value, and determining an input key value of the virtual handle.
9. The method of claim 8, further comprising:
acquiring an operation input value of the user on the virtual handle;
if the operation input value is smaller than a minimum boundary threshold of the operation input value, updating the minimum boundary threshold to the operation input value;
and if the operation input value is larger than a maximum boundary threshold of the operation input value, updating the maximum boundary threshold to the operation input value.
10. The method according to claim 8 or 9, wherein the acquiring of the operation input value of the user on the virtual handle comprises:
and when a trigger operation performed by the user through the virtual handle is detected, determining an input value of the trigger operation based on the distance between the fingertip position of a first preset finger and the fingertip position of a second preset finger of the hand performing the trigger operation.
11. The method according to claim 8 or 9, wherein the acquiring of the operation input value of the user on the virtual handle comprises:
when it is detected that the user performs the grasping operation on the virtual handle, determining the sum of the distances from the fingertips of a plurality of preset fingers of the hand performing the grasping operation to the palm center as the input value of the grasping operation.
12. An apparatus for interacting with a head-mounted display device, comprising:
the joystick mode activation module is used for activating a joystick mode when detecting that the user carries out preset joystick mode gesture actions;
a virtual joystick coordinate system determination module for determining an origin position of a virtual joystick coordinate system based on at least one of a hand position of the user and a position of a display screen of the head-mounted display device in a state where the joystick mode is activated, and determining the virtual joystick coordinate system based on the origin position of the virtual joystick coordinate system;
and the input key value determining module is used for determining the input key value of the virtual joystick based on the projection of the target hand of the user on the virtual joystick coordinate system.
13. An electronic device, the electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the method for interacting with a head-mounted display device as claimed in any one of claims 1 to 11.
14. A computer-readable storage medium having stored thereon a computer program for executing the method for interacting with a head-mounted display device of any one of claims 1-11.
15. An interactive system, comprising:
a head mounted display device, and the apparatus for interacting with a head mounted display device of claim 12.
CN202211242490.7A 2022-10-11 2022-10-11 Method and device for interacting with head-mounted display equipment and interaction system Pending CN115657844A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211242490.7A CN115657844A (en) 2022-10-11 2022-10-11 Method and device for interacting with head-mounted display equipment and interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211242490.7A CN115657844A (en) 2022-10-11 2022-10-11 Method and device for interacting with head-mounted display equipment and interaction system

Publications (1)

Publication Number Publication Date
CN115657844A true CN115657844A (en) 2023-01-31

Family

ID=84988101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211242490.7A Pending CN115657844A (en) 2022-10-11 2022-10-11 Method and device for interacting with head-mounted display equipment and interaction system

Country Status (1)

Country Link
CN (1) CN115657844A (en)

Similar Documents

Publication Publication Date Title
EP3629129A1 (en) Method and apparatus of interactive display based on gesture recognition
US9721396B2 (en) Computer and computer system for controlling object manipulation in immersive virtual space
KR101981822B1 (en) Omni-spatial gesture input
KR101890459B1 (en) Method and system for responding to user's selection gesture of object displayed in three dimensions
JP6350772B2 (en) Information processing system, information processing apparatus, control method, and program
KR20140010616A (en) Apparatus and method for processing manipulation of 3d virtual object
KR102170638B1 (en) Method for controlling interaction in virtual reality by tracking fingertips and VR system using it
CN110472396B (en) Somatosensory gesture touch method, system, platform and storage medium
JP6549066B2 (en) Computer program and computer system for controlling object operation in immersive virtual space
CN115657844A (en) Method and device for interacting with head-mounted display equipment and interaction system
Tran et al. Wireless data glove for gesture-based robotic control
CN113467625A (en) Virtual reality control device, helmet and interaction method
CN108803862B (en) Account relation establishing method and device used in virtual reality scene
US20230367403A1 (en) Terminal device, virtual object manipulation method, and virtual object manipulation program
Decle et al. A study of direct versus planned 3d camera manipulation on touch-based mobile phones
CN109254671B (en) Interactive method, device and equipment for controlling object posture in AR/VR application
WO2023176034A1 (en) Control device and control method
US20230085330A1 (en) Touchless image-based input interface
US20230041519A1 (en) Tracking system, tracking method and non-transitory computer-readable storage medium
WO2023181549A1 (en) Control device, control method, and program
Tao et al. Human-Computer Interaction Using Fingertip Based on Kinect
Ahn et al. A VR/AR Interface Design based on Unaligned Hand Position and Gaze Direction
CN116339566A (en) Method and apparatus for displaying application program interface for head-mounted display device
JP2017228216A (en) Information processing apparatus, control method therefor, program, and storage medium
Matei et al. Creating enhanced interfaces for cyber-physical-social systems: the remote drone control experiment.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination