CN106873767B - Operation control method and device for virtual reality application

Info

Publication number: CN106873767B (grant of application CN201611260768.8A; earlier published as CN106873767A)
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: input event, user input event, head, user
Legal status: Active (granted)
Inventors: 杨雷, 李晓鸣
Current and original assignee: SuperD Co Ltd
Application filed by SuperD Co Ltd


Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06T19/006 Mixed reality
    • G06T2200/08 Indexing scheme for image data processing or generation, in general, involving all processing steps from image acquisition to 3D model generation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses an operation control method for a virtual reality application, which comprises the following steps: tracking and acquiring the current head posture parameters of a user, and determining the motion state of the user's head from those parameters; determining the user input event corresponding to that motion state according to a pre-established correspondence between head motion states and user input events; and executing the function corresponding to the determined user input event. By establishing a mapping between head motion states and user input events, the invention lets the user control and interact with the virtual reality application's content while head-pose tracking constructs the virtual reality scene, giving the user stronger immersion and a more comfortable and convenient VR interaction experience.

Description

Operation control method and device for virtual reality application
Technical Field
The embodiment of the invention relates to the field of Virtual Reality (VR), in particular to a method and a device for controlling the operation of virtual reality application.
Background
In current leading-edge research, Virtual Reality (VR) is a technical direction attracting great attention. VR presents the user with fully virtual content: wearing a head-mounted VR device, the user is immersed in a world of fully virtual content and completely isolated from reality. User interaction with that virtual content world is achieved through external interaction devices, such as head tracking sensing modules, handles, and other wearable devices.
For virtual reality devices, existing head mounted VR devices typically include a head mounted display and a VR content generation device.
The head mounted display may be worn on the user's head and provide the user with an immersive field of view of the virtual scene. In addition, the head mounted display contains sensors for head tracking positioning.
The VR content generation device includes a computation module, a storage module, and a head positioning module. The head positioning module obtains data from a head positioning sensor in the head-mounted display in real time, and the head positioning module can obtain the head posture of the current user after the data are processed by a sensor fusion related algorithm.
The head posture parameters generally refer to the rotation angles θ_pitch, θ_yaw and θ_roll about the X, Y and Z axes, modeled with the head as the origin.
The VR content generation device obtains the current head posture from the head positioning module, obtains the materials required for rendering the virtual scene from the storage module, renders, through the computation module, the virtual scene with the current head posture as the viewing angle, and displays it in front of the user's eyes through the head-mounted display. The head-mounted display and the VR content generation device may be integrated into one unit (e.g., a VR all-in-one device) or connected through a display data line such as HDMI (e.g., HTC Vive).
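For illustration only, this render loop might be sketched as follows in Python; the module and method names (head_positioning, storage, renderer, display) are assumptions for the sketch, not names from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Rotation angles (degrees) about X, Y, Z with the head as origin."""
    pitch: float  # θ_pitch, rotation about the X axis
    yaw: float    # θ_yaw, rotation about the Y axis
    roll: float   # θ_roll, rotation about the Z axis

def render_frame(head_positioning, storage, renderer, display) -> None:
    """One frame of the VR content generation loop described above (hypothetical API)."""
    pose: HeadPose = head_positioning.current_pose()  # fused sensor data
    assets = storage.load_scene_assets()              # materials needed for rendering
    image = renderer.render(assets, view_angle=pose)  # scene from the user's viewpoint
    display.show(image)                               # output to the head-mounted display
```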
In the operating environment of existing head-mounted VR devices, interaction between the user and VR virtual content, in particular VR games, is typically accomplished through an external device such as a Bluetooth handle. When interacting through an external device, the user's head is covered by the head-mounted display and the real environment cannot be seen, so it is difficult for the user to complete complex button input on the external controller; existing external controllers are inconvenient to operate in a VR environment and, because the line of sight is blocked, not accurate enough. Meanwhile, using an external controller for complex input undermines the immersion the VR device brings, and the user experience is poor.
In addition, a traditional external device layout, such as a handle, needs four keys (up, down, left and right) for controlling the game direction, and these four keys occupy a large layout space on the external device.
Accordingly, the prior art is yet to be improved and developed.
Disclosure of Invention
Embodiments of the invention provide a method and a virtual reality device that establish a mapping between head motion states and user input events, so that the user can control and interact with the virtual reality application while head-pose tracking constructs the virtual reality scene, giving stronger immersion and a more comfortable and convenient VR interaction experience.
In order to solve the above technical problem, in a first aspect, an embodiment of the present invention adopts a technical solution that:
the running control method of the virtual reality application comprises the following steps:
tracking and acquiring a current head posture parameter of a user, and determining a motion state of the head of the user according to the current head posture parameter;
determining a user input event corresponding to the motion state of the head of the user according to a pre-established corresponding relation between the head motion state and the user input event;
and executing a function corresponding to the user input event according to the user input event corresponding to the motion state of the head of the user.
In order to supplement head-motion interaction, before the user input event corresponding to the motion state of the user's head is determined, the method further comprises:
reading an input event of an external device of a user;
the determining the user input event corresponding to the motion state of the head of the user according to the pre-established corresponding relationship between the head motion state and the user input event comprises the following steps:
and determining the motion state of the user's head and the user input event corresponding to the input event of the user's external device, according to a pre-established correspondence among head motion states, external device input events, and user input events.
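Purely as an illustrative sketch (the event names and table entries below are assumptions patterned on the correspondence defined in claim 1, not code from this disclosure), such a combined lookup might be implemented as:

```python
# Head-motion-only mapping, used when no external device event is present.
HEAD_ONLY_TO_INPUT = {
    "up": "up_event", "down": "down_event",
    "left": "left_event", "right": "right_event",
}

# (head motion state, external device input event) -> user input event.
# The first (forward) key keeps the head direction; the second (backward) key reverses it.
HEAD_AND_KEY_TO_INPUT = {
    ("up", "first_input"): "up_event",
    ("up", "second_input"): "down_event",
    ("left", "first_input"): "left_event",
    ("left", "second_input"): "right_event",
}

def resolve_user_input(head_state, device_event=None):
    """Determine the user input event from the head state and an optional key event."""
    if device_event is None:
        return HEAD_ONLY_TO_INPUT.get(head_state)
    return HEAD_AND_KEY_TO_INPUT.get((head_state, device_event))
```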
As a specific embodiment, the user input event comprises a direction control event, which is used for controlling the movement direction of a virtual object in the virtual reality scene displayed by the virtual reality application; executing, according to the user input event corresponding to the motion state of the user's head, the function corresponding to the user input event comprises:
controlling, according to the direction control event, the virtual object in the virtual scene to move in the movement direction corresponding to the direction control event.
The directional control event includes at least one of:
an up event, a down event, a left event, a right event, a forward event, a backward event, a left down event, a right down event, a left up event, a right up event.
The motion state of the user's head includes a head rotation direction and/or a head rotation angle.
As to its source, in another embodiment the virtual reality application is converted from a native 3D application;
the user input event is a perspective control event;
the controlling the virtual reality application to execute, according to the user input event corresponding to the motion state of the user's head, the function corresponding to the user input event comprises:
transforming the original observation matrix or original projection matrix of the virtual scene according to the current head posture parameters, and constructing and displaying the virtual scene image using the transformed observation matrix or projection matrix, thereby transforming the observation angle of the virtual scene so that it is synchronized with the user's observation angle after the head moves.
To solve the above technical problem, according to a second aspect, another technical solution adopted by an embodiment of the present invention is as follows: an operation control device for a virtual reality application is provided, comprising:
the head tracking sensing unit is used for tracking and acquiring the current head posture parameter of the user and determining the motion state of the head of the user according to the current head posture parameter;
the event confirmation unit is used for determining a user input event corresponding to the motion state of the head of the user according to the pre-established corresponding relation between the head motion state and the user input event;
and the control unit is used for executing the function corresponding to the user input event by the virtual reality application according to the user input event corresponding to the motion state of the head of the user.
Preferably, the device further comprises an external interaction device configured to read an external device input event before the user input event corresponding to the motion state of the user's head is determined; in that case the event confirmation unit determines the motion state of the user's head and the user input event corresponding to the external device input event according to a pre-established correspondence among head motion states, external device input events, and user input events.
In a specific implementation, the user input event includes a direction control event used to control the movement direction of a virtual object in the virtual reality scene displayed by the virtual reality application; the control unit is specifically configured to, according to the direction control event, make the virtual reality application control the virtual object in the virtual scene to move in the corresponding movement direction.
The directional control event includes at least one of:
an up event, a down event, a left event, a right event, a forward event, a backward event, a left down event, a right down event, a left up event, a right up event.
The motion state of the user's head includes a head rotation direction and/or a head rotation angle.
The virtual reality application is converted from a native 3D application;
the user input event is a perspective control event;
the control unit is used for:
and transforming the original observation matrix or original projection matrix of the virtual scene according to the current head posture parameters, and constructing and displaying the virtual scene image using the transformed observation matrix or projection matrix, thereby transforming the observation angle of the virtual scene so that it is synchronized with the user's observation angle after the head moves.
In order to solve the above technical problem, in a third aspect, another technical solution adopted by the embodiments of the present invention is: there is provided a virtual reality device comprising:
a display and at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method described above.
In order to solve the above technical problem, in a fourth aspect, another technical solution adopted by the embodiment of the present invention is: a non-transitory computer storage medium is provided that stores computer-executable instructions for causing the computer to perform the above-described method.
The beneficial effects of the embodiments of the invention are as follows: by establishing a mapping between the motion state of the head and user input events, part of the user input events can be expressed through head motion. That is, in a virtual reality application, the head posture can be used both to track the virtual reality scene corresponding to the user's viewing angle and to simulate an external device for human-computer interaction with the application. The user can therefore control and interact with the virtual reality application while head-pose tracking constructs the virtual reality scene, without depending entirely on an external device for human-computer interaction, so the user's immersion is stronger and the VR interaction experience is more comfortable and convenient. Moreover, since the head posture can simulate the external device, the external hardware configuration of the virtual reality device can be effectively simplified.
Drawings
FIG. 1 is a hardware block diagram of a virtual reality device of an embodiment of the invention;
FIG. 2 is a software block diagram of a virtual reality device of an embodiment of the invention;
fig. 3 is a main flowchart of an operation control method of a virtual reality application according to an embodiment of the present invention;
fig. 4 is a specific flowchart of an operation control method of a virtual reality application according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the head performing direction control in conjunction with keys in the operation control method of a virtual reality application according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a hardware structure of an electronic device of an operation control method of a virtual reality application according to an embodiment of the present invention;
FIG. 7 is a mapping table between the minimum effective angle and the virtual object's upward direction according to an embodiment of the present invention;
FIG. 8 is a mapping table between the minimum effective angle and the virtual object's downward direction according to an embodiment of the present invention;
FIG. 9 is a mapping table between the minimum effective angle and the virtual object's leftward direction according to an embodiment of the present invention;
FIG. 10 is a mapping table between the minimum effective angle and the virtual object's rightward direction according to an embodiment of the present invention;
FIG. 11 is a mapping table between the minimum effective angle and the virtual object's upper-left direction according to an embodiment of the present invention;
FIG. 12 is a mapping table between the minimum effective angle and the virtual object's upper-right direction according to an embodiment of the present invention;
FIG. 13 is a mapping table between the minimum effective angle and the virtual object's lower-left direction according to an embodiment of the present invention; and
FIG. 14 is a mapping table between the minimum effective angle and the virtual object's lower-right direction according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention are further described in detail below with reference to the accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
In the embodiments of the invention, the user's head movement is made to correspond to a user input event, achieving the aim of controlling the virtual reality device or the virtual reality application with the user's head movement. In this way, the user can simulate the user input of an external device with head motion to carry out human-computer interaction with the virtual reality device and application; that is, the user need not rely entirely on an external device for human-computer interaction, which gives the user stronger immersion and a more comfortable and convenient VR interaction experience. Moreover, since the head posture can simulate the external device, the external hardware configuration of the virtual reality device can be effectively simplified.
In the embodiments of the present invention, the user input event is used to control the virtual reality device or the virtual reality application, and the control relationship between a user input event and the device or application is inherent and predefined. For example, when the user clicks a certain key, the click corresponds to a user input event, and the preset function corresponding to that event is executed, for instance returning to the previous menu. Likewise, in an embodiment of the present invention, when the user nods, the current head posture parameters are obtained by tracking, the "nod" motion is determined to correspond to the same input event as clicking that key, and the corresponding function is executed to return to the previous menu.
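As an illustration only (the event and function names are hypothetical, not from this disclosure), the predefined event-to-function relationship can be pictured as a dispatch table:

```python
def back_to_previous_menu() -> None:
    print("returning to the previous menu")

# Both the key click and the "nod" detected from head posture map to the
# same predefined user input event, and hence to the same function.
EVENT_HANDLERS = {
    "key_click": back_to_previous_menu,
    "head_nod": back_to_previous_menu,
}

def execute(user_input_event: str) -> None:
    handler = EVENT_HANDLERS.get(user_input_event)
    if handler is not None:
        handler()
```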
The following provides a detailed description of examples of the present invention.
Referring to fig. 1, from the aspect of hardware, the virtual reality device of the present embodiment includes a calculation control unit, a head-mounted display, a storage module, an external device, a sensor, a head tracking and positioning sensor, and a head positioning module. In the embodiment of the application, the external device may be a gamepad for user interaction.
The head tracking sensing unit takes the data acquired by the head tracking positioning sensor and other sensors as the head posture parameters, and determines the motion state of the user's head from the current head posture parameters. The motion state of the user's head may refer to whether the head moves, the direction of motion, the amplitude of motion, and so on; expressed in parameters, the motion state of the user's head includes a head rotation direction and/or a head rotation angle.
The external device is used for generating an external device input event, such as a key click event, before the user input event corresponding to the motion state of the user's head is determined. The external device input event can control the virtual reality device on its own, i.e. control a virtual reality application installed in the virtual reality device, or can control the virtual reality application in combination with the head motion state; that is, the external device input event and the head motion state jointly determine the final user input event.
Referring to fig. 2, from the aspect of software, the virtual reality device of this embodiment includes a plurality of virtual reality applications. A virtual reality application includes a head tracking sensing unit, an event confirmation unit, and a control unit. The control unit may include a rendering module and an image display module for presenting the virtual reality scene to the user; the rendering module may include an image rendering module and a sound rendering module, and a display module may further be included for display, which optionally performs three-dimensional display.
The head tracking sensing unit tracks and acquires current head posture parameters of a user, and determines the motion state of the head of the user according to the current head posture parameters;
the event confirmation unit determines a user input event corresponding to the motion state of the head of the user according to a pre-established corresponding relation between the head motion state and the user input event;
the control unit executes a function corresponding to the user input event according to the user input event corresponding to the motion state of the head of the user.
The event confirmation unit can also determine the motion state of the head of the user and the user input event corresponding to the external equipment input event of the user according to the pre-established corresponding relationship between the head motion state and the external equipment input event and the user input event.
In the following embodiments, the virtual reality application is, for example, a game program.
A user input event corresponding to the motion state of the user's head is determined from that motion state; the user input event can be used for game control, and the virtual reality application executes the function corresponding to the user input event, i.e., performs the game control.
For example, in this embodiment the user input event is a direction control event, used for controlling the movement direction of a virtual object in the virtual reality scene displayed by the virtual reality application. According to the direction control event, the virtual reality application controls the virtual object in the virtual scene to move in the corresponding movement direction. For instance, the game scene includes a controllable virtual character, and the user can control the character's movement direction through head movement, or control the character's behavior, such as shooting, walking, jumping, punching, and kicking.
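A minimal sketch of such direction handling follows, assuming a hypothetical virtual_object.move API that is not defined in this disclosure:

```python
# Unit direction vectors for the direction control events; the screen-plane
# convention used here is an assumption for illustration.
DIRECTION_VECTORS = {
    "up_event": (0, 1), "down_event": (0, -1),
    "left_event": (-1, 0), "right_event": (1, 0),
}

def apply_direction_event(virtual_object, event: str, speed: float = 1.0) -> None:
    """Move the controllable virtual object in the direction the event encodes."""
    dx, dy = DIRECTION_VECTORS[event]
    virtual_object.move(dx * speed, dy * speed)
```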
In other embodiments, the virtual reality application is converted from a native 3D application. A native 3D application refers to a 3D application that, after development, has not been modified by anyone other than its developer. A native 3D application can be converted into a virtual reality application, but it has no view-tracking function; that is, it cannot change the displayed content according to the user's viewpoint.
To this end, in order for the virtual reality application to have a tracking perspective function, in this embodiment, the user input event is a perspective control event.
The rendering module transforms the original observation matrix or original projection matrix of the virtual scene according to the current head posture parameters and constructs the virtual scene image using the transformed matrix; the display module displays that image, so that the observation angle of the virtual scene is transformed and kept synchronized with the user's observation angle after head movement.
As is well known to those skilled in the art, the projection matrix and the viewing matrix are transformation parameters used when performing coordinate transformation in the rendering process.
In this embodiment, the head motion state is converted into a view-angle control event, the observation matrix or projection matrix is adjusted using the head posture parameters, the virtual scene constructed with the new projection or observation matrix is displayed, and the observation angle in the virtual scene is synchronized with the user's observation angle after the head moves.
The invention does not limit how the projection matrix or observation matrix is modified using the head motion state; this can be reasonably configured by those skilled in the art.
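As one such configuration, offered only as a minimal sketch (the column-vector convention and the Z-Y-X rotation order are assumptions of this example, not requirements of the invention), the head angles can be composed into a rotation applied to the scene's original observation matrix:

```python
import numpy as np

def rotation_from_pose(pitch_deg: float, yaw_deg: float, roll_deg: float) -> np.ndarray:
    """Compose rotations about X (pitch), Y (yaw) and Z (roll) into one 3x3 matrix."""
    p, y, r = np.radians([pitch_deg, yaw_deg, roll_deg])
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])
    return rz @ ry @ rx

def transformed_view_matrix(original_view: np.ndarray,
                            pitch: float, yaw: float, roll: float) -> np.ndarray:
    """Apply the head rotation to the scene's original 4x4 observation matrix."""
    head = np.eye(4)
    head[:3, :3] = rotation_from_pose(pitch, yaw, roll)
    return head @ original_view
```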
In order to give a native monocular 3D game (usually a first-person game) good immersion in the running environment of the virtual reality device, the native game is modified so that the game viewing angle is positioned and adjusted according to the head posture parameters. In a native 3D game, the change of viewing angle is generally controlled by interaction devices such as a handle, keyboard, mouse, or touch screen. To let the user control the view transformation through head posture parameters, the head-pose positioning data must be converted into an interaction mode supported by the game, so that head-pose positioning controls the game viewing angle. In this case, the user input event is a viewing angle control event.
On the basis of a developed native 3D game, mapping the head positioning onto the native game means mapping the head posture parameters to the operation functions of the mouse, keyboard and touch screen, so that head positioning simulates those operations. For example, where the game originally performed view transformation through a mouse or similar device, the scene now transforms its viewing angle with the motion of the head under the control of head positioning, which gives stronger immersion. The user does not need peripherals for complex operation; operation is convenient and immersion increases.
In the present embodiment, the head tracking sensing unit can acquire the six-dimensional head posture parameters generated by the head tracking positioning sensor; that is, the sensor is not limited to the three dimensions of head inclination angle θ_pitch, yaw angle θ_yaw and roll angle θ_roll, and the other three dimensions represent the spatial position of the user's head, which is not related to the present solution and is not detailed here. The inclination angle θ_pitch, yaw angle θ_yaw and roll angle θ_roll used in this application are modeled on X, Y and Z axes with the head of the user wearing the virtual reality helmet as the origin (see FIG. 5, 510); these three head posture parameters are the rotation angles θ_pitch, θ_yaw and θ_roll about the X, Y and Z axes respectively.
The head tracking sensing unit acquires the head posture parameters, and based on them the control unit can construct the specific virtual reality scene on the image display screen for the user. After the user opens the game content, the head tracking sensing unit starts working to acquire the required head posture parameters: inclination angle θ_pitch, yaw angle θ_yaw and roll angle θ_roll. The game content includes at least one game virtual object.
The event confirming unit stores a mapping relationship set of the head motion state and the virtual object direction, and the specific mapping relationship is described later.
In a traditional native game, the change of the game view angle is generally controlled by interaction devices such as a handle, a keyboard, a mouse, a touch screen and the like. According to the technical scheme, the user can control the visual angle transformation and interaction of the game through the head gesture when playing the traditional native game by utilizing the virtual reality equipment. In order to realize the function, the head posture parameters of the user are converted into interaction modes supported by the game according to the mapping relation set so as to control and interact the head posture to the visual angle direction of the game.
On the basis of the virtual game, the head posture parameters are mapped to the direction control of the original game, i.e. to the traditional mouse, keyboard and touch-screen operations, so that the head posture simulates those interactive operations. For example, where the game performed view transformation through mouse or similar device operation, in this embodiment the viewing angle of the specific game scene transforms with the change of head posture, so the VR immersion is stronger. The user does not need peripherals to complete complex operations, which is convenient and provides the greatest comfort.
The event confirmation unit can determine, from the mapping relation set and the user's current head posture parameters, the user's motion state, i.e. the movement direction of the game's virtual object in the virtual reality scene.
After the movement direction is determined, the virtual object is controlled to move in that direction; that is, the rendering module renders the virtual object and the virtual reality scene corresponding to its movement direction, according to the movement direction of the in-game virtual object determined by the event confirmation unit.
To enhance immersion and provide a better VR experience, the rendering module further includes a sound rendering module that provides the rendering sound of the corresponding virtual object according to the direction of motion determined from the head pose. The specific rendering depends on the content, such as the howling sound of a plane diving downward or the booming engine sound of a plane taking off.
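As a small hedged sketch (the sound file names and the audio_engine API are invented for illustration), the sound rendering module's choice might look like:

```python
# Hypothetical mapping from the virtual object's motion direction to a sound cue.
SOUND_BY_DIRECTION = {
    "down_event": "dive_howl.wav",   # e.g. a plane diving downward
    "up_event": "engine_boom.wav",   # e.g. a plane climbing or taking off
}

def render_sound(audio_engine, event: str) -> None:
    clip = SOUND_BY_DIRECTION.get(event)
    if clip is not None:
        audio_engine.play(clip)
```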
Therefore, the head tracking sensing unit can provide scene information for the system and can realize the control and interaction of game contents.
Referring to fig. 5, the user may complete game operation and interaction through the head-mounted device alone; this embodiment may also provide a simple auxiliary peripheral to relieve the discomfort of head operation when the content is intense or the head would have to sway frequently. That is to say, the virtual reality device of this application lets the user play the virtual game through head posture alone, or play with head-posture interaction as the primary mode and keys as the auxiliary mode, enhancing immersion while relieving the discomfort of head swaying as much as possible.
The auxiliary key of this embodiment is a binary key used to reverse the moving direction of the virtual object. One implementation of the binary key is a conventional pair of front and back keys (fig. 5, 520). A handle with front and back keys is simple in design and makes it easy for the user to complete direction-switching operations, whereas on a traditional four-direction gamepad the up, down, left and right keys occupy a large layout space; the front-and-back-key handle yields a compact interface layout.
According to the embodiment of the application, game control and interaction can be achieved with the head posture alone, and combining it with the front and back direction-switching keys can realize the 8-direction control of the four keys in existing games, so that the user can use the virtual reality device comfortably and immersively. The game direction control and interaction mode of this embodiment reduces the number of deliberate interactions and is simple and convenient to use, avoiding the operation of numerous keys while wearing a virtual reality helmet; on the other hand, the auxiliary peripheral is provided with only 2 direction keys instead of four (up, down, left and right), which reduces the space occupied by the input device and enhances the user's interaction experience.
As one embodiment of the set of mapping relationships between head posture parameters and virtual object directions, the head posture parameters are the X-axis inclination angle θ_pitch, the Y-axis yaw angle θ_yaw and the Z-axis roll angle θ_roll, centered on the head. The mapping set can express eight movement directions of the virtual object: up, down, left, right, upper left, upper right, lower left, and lower right. A conventional game input device is provided with four direction keys, which can express the same eight directions. In this embodiment, head gestures are used to achieve game interaction, assisted by a forward trigger key and a backward trigger key, completing the same function definition.
According to the embodiment of the application, the event confirmation unit acquires the head posture parameters from the head tracking sensing unit, i.e. the rotation angles θ_pitch, θ_yaw and θ_roll about the X, Y and Z axes. For clarity of illustration, the following example uses only the X-axis inclination angle θ_pitch and the Y-axis yaw angle θ_yaw to determine the current direction, assisted by the forward and backward trigger keys, to express the 8 directions of a traditional gamepad.
The set of mapping relationships between the head pose parameters and the virtual object directions is as follows:
As shown in FIG. 7, a minimum effective angle θ_min is defined; when 90° > ||θ_up|| > θ_min is satisfied, the direction is upward. When the handle is used, clicking the forward trigger key indicates the upward direction, and clicking the backward trigger key indicates the downward direction.
As shown in FIG. 8, when 90° > ||θ_down|| > θ_min is satisfied, the direction is downward. When the handle is used, clicking the forward trigger key indicates the downward direction, and clicking the backward trigger key indicates the upward direction.
As shown in FIG. 9, when 90° > ||θ_left|| > θ_min is satisfied, the direction is to the left. When the handle is used, clicking the forward trigger key indicates the leftward direction, and clicking the backward trigger key indicates the rightward direction.
As shown in FIG. 10, when 90° > ||θ_right|| > θ_min is satisfied, the direction is to the right. When the handle is used, clicking the forward trigger key indicates the rightward direction, and clicking the backward trigger key indicates the leftward direction.
As shown in FIG. 11, when 90° > ||θ_up|| > θ_min and 90° > ||θ_left|| > θ_min are satisfied, the direction is up and to the left. When the handle is used, clicking the forward trigger key indicates the upper-left direction, and clicking the backward trigger key indicates the lower-right direction.
As shown in FIG. 12, when 90° > ||θ_up|| > θ_min and 90° > ||θ_right|| > θ_min are satisfied, the direction is up and to the right. When the handle is used, clicking the forward trigger key indicates the upper-right direction, and clicking the backward trigger key indicates the lower-left direction.
As shown in FIG. 13, when 90° > ||θ_down|| > θ_min and 90° > ||θ_left|| > θ_min are satisfied, the direction is down and to the left. When the handle is used, clicking the forward trigger key indicates the lower-left direction, and clicking the backward trigger key indicates the upper-right direction.
As shown in FIG. 14, when 90° > ||θ_down|| > θ_min and 90° > ||θ_right|| > θ_min are satisfied, the direction is down and to the right. When the handle is used, clicking the forward trigger key indicates the lower-right direction, and clicking the backward trigger key indicates the upper-left direction.
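The whole mapping set above, plus the binary-key reversal, can be summarized in the following sketch; the function names, the sign conventions (positive θ_pitch as up, positive θ_yaw as left) and the sample value of θ_min are assumptions for illustration only:

```python
THETA_MIN = 15.0  # minimum effective angle in degrees; a hypothetical value

def head_direction(pitch: float, yaw: float, theta_min: float = THETA_MIN):
    """Classify the head pose into one of 8 directions from θ_pitch and θ_yaw alone."""
    vertical = None
    horizontal = None
    if 90.0 > abs(pitch) > theta_min:
        vertical = "up" if pitch > 0 else "down"
    if 90.0 > abs(yaw) > theta_min:
        horizontal = "left" if yaw > 0 else "right"
    if vertical and horizontal:
        return vertical + "-" + horizontal  # e.g. "up-left"
    return vertical or horizontal           # may be None: no effective direction

# Clicking the forward trigger key keeps the head direction;
# clicking the backward trigger key reverses it, per FIGS. 7 to 14.
OPPOSITE = {
    "up": "down", "down": "up", "left": "right", "right": "left",
    "up-left": "down-right", "up-right": "down-left",
    "down-left": "up-right", "down-right": "up-left",
}

def user_input_direction(pitch: float, yaw: float, key: str):
    d = head_direction(pitch, yaw)
    if d is None:
        return None
    return d if key == "forward" else OPPOSITE[d]
```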
For example, suppose a user plays an airplane game with the head-mounted virtual reality device and the flight direction of the airplane must be controlled through the device; the user can then control the airplane through the acquired head posture parameters. When the user does not want to shake the head frequently, the forward and backward trigger keys of the auxiliary peripheral handle can be combined with head motion to complete the interaction and control of the game. Clicking the forward trigger key executes the forward function and makes the airplane fly forward; clicking the backward trigger key executes the backward function and makes the airplane fly backward; when the user's head is detected rotating left, the left-forward function is executed and the airplane flies to the left; when the user's head is detected rotating right, the right-forward function is executed and the airplane flies to the right. Flight in the upper-left, upper-right, lower-left and lower-right directions can be controlled by pressing a key while moving the head.
Referring to fig. 3, an embodiment of the present application further relates to a method for controlling operation of a virtual reality application, which mainly includes:
step 302: tracking and acquiring a current head posture parameter of a user, and determining a motion state of the head of the user according to the current head posture parameter;
step 304: determining a user input event corresponding to the motion state of the head of the user according to a pre-established corresponding relation between the head motion state and the user input event;
step 306: and executing a function corresponding to the user input event by the virtual reality application according to the user input event corresponding to the motion state of the head of the user.
In order to support interactive input from an external device when needed, the method reads the user's external device input event, and determines the motion state of the user's head and the user input event corresponding to that external device input event according to a pre-established correspondence among head motion states, external device input events, and user input events.
In this embodiment, the user input event includes a direction control event, and the direction control event is used to control a movement direction of a virtual object in a virtual reality scene displayed by the virtual reality application. And according to the direction control event, the virtual reality application controls the virtual object in the virtual scene to move according to the movement direction corresponding to the direction control event.
The directional control events include at least one of:
an up event, a down event, a left event, a right event, a forward event, a backward event, a left down event, a right down event, a left up event, a right up event.
The motion state of the user's head includes a head rotation direction and/or a head rotation angle.
In the embodiment in which the user input event is a direction control event, the specific steps are as follows:
Track and acquire the current head posture parameters of the user, acquiring only the X-axis inclination angle θ_pitch, the Y-axis yaw angle θ_yaw and the Z-axis roll angle θ_roll, independent of the spatial position parameters.
The control unit determines a virtual reality scene displayed on the three-dimensional image display screen according to the head posture parameters.
And establishing a mapping relation set of the head posture parameters and the virtual object direction.
The direction of movement of the virtual object in the virtual reality scene is determined from the current head pose parameters of the user according to the set of mapping relationships.
Render, as stereo images, the virtual reality scene and the virtual object corresponding to the motion direction of the virtual object. In this way the user can construct the virtual reality scene through head-pose tracking while also controlling and interacting with the game content, gaining stronger immersion and a more comfortable and convenient VR interaction experience.
Referring to fig. 4, the method further includes determining whether to use an input event of the external device. In this embodiment, the external device is a binary key.
Step 407: determine whether there is an input event of the external device, i.e. the binary key.
Step 408: if the user has used the binary key, change the movement direction of the virtual object to the opposite direction.
If the user has not used the binary key, proceed to step 409.
Step 409: perform stereoscopic image rendering according to the movement direction of the virtual object determined from the head posture parameters, or according to the opposite movement direction set by the handle; the rendering targets the virtual reality scene corresponding to the movement direction and the virtual object itself.
Step 410: provide the corresponding rendering sound according to the motion direction of the virtual object.
In this embodiment, the directions of the virtual object that can be expressed by the mapping relation set of the head posture parameter and the virtual object direction include up, down, left, right, up left, up right, down left, down right.
In the above technical scheme: 1. By establishing the mapping between head posture and the direction of the virtual object in the game content, the user can construct the virtual reality scene through head-pose tracking while also controlling and interacting with the game content, gaining stronger immersion and a more comfortable and convenient VR interaction experience. 2. The invention provides binary keys which, combined with the head-gesture game interaction mode, can realize the 8-direction control of the traditional four-key (up, down, left, right) layout, so that the user can use the virtual reality device comfortably and immersively; the binary keys enhance the user's interaction experience and reduce the space occupied by the external handle. 3. The game direction control and interaction mode simplifies the number of deliberate user interactions, avoids operating numerous keys while wearing a virtual reality helmet, and is simple and convenient to use.
Fig. 6 is a schematic hardware structure diagram of an electronic device 600 of an operation control method of a virtual reality application provided in an embodiment of the present application, and as shown in fig. 6, the electronic device 600 includes:
a plurality of processors 610, 620, a memory 630, an external device 640, and a display device 650, wherein the plurality of processors in fig. 6 are a CPU and a GPU, respectively, for example.
The processors 610, 620 and the memory 630 may be connected by a bus or other means, such as by a bus in FIG. 6.
The memory 630, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules (e.g., the event confirmation unit, the head tracking sensing unit, and the rendering module shown in fig. 2) corresponding to the operation control method of the virtual reality application in the embodiment of the present application. The processors 610 and 620 execute various functional applications and data processing of the terminal or the server by running the nonvolatile software programs, instructions and modules stored in the memory 630, that is, implement the running control method of the virtual reality application in the above method embodiment.
The memory 630 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the virtual reality device usage, and the like. Further, the memory 630 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory 630 optionally includes memory located remotely from the processors 610, 620, which may be connected to the virtual reality device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 630 and, when executed by the one or more processors 610, 620, perform the operation control method of the virtual reality application in any of the above method embodiments, for example performing method steps 302 to 306 in fig. 3 and method steps 407 to 410 in fig. 4 described above, and implementing the functions of the event confirmation unit, the head tracking sensing unit, and the rendering module in fig. 2.
The product can execute the method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
The electronic device of the embodiments of the present application exists in various forms, including but not limited to:
(1) mobile communication devices, which are characterized by mobile communication capabilities and are primarily targeted at providing voice and data communications. Such terminals include smart phones (e.g., iphones), multimedia phones, functional phones, and low-end phones, among others.
(2) The ultra-mobile personal computer equipment belongs to the category of personal computers, has calculation and processing functions and generally has the characteristic of mobile internet access. Such terminals include PDA, MID, and UMPC devices, such as ipads.
(3) Portable entertainment devices such devices may display and play multimedia content. Such devices include audio and video players (e.g., ipods), handheld game consoles, electronic books, as well as smart toys and portable car navigation devices.
(4) The server is similar to a general computer architecture, but has higher requirements on processing capability, stability, reliability, safety, expandability, manageability and the like because of the need of providing highly reliable services.
(5) And other electronic devices with data interaction functions.
Embodiments of the present application provide a non-transitory computer-readable storage medium storing computer-executable instructions which, when executed by one or more processors (such as the processors 610 and 620 in fig. 6), enable the one or more processors to perform the operation control method of the virtual reality application in any of the above method embodiments, for example performing method steps 302 to 306 in fig. 3 and method steps 407 to 410 in fig. 4 described above, to implement the functions of the event confirmation unit, the head tracking sensing unit, and the rendering module in fig. 2.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; within the context of the present application, where technical features in the above embodiments or in different embodiments can also be combined, the steps can be implemented in any order and there are many other variations of the different aspects of the present application as described above, which are not provided in detail for the sake of brevity; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. An operation control method of a virtual reality application is characterized by comprising the following steps:
tracking and acquiring current head posture parameters of a user, and determining the motion state of the head of the user according to the current head posture parameters;
determining a user input event, in particular,
when the user input event comprises a visual angle control event and a direction control event, determining the user input event corresponding to the motion state of the head of the user according to the pre-established corresponding relation between the head motion state and the user input event,
when the user input event comprises a direction control event, reading an external device input event of a user, and determining the motion state of the head of the user and the user input event corresponding to the external device input event of the user according to a pre-established corresponding relation between the head motion state, the external device input event and the user input event, wherein the external device comprises a binary key, the external device input event comprises a first input event and a second input event, at the moment,
the pre-established corresponding relation among the head motion state, the external equipment input event and the user input event comprises the following steps:
when the head motion state is upward: if the external device input event is the first input event, the user input event is an upward event; if the external device input event is the second input event, the user input event is a downward event;
when the head motion state is downward: if the external device input event is the first input event, the user input event is a downward event; if the external device input event is the second input event, the user input event is an upward event;
when the head motion state is leftward: if the external device input event is the first input event, the user input event is a leftward event; if the external device input event is the second input event, the user input event is a rightward event;
when the head motion state is rightward: if the external device input event is the first input event, the user input event is a rightward event; if the external device input event is the second input event, the user input event is a leftward event;
when the head motion state is upward-left: if the external device input event is the first input event, the user input event is an upward-left event; if the external device input event is the second input event, the user input event is a downward-right event;
when the head motion state is upward-right: if the external device input event is the first input event, the user input event is an upward-right event; if the external device input event is the second input event, the user input event is a downward-left event;
when the head motion state is downward-left: if the external device input event is the first input event, the user input event is a downward-left event; if the external device input event is the second input event, the user input event is an upward-right event;
when the head motion state is downward-right: if the external device input event is the first input event, the user input event is a downward-right event; if the external device input event is the second input event, the user input event is an upward-left event;
and executing the function corresponding to the determined user input event.
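For illustration only, and not part of the claimed method: a minimal Python sketch of the correspondence enumerated in claim 1, under the reading that the first input event of the binary key confirms the direction indicated by the head, while the second input event selects the opposite direction. All identifiers (and the short direction strings) are invented for this sketch.

```python
# Illustrative sketch of claim 1's correspondence table. The second input
# event of the binary key always inverts the head direction; the first
# input event passes it through unchanged.

# Each of the eight directions paired with its opposite.
OPPOSITE = {
    "up": "down", "down": "up",
    "left": "right", "right": "left",
    "up-left": "down-right", "down-right": "up-left",
    "up-right": "down-left", "down-left": "up-right",
}

FIRST_INPUT_EVENT = 1   # hypothetical encoding of the binary key's states
SECOND_INPUT_EVENT = 2

def resolve_user_input_event(head_motion_state: str, device_event: int) -> str:
    """Return the user input event for a head state and a key event."""
    if head_motion_state not in OPPOSITE:
        raise ValueError(f"unknown head motion state: {head_motion_state!r}")
    if device_event == FIRST_INPUT_EVENT:
        return head_motion_state
    if device_event == SECOND_INPUT_EVENT:
        return OPPOSITE[head_motion_state]
    raise ValueError(f"unknown device event: {device_event!r}")

# Example from the claim: head upward-left + second input -> downward-right.
assert resolve_user_input_event("up-left", SECOND_INPUT_EVENT) == "down-right"
```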
2. The method of claim 1, wherein the direction control event is used to control a direction of motion of a virtual object in a virtual reality scene displayed by the virtual reality application;
the executing the function corresponding to the determined user input event comprises:
and controlling, according to the direction control event, the virtual object in the virtual scene to move in the motion direction corresponding to the direction control event.
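Purely illustrative, not the claimed implementation: a small sketch of how a resolved direction event could drive the virtual object's movement described in claim 2; the unit vectors and step size are assumptions.

```python
import math

# Hypothetical 2D direction vectors for the eight events; diagonals are
# normalized so every step covers the same distance.
_D = 1.0 / math.sqrt(2.0)
DIRECTION_VECTORS = {
    "up": (0.0, 1.0), "down": (0.0, -1.0),
    "left": (-1.0, 0.0), "right": (1.0, 0.0),
    "up-left": (-_D, _D), "up-right": (_D, _D),
    "down-left": (-_D, -_D), "down-right": (_D, -_D),
}

def move_virtual_object(position, direction_event, step=0.1):
    """Shift a 2D position one step in the direction the event names."""
    dx, dy = DIRECTION_VECTORS[direction_event]
    return (position[0] + dx * step, position[1] + dy * step)

# e.g. move_virtual_object((0.0, 0.0), "up-left") -> (-0.0707..., 0.0707...)
```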
3. The method according to claim 1, wherein the motion state of the user's head comprises a head rotation direction and/or a head rotation angle.
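One plausible (assumed, not claimed) way to derive claim 3's head rotation direction from yaw/pitch changes, with an invented dead-zone threshold; the axis sign conventions are also assumptions.

```python
from typing import Optional

def classify_head_motion(delta_yaw_deg: float, delta_pitch_deg: float,
                         threshold_deg: float = 5.0) -> Optional[str]:
    """Map yaw/pitch changes to one of eight coarse directions.

    Positive pitch is read as looking up and positive yaw as turning
    right; changes inside the dead zone are ignored and yield None.
    """
    if delta_pitch_deg > threshold_deg:
        vertical = "up"
    elif delta_pitch_deg < -threshold_deg:
        vertical = "down"
    else:
        vertical = ""
    if delta_yaw_deg > threshold_deg:
        horizontal = "right"
    elif delta_yaw_deg < -threshold_deg:
        horizontal = "left"
    else:
        horizontal = ""
    if vertical and horizontal:
        return f"{vertical}-{horizontal}"   # e.g. "up-left"
    return vertical or horizontal or None
```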
4. The method of claim 1, wherein the virtual reality application is converted from a native 3D application;
the user input event is a view angle control event;
the executing the function corresponding to the user input event comprises:
and transforming the original observation matrix or the original projection matrix of the virtual scene according to the current head posture parameter, and constructing and displaying a virtual scene image using the transformed observation matrix or projection matrix, thereby changing the observation angle of the virtual scene so that it remains synchronized with the user's observation angle after the head moves.
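A minimal sketch of the matrix transformation claim 4 describes, assuming the head posture parameter reduces to yaw and pitch angles and using plain 4x4 row-major lists; a real renderer would use its own matrix types and axis conventions.

```python
import math

def rotation_y(rad):
    """Rotation about the vertical axis (head yaw)."""
    c, s = math.cos(rad), math.sin(rad)
    return [[c, 0.0, s, 0.0], [0.0, 1.0, 0.0, 0.0],
            [-s, 0.0, c, 0.0], [0.0, 0.0, 0.0, 1.0]]

def rotation_x(rad):
    """Rotation about the horizontal axis (head pitch)."""
    c, s = math.cos(rad), math.sin(rad)
    return [[1.0, 0.0, 0.0, 0.0], [0.0, c, -s, 0.0],
            [0.0, s, c, 0.0], [0.0, 0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform_view_matrix(original_view, yaw_rad, pitch_rad):
    """Compose the head rotation onto the original observation matrix,
    so the rendered view follows the user's head."""
    head_rotation = matmul(rotation_x(pitch_rad), rotation_y(yaw_rad))
    return matmul(head_rotation, original_view)
```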
5. An operation control device for a virtual reality application, comprising:
the head tracking sensing unit is used for tracking and acquiring the current head posture parameter of the user and determining the motion state of the head of the user according to the current head posture parameter;
an event confirmation unit for determining a user input event, specifically,
when the user input event comprises a view angle control event and a direction control event, the event confirmation unit is used for determining the user input event corresponding to the motion state of the user's head according to a pre-established correspondence between head motion states and user input events,
when the user input event comprises a direction control event, the external interactive device is used for reading an external device input event of the user, and the event confirmation unit is used for determining the user input event corresponding to the motion state of the user's head and to the external device input event of the user according to a pre-established correspondence among head motion states, external device input events and user input events, wherein the external device comprises a binary key and the external device input event comprises a first input event and a second input event; in this case,
the pre-established correspondence among the head motion state, the external device input event and the user input event comprises:
when the head motion state is upward: if the external device input event is the first input event, the user input event is an upward event; if the external device input event is the second input event, the user input event is a downward event;
when the head motion state is downward: if the external device input event is the first input event, the user input event is a downward event; if the external device input event is the second input event, the user input event is an upward event;
when the head motion state is leftward: if the external device input event is the first input event, the user input event is a leftward event; if the external device input event is the second input event, the user input event is a rightward event;
when the head motion state is rightward: if the external device input event is the first input event, the user input event is a rightward event; if the external device input event is the second input event, the user input event is a leftward event;
when the head motion state is upward-left: if the external device input event is the first input event, the user input event is an upward-left event; if the external device input event is the second input event, the user input event is a downward-right event;
when the head motion state is upward-right: if the external device input event is the first input event, the user input event is an upward-right event; if the external device input event is the second input event, the user input event is a downward-left event;
when the head motion state is downward-left: if the external device input event is the first input event, the user input event is a downward-left event; if the external device input event is the second input event, the user input event is an upward-right event;
when the head motion state is downward-right: if the external device input event is the first input event, the user input event is a downward-right event; if the external device input event is the second input event, the user input event is an upward-left event;
and the control unit is used for executing the function corresponding to the determined user input event.
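Not part of the claim: an illustrative decomposition into the three units claim 5 names, with invented class and method names and a stub resolver where the mapping of claim 1 would slot in.

```python
class HeadTrackingSensingUnit:
    """Stands in for the head tracking sensing unit."""
    def motion_state(self, head_posture_params: dict) -> str:
        # A real unit would classify sensor readings; here the posture
        # parameters are assumed to already carry a direction string.
        return head_posture_params["direction"]

class EventConfirmationUnit:
    """Resolves (motion state, device event) to a user input event."""
    def __init__(self, resolver):
        self.resolver = resolver

    def confirm(self, motion_state: str, device_event: int) -> str:
        return self.resolver(motion_state, device_event)

class ControlUnit:
    """Executes the function bound to the confirmed user input event."""
    def execute(self, user_input_event: str) -> None:
        print(f"executing function for: {user_input_event}")

# Wiring the units together; the identity resolver is a stand-in for the
# first/second input event mapping sketched after claim 1.
tracker = HeadTrackingSensingUnit()
confirmer = EventConfirmationUnit(lambda state, event: state)
control = ControlUnit()
state = tracker.motion_state({"direction": "up"})
control.execute(confirmer.confirm(state, 1))
```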
6. The apparatus of claim 5, wherein the direction control event is used to control a direction of motion of a virtual object in a virtual reality scene displayed by the virtual reality application,
the control unit is specifically configured to control, according to the direction control event, a virtual object in the virtual scene to move in the motion direction corresponding to the direction control event.
7. The apparatus of claim 5, wherein the motion state of the user's head comprises a head rotation direction and/or a head rotation angle.
8. The apparatus of claim 5, wherein the virtual reality application is converted from a native 3D application;
the user input event is a view angle control event;
the control unit is used for:
and transforming the original observation matrix or the original projection matrix of the virtual scene according to the current head posture parameter, and constructing and displaying a virtual scene image using the transformed observation matrix or projection matrix, thereby changing the observation angle of the virtual scene so that it remains synchronized with the user's observation angle after the head moves.
9. A virtual reality device, comprising:
a display and at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 4.
10. A non-transitory computer-readable storage medium storing computer-executable instructions for causing a computer to perform the method of any one of claims 1 to 4.
CN201611260768.8A 2016-12-30 2016-12-30 Operation control method and device for virtual reality application Active CN106873767B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611260768.8A CN106873767B (en) 2016-12-30 2016-12-30 Operation control method and device for virtual reality application

Publications (2)

Publication Number Publication Date
CN106873767A CN106873767A (en) 2017-06-20
CN106873767B true CN106873767B (en) 2020-06-23

Family

ID=59164216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611260768.8A Active CN106873767B (en) 2016-12-30 2016-12-30 Operation control method and device for virtual reality application

Country Status (1)

Country Link
CN (1) CN106873767B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107290972B (en) * 2017-07-05 2021-02-26 三星电子(中国)研发中心 Equipment control method and device
CN107402634B (en) * 2017-07-28 2021-05-18 歌尔光学科技有限公司 Parameter adjusting method and device for virtual reality equipment
CN107817895B (en) * 2017-09-26 2021-01-05 微幻科技(北京)有限公司 Scene switching method and device
US10341537B2 (en) * 2017-09-29 2019-07-02 Sony Interactive Entertainment America Llc Spectator view into an interactive gaming world showcased in a live event held in a real-world venue
CN107908281A (en) * 2017-11-06 2018-04-13 北京小米移动软件有限公司 Virtual reality exchange method, device and computer-readable recording medium
CN108245890B (en) * 2018-02-28 2021-04-27 网易(杭州)网络有限公司 Method and device for controlling movement of object in virtual scene
CN108717733B (en) * 2018-06-07 2019-07-02 腾讯科技(深圳)有限公司 View angle switch method, equipment and the storage medium of virtual environment
CN109144245B (en) * 2018-07-04 2021-09-14 Oppo(重庆)智能科技有限公司 Equipment control method and related product
CN110827413A (en) * 2018-08-09 2020-02-21 北京微播视界科技有限公司 Method, apparatus and computer-readable storage medium for controlling a change in a virtual object form
CN109471533B (en) * 2018-11-09 2021-09-07 深圳职业技术学院 Student end system in VR/AR classroom and use method thereof
CN111338476A (en) * 2020-02-25 2020-06-26 上海唯二网络科技有限公司 Method and device for realizing human-computer interaction through head-mounted VR display equipment
CN114699770A (en) * 2022-04-19 2022-07-05 北京字跳网络技术有限公司 Method and device for controlling motion of virtual object

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793060A (en) * 2014-02-14 2014-05-14 杨智 User interaction system and method
CN105867613A (en) * 2016-03-21 2016-08-17 乐视致新电子科技(天津)有限公司 Head control interaction method and apparatus based on virtual reality system
CN105892680A (en) * 2016-04-28 2016-08-24 乐视控股(北京)有限公司 Interactive equipment control method and device based on virtual reality helmet
CN106200927A (en) * 2016-06-30 2016-12-07 乐视控股(北京)有限公司 A kind of information processing method and headset equipment
CN106200899A (en) * 2016-06-24 2016-12-07 北京奇思信息技术有限公司 The method and system that virtual reality is mutual are controlled according to user's headwork

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116451B (en) * 2013-01-25 2018-10-26 腾讯科技(深圳)有限公司 A kind of virtual character interactive of intelligent terminal, device and system
CN105704468B (en) * 2015-08-31 2017-07-18 深圳超多维光电子有限公司 Stereo display method, device and electronic equipment for virtual and reality scene
CN105955461A (en) * 2016-04-25 2016-09-21 乐视控股(北京)有限公司 Interactive interface management method and system

Similar Documents

Publication Publication Date Title
CN106873767B (en) Operation control method and device for virtual reality application
JP7486276B2 (en) Eye Tracking Calibration Technique
US11340694B2 (en) Visual aura around field of view
EP3427130B1 (en) Virtual reality
AU2018355441B2 (en) Virtual reticle for augmented reality systems
AU2017244109B2 (en) Interactions with 3D virtual objects using poses and multiple-DOF controllers
EP3365724B1 (en) Selecting virtual objects in a three-dimensional space
JP7283388B2 (en) Information processing device, information processing method, and program
Lee et al. Tunnelslice: Freehand subspace acquisition using an egocentric tunnel for wearable augmented reality
JP2022153476A (en) Animation creation system
JP2022020686A (en) Information processing method, program, and computer
CN114053693A (en) Object control method and device in virtual scene and terminal equipment
JP2019032715A (en) Information processing method, device, and program for causing computer to execute the method
CN117930983A (en) Display control method, device, equipment and medium
KR20240093921A (en) Eye tracking calibration techniques

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant