CN112451969B - Virtual object control method, device, computer equipment and storage medium

Info

Publication number
CN112451969B
Authority
CN
China
Prior art keywords
coordinate
distance
physical
coordinate distance
sliding operation
Prior art date
Legal status
Active
Application number
CN202011413036.4A
Other languages
Chinese (zh)
Other versions
CN112451969A (en)
Inventor
杨金昊
林凌云
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011413036.4A
Publication of CN112451969A
Application granted
Publication of CN112451969B

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application relates to a virtual object control method and device, a computer device, and a storage medium, in the technical field of virtual scenes. The method comprises the following steps: displaying a virtual scene picture corresponding to a virtual scene; in response to a sliding operation on the virtual scene picture, acquiring a first coordinate distance corresponding to the sliding operation; acquiring a first physical distance corresponding to the first coordinate distance; and controlling a target virtual object to rotate in the virtual scene based on the first physical distance and the sliding direction of the sliding operation. With this method, the user can estimate, from the physical distance of an operation, the angle through which the viewing angle of the target virtual object will rotate, which improves the accuracy of the viewing-angle rotation corresponding to the target virtual object.

Description

Virtual object control method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of virtual scene technologies, and in particular, to a virtual object control method, device, computer device, and storage medium.
Background
Currently, in shooting game applications, for example first-person shooting games, to make it convenient for the user to select a direction when shooting or moving, the user can rotate the viewing angle by touching the mobile phone screen or by moving a pointer input device such as a mouse or a touch pad.
In the related art, to improve the accuracy of viewing-angle rotation, the user can select the sensitivity of viewing-angle rotation in the settings interface of the game application, manually choosing the sensitivity that suits them best.
However, in the related art, when the user rotates the viewing angle on different devices, the resulting rotation angles differ greatly and the accuracy of viewing-angle rotation is low.
Disclosure of Invention
The embodiments of the present application provide a virtual object control method and device, a computer device, and a storage medium, which can control the rotation of a target virtual object according to the distance of the user's sliding operation and improve the accuracy of the viewing-angle rotation corresponding to the target virtual object. The technical scheme is as follows:
in one aspect, a virtual object control method is provided, the method including:
displaying a virtual scene picture corresponding to the virtual scene; the virtual scene comprises a target virtual object;
in response to a sliding operation on the virtual scene picture, acquiring a first coordinate distance corresponding to the sliding operation; the first coordinate distance is the distance, in pixel coordinates, between the start point and the end point of the sliding operation;
acquiring a first physical distance corresponding to the first coordinate distance; the first physical distance is the physical distance over which the sliding operation slides;
and controlling the target virtual object to rotate in the virtual scene based on the first physical distance and the sliding direction of the sliding operation.
In yet another aspect, a virtual object control method is provided, the method including:
displaying a virtual scene picture corresponding to the virtual scene; the virtual scene comprises a target virtual object;
in response to a sliding operation on the virtual scene picture, controlling the target virtual object to rotate in the virtual scene based on the physical distance over which the sliding operation slides and the sliding direction of the sliding operation;
wherein, on terminals with different screen sizes and resolutions, sliding operations of the same physical distance control the target virtual object to rotate by the same angle in the virtual scene.
In still another aspect, there is provided a virtual object control apparatus, the apparatus including:
the virtual scene picture display module is used for displaying a virtual scene picture corresponding to the virtual scene; the virtual scene comprises a target virtual object;
the first coordinate distance acquisition module is used for acquiring, in response to a sliding operation on the virtual scene picture, a first coordinate distance corresponding to the sliding operation; the first coordinate distance is the distance, in pixel coordinates, between the start point and the end point of the sliding operation;
the first physical distance acquisition module is used for acquiring a first physical distance corresponding to the first coordinate distance; the first physical distance is the physical distance over which the sliding operation slides;
and the virtual object rotation module is used for controlling the target virtual object to rotate in the virtual scene based on the first physical distance and the sliding direction of the sliding operation.
In one possible implementation, the apparatus further includes:
the resolution obtaining module is used for obtaining the rendering resolution of the virtual scene picture and the physical resolution of the display screen displaying the virtual scene picture;
the first coordinate distance acquisition module is used for acquiring, in response to a sliding operation on the virtual scene picture, the first coordinate distance based on the rendering resolution and the physical resolution.
In one possible implementation, the rendering resolution is the same as the physical resolution;
the first coordinate distance acquisition module is used for acquiring, in response to receiving the sliding operation when the rendering resolution is the same as the physical resolution, the first coordinate distance based on the physical resolution.
In one possible implementation manner, the first coordinate distance obtaining module includes:
a position coordinate acquisition unit configured to acquire a first position coordinate and a second position coordinate of the sliding operation at the physical resolution; the first position coordinates are physical pixel coordinates of a start point of the sliding operation; the second position coordinates are physical pixel coordinates of an end point of the sliding operation;
and the first coordinate distance acquisition unit is used for acquiring the pixel distance between the first position coordinate and the second position coordinate as the first coordinate distance.
In one possible implementation manner, the first coordinate distance obtaining module includes:
a second coordinate distance obtaining unit, configured to obtain a second coordinate distance corresponding to the sliding operation based on the rendering resolution in response to receiving the sliding operation, where the rendering resolution is smaller than the physical resolution;
A scaling value obtaining unit, configured to obtain a scaling value based on the rendering resolution and the physical resolution;
and the first coordinate distance acquisition unit is used for acquiring the first coordinate distance based on the second coordinate distance and the scaling value.
In one possible implementation manner, the second coordinate distance acquiring unit includes:
a position coordinate acquisition subunit, configured to acquire a third position coordinate and a fourth position coordinate of the sliding operation under the rendering resolution; the third position coordinates are rendering pixel coordinates of a start point of the sliding operation; the fourth position coordinates are rendering pixel coordinates of an end point of the sliding operation;
and the coordinate distance acquisition subunit is used for acquiring the pixel distance between the third position coordinate and the fourth position coordinate as the second coordinate distance.
In one possible implementation, the apparatus further includes:
the pixel density acquisition module is used for acquiring the pixel density of the display screen displaying the virtual scene picture; the pixel density indicates the correspondence between the terminal's pixel values and its physical size;
and the first physical distance acquisition module is used for acquiring the ratio of the first coordinate distance to the pixel density as the first physical distance corresponding to the first coordinate distance.
In one possible implementation manner, the first physical distance acquiring module is configured to:
and acquiring the first physical distance corresponding to the first coordinate distance in response to the relation between the first coordinate distance and a coordinate distance threshold meeting a specified condition.
In one possible implementation, the coordinate distance threshold includes a first coordinate distance threshold; the apparatus further comprises:
and the first specified condition determining module is used for determining that the relation between the first coordinate distance and the coordinate distance threshold meets the specified condition in response to the fact that the modulus of the first coordinate distance is larger than the first coordinate distance threshold.
In one possible implementation, the coordinate distance threshold includes a second coordinate distance threshold; the first coordinate distance comprises a first abscissa distance and a first ordinate distance; the apparatus further comprises:
and a second specified condition determining module configured to determine that a relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition in response to the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold.
In one possible implementation, the coordinate distance threshold includes a first coordinate distance threshold and a second coordinate distance threshold; the first coordinate distance comprises a first abscissa distance and a first ordinate distance;
The apparatus further comprises:
and a third specified condition determining module, configured to determine that a relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition in response to a modulus of the first coordinate distance being greater than the first coordinate distance threshold and the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold.
In yet another aspect, there is provided a virtual object control apparatus, the apparatus including:
the scene picture display module is used for displaying a virtual scene picture corresponding to the virtual scene; the virtual scene comprises a target virtual object;
the virtual object control module is used for controlling, in response to a sliding operation on the virtual scene picture, the target virtual object to rotate in the virtual scene based on the physical distance over which the sliding operation slides and the sliding direction of the sliding operation;
wherein, on terminals with different screen sizes and resolutions, sliding operations of the same physical distance control the target virtual object to rotate by the same angle in the virtual scene.
In another aspect, a computer device is provided, the computer device comprising a processor and a memory, the memory storing at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the virtual object control method described above.
In another aspect, a computer readable storage medium is provided, having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the virtual object control method described above.
In yet another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. A processor of a computer device reads the computer instructions from a computer-readable storage medium, and the processor executes the computer instructions so that the computer device performs a virtual object control method.
The beneficial effects of the technical scheme provided by the embodiment of the application at least comprise:
the user performs a sliding operation on the terminal displaying the virtual scene picture; the terminal obtains a distance parameter corresponding to the sliding operation, determines from it the physical distance over which the user's sliding operation slid on the terminal screen, and rotates the target virtual object based on that physical distance. In this scheme, the terminal calculates, from the coordinate distance value on the virtual scene picture, the physical distance of the user's manual operation corresponding to that coordinate distance value, and determines the rotation angle of the target virtual object according to the physical distance. The rotation angle of the target virtual object therefore corresponds to the physical distance of the user's manual operation, so the user can estimate, from the physical distance of an operation, the angle through which the viewing angle of the target virtual object will rotate, which improves the accuracy of the viewing-angle rotation corresponding to the target virtual object.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 illustrates a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 illustrates a display interface schematic of a virtual scene provided by an exemplary embodiment of the present application;
FIG. 3 illustrates a flow chart of a virtual object control method provided by an exemplary embodiment of the present application;
FIG. 4 illustrates a flow chart of a virtual object control method illustrated in an exemplary embodiment of the present application;
FIG. 5 illustrates a flow chart of valid-operation selection involved in the embodiment illustrated in FIG. 4;
FIG. 6 is a diagram showing rotation of a target virtual object on a high-resolution terminal screen according to the embodiment shown in FIG. 4;
FIG. 7 is a diagram showing rotation of a target virtual object on a low-resolution terminal screen according to the embodiment shown in FIG. 4;
FIG. 8 is a schematic diagram illustrating a virtual object control flow according to an example embodiment;
FIG. 9 illustrates a flowchart of a virtual object control method provided by an exemplary embodiment of the present application;
FIG. 10 illustrates a block diagram of a virtual object control apparatus provided in an exemplary embodiment of the present application;
FIG. 11 illustrates a block diagram of a virtual object control apparatus provided in an exemplary embodiment of the present application;
FIG. 12 is a block diagram of a computer device shown in accordance with an exemplary embodiment;
fig. 13 is a block diagram of a computer device, according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
It should be understood that references herein to "a number" mean one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" can mean: A exists alone, both A and B exist, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
For ease of understanding, several terms referred to in this application are explained below.
1) Virtual scene
A virtual scene is a scene that an application program displays (or provides) while running on a terminal. The virtual scene can be a simulated environment of the real world, a semi-simulated, semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene; the following embodiments take the virtual scene being a three-dimensional virtual scene as an example, but are not limited thereto. Optionally, the virtual scene may also be used for a virtual scene battle between at least two virtual characters. Optionally, the virtual scene may also be used for at least two virtual characters to fight using virtual firearms. Optionally, the virtual scene may further be used for at least two virtual characters to fight using virtual firearms within a target area of the virtual scene that keeps shrinking over time.
Virtual scenes are typically generated by an application program in a computer device such as a terminal and presented based on hardware (such as a screen) in the terminal. The terminal can be a mobile terminal such as a smart phone, a tablet computer, or an e-book reader; alternatively, the terminal can be a notebook computer or a stationary personal computer.
2) Virtual object
Virtual objects are movable objects in the virtual scene. The movable object may be at least one of a virtual character, a virtual animal, and a virtual vehicle. Optionally, when the virtual scene is a three-dimensional virtual scene, the virtual object is a three-dimensional stereoscopic model created based on skeletal animation technology. Each virtual object has its own shape, volume, and orientation in the three-dimensional virtual scene and occupies a portion of the space in the three-dimensional virtual scene.
3) Virtual prop
Virtual props are props that virtual objects can use in the virtual environment, including: virtual weapons that can inflict damage on other virtual objects, such as handguns, rifles, sniper rifles, daggers, knives, swords, and axes; supply props installed on a specified virtual weapon, such as bullets, quick-loading magazines, sighting telescopes, and silencers; virtual pendants that add partial attributes to a virtual weapon; and defensive props such as shields, armor, and armored vehicles.
4) First person shooting game
A first-person shooting game is a shooting game that the user can play from a first-person viewing angle; the picture of the virtual environment in the game is a picture of the virtual environment observed from the viewing angle of a first virtual object. In the game, at least two virtual objects fight in a single-round battle mode in the virtual environment. A virtual object survives in the virtual environment by avoiding damage initiated by other virtual objects and dangers present in the virtual environment (such as poison gas rings, swamps, and the like); when the life value of a virtual object in the virtual environment reaches zero, its life in the virtual environment ends, and the virtual objects that ultimately survive in the virtual environment are the winners. Optionally, the battle may take the moment the first client joins as its start time and the moment the last client exits as its end time, and each client may control one or more virtual objects in the virtual environment. Optionally, the competitive mode of the battle may include a single-player mode, a two-person team mode, or a multi-person team mode, which is not limited in the embodiments of the present application.
FIG. 1 illustrates a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application. The implementation environment may include: a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 has installed and runs an application 111 supporting a virtual environment, and the application 111 may be a multiplayer online battle program. When the first terminal runs the application 111, a user interface of the application 111 is displayed on the screen of the first terminal 110. The application 111 may be either a multiplayer online battle arena (MOBA) game or a simulation game (SLG). In this embodiment, the application 111 is exemplified as a first-person shooting (FPS) game. The first terminal 110 is a terminal used by the first user 112, and the first user 112 uses the first terminal 110 to control a first virtual object located in the virtual environment to perform activities; the first virtual object may be referred to as the master virtual object of the first user 112. The activities of the first virtual object include, but are not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills. Illustratively, the first virtual object is a first virtual character, such as a simulated character or a cartoon character.
The second terminal 130 has installed and runs an application 131 supporting a virtual environment, and the application 131 may be a multiplayer online battle program. When the second terminal 130 runs the application 131, a user interface of the application 131 is displayed on the screen of the second terminal 130. The client may be either a MOBA game or an SLG game; in this embodiment, the application 131 is exemplified as an FPS game. The second terminal 130 is a terminal used by the second user 132, and the second user 132 uses the second terminal 130 to control a second virtual object located in the virtual environment to perform activities; the second virtual object may be referred to as the master virtual character of the second user 132. Illustratively, the second virtual object is a second virtual character, such as a simulated character or a cartoon character.
Optionally, the first virtual object and the second virtual object are in the same virtual world. Optionally, the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have hostile relationships.
Optionally, the applications installed on the first terminal 110 and the second terminal 130 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms (Android or iOS). The first terminal 110 may refer broadly to one of a plurality of terminals and the second terminal 130 to another of the plurality of terminals; this embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different, and include at least one of: a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer.
Only two terminals are shown in fig. 1, but in different embodiments there are a number of other terminals that can access the server 120. Optionally, there is one or more terminals corresponding to the developer, on which a development and editing platform for supporting the application program of the virtual environment is installed, the developer may edit and update the application program on the terminal, and transmit the updated application program installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 may download the application program installation package from the server 120 to implement the update of the application program.
The first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of a server, a server cluster formed by a plurality of servers, a cloud computing platform and a virtualization center. The server 120 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 120 takes on primary computing work and the terminal takes on secondary computing work; alternatively, the server 120 takes on secondary computing work and the terminal takes on primary computing work; alternatively, a distributed computing architecture is used for collaborative computing between the server 120 and the terminals.
In one illustrative example, the server 120 includes a memory 121, a processor 122, a user account database 123, a combat service module 124, and a user-oriented input/output interface (I/O interface) 125. The processor 122 is configured to load instructions stored in the server 120 and to process data in the user account database 123 and the combat service module 124; the user account database 123 is configured to store data of the user accounts used by the first terminal 110, the second terminal 130, and the other terminals, such as the avatar of the user account, the nickname of the user account, the combat index of the user account, and the service area where the user account is located; the combat service module 124 is configured to provide a plurality of combat rooms, such as 1V1 combat, 3V3 combat, and 5V5 combat, for users to fight in; and the user-oriented I/O interface 125 is used to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network.
The virtual scene may be a three-dimensional virtual scene, or the virtual scene may be a two-dimensional virtual scene. Taking an example that the virtual scene is a three-dimensional virtual scene, please refer to fig. 2, which illustrates a schematic diagram of a display interface of the virtual scene provided in an exemplary embodiment of the present application. As shown in fig. 2, the display interface of the virtual scene includes a scene screen 200, and the scene screen 200 includes a virtual object 210 currently controlled, an environment screen 220 of the three-dimensional virtual scene, and a virtual object 240. Wherein, the virtual object 240 may be a virtual object controlled by a corresponding user of other terminals or a virtual object controlled by an application program.
In fig. 2, the currently controlled virtual object 210 and the virtual object 240 are three-dimensional models in the three-dimensional virtual scene, and the environment picture of the three-dimensional virtual scene displayed in the scene picture 200 consists of objects observed from the viewing angle of the currently controlled virtual object 210. As illustrated in fig. 2, the environment picture 220 of the three-dimensional virtual scene displayed from the viewing angle of the currently controlled virtual object 210 includes, for example, the ground 224, the sky 225, the horizon 223, the hill 221, and the factory building 222.
The currently controlled virtual object 210 can release skills or use virtual props, move, and perform specified actions under the control of the user, and virtual objects in the virtual scene can show different three-dimensional models under the user's control. For example, if the screen of the terminal supports touch operation and the scene picture 200 of the virtual scene includes a virtual control, then when the user touches the virtual control, the currently controlled virtual object 210 can perform the specified action in the virtual scene and show the currently corresponding three-dimensional model.
Fig. 3 shows a flowchart of a virtual object control method according to an exemplary embodiment of the present application. The virtual object control method may be performed by a computer device, which may be a terminal or a server, or may include both the terminal and the server. As shown in fig. 3, the virtual object control method includes:
step 301, displaying a virtual scene picture corresponding to a virtual scene; the virtual scene contains a target virtual object.
In one possible implementation, the virtual scene picture is a picture of the virtual scene viewed at the perspective of the target virtual object.
In one possible implementation, the direction of a virtual object in the virtual scene is the direction the virtual object faces. Taking a three-dimensional virtual scene as an example, the computer device can determine the direction of the virtual object by acquiring the projection direction of the virtual object in the horizontal direction of the three-dimensional coordinate system; for example, when the virtual object is standing, squatting, jumping, or floating in the virtual scene, its direction is the direction straight ahead of it. Alternatively, the orientation direction of the virtual object can be the direction from the feet of the virtual object toward its head; for example, when the virtual object is lying down, swimming, or flying, its orientation direction is the projection, in the horizontal direction of the three-dimensional coordinate system, of the direction from its feet toward its head. When the virtual scene is a two-dimensional virtual scene, the orientation direction of the virtual object is simply the direction it faces. The embodiments of the present application are described by taking the virtual scene as a three-dimensional virtual scene as an example.
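As a small illustration of the projection described above (a Python sketch; the y-up axis convention is an assumption, not specified in this application):

```python
import math

def horizontal_facing_angle(facing_xyz):
    """Project a 3D facing vector onto the horizontal plane (assuming y is up)
    and return the yaw angle of that projection, in degrees."""
    x, _, z = facing_xyz  # drop the vertical component
    return math.degrees(math.atan2(x, z))
```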
In one possible implementation, at least one of a firing control, a jump control, a movement control, and a prone control is superimposed on the virtual scene picture.
Wherein, in response to receiving a trigger operation on the firing control, the virtual object performs a firing operation toward the target position through the virtual prop (i.e., the virtual firearm) in its hand; the jump control is used to trigger a jump operation of the virtual object; the movement control is used to trigger, according to the control sliding information corresponding to the movement control, control of the virtual object to move in the direction corresponding to that sliding information; and the prone control is used to trigger a prone operation of the virtual object.
In one possible implementation, the prone control can also control the virtual object to perform different operations depending on how the user operates it. For example, when a click operation on the prone control is received, the virtual object is controlled to perform a squat operation; when a long-press operation on the prone control is received, the virtual object is controlled to perform a prone operation, where a long-press operation is an operation in which the duration for which the user triggers the control is greater than a trigger threshold.
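As an illustration only, this click versus long-press dispatch could be sketched as follows (Python; the 0.5-second threshold and the method names are illustrative assumptions, not values given in this application):

```python
# Sketch of the prone-control dispatch: click -> squat, long press -> prone.
# The threshold value and the virtual-object method names are assumptions.
LONG_PRESS_THRESHOLD_S = 0.5  # trigger threshold, in seconds (assumed value)

def on_prone_control_released(press_duration_s: float, virtual_object) -> None:
    if press_duration_s > LONG_PRESS_THRESHOLD_S:
        virtual_object.lie_prone()  # long press: duration exceeds the trigger threshold
    else:
        virtual_object.squat()      # click: duration within the trigger threshold
```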
Step 302, responding to a sliding operation on the virtual scene picture, and acquiring a first coordinate distance corresponding to the sliding operation; the first coordinate distance is a distance between pixel coordinates between a start point and an end point of the sliding operation.
In one possible implementation, when the terminal is a mobile device, the sliding operation can be a touch sliding operation performed by the user on the virtual scene picture. The user can touch the virtual scene picture displayed on the terminal and, while keeping the touch state, move the touch position, thereby performing a touch sliding operation on the virtual scene picture. In this case, the first coordinate distance corresponds to the sliding distance of the user's touch sliding operation on the mobile device, that is, the distance the user's touch position slides on the virtual scene picture.
In one possible implementation, when the terminal is a personal computer (PC), the sliding operation can be a sliding operation triggered by the user with a pointing device such as a mouse or a touch pad.
In one possible implementation, the sliding operation can be a pointer sliding instruction triggered by the user on the PC through a pointer device such as a mouse or a touch pad; that is, the user can trigger a sliding operation on the virtual scene picture by controlling the sliding of the pointer device, and the first coordinate distance corresponding to the sliding operation is then obtained, where the first coordinate distance corresponds to the sliding distance of the pointer device such as the mouse or the touch pad.
In one possible implementation manner, in response to a sliding operation on a designated area of the virtual scene screen, a first coordinate distance corresponding to the sliding operation is obtained.
Wherein, the virtual scene picture may contain a plurality of different areas; sliding operations on various regions of the virtual scene may trigger different changes to occur in the virtual scene.
For example, suppose the virtual scene picture includes a first area and a second area, where the first area is the right half of the virtual scene picture: in response to a sliding operation on the first area of the virtual scene picture, the virtual scene picture is rotated according to the sliding operation. The second area is the left half of the virtual scene picture: in response to a sliding operation on the second area of the virtual scene picture, the virtual scene picture is translated according to the sliding operation.
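A minimal sketch of this area-based dispatch (Python; dispatching on the start position of the slide and the rotate_view/translate_view helper names are assumptions for illustration):

```python
# Sketch: slides starting in the right half (first area) rotate the virtual
# scene picture; slides starting in the left half (second area) translate it.
def dispatch_slide(start_x: float, screen_width: float, slide_vector, scene) -> None:
    if start_x >= screen_width / 2:
        scene.rotate_view(slide_vector)     # first area: rotate according to the slide
    else:
        scene.translate_view(slide_vector)  # second area: translate according to the slide
```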
Step 303, obtaining a first physical distance corresponding to the first coordinate distance; the first physical distance is a physical distance at which the sliding operation slides.
In one possible implementation, the first coordinate distance is a pixel coordinate distance value on the terminal corresponding to the virtual scene picture, and the first physical distance is the physical distance the user needs to move on the terminal device to cover that pixel coordinate distance.
Step 304, controlling the target virtual object to rotate in the virtual scene based on the first physical distance and the sliding direction of the sliding operation.
The first physical distance is the physical distance actually moved by the user, determined from the first coordinate distance, and the terminal rotates the target virtual object according to the physical distance the user actually moves.
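Putting steps 301 to 304 together, a minimal end-to-end sketch might look like the following (Python; the sensitivity constant and the camera interface are assumptions used for illustration, not part of this application):

```python
import math

def handle_slide(start_px, end_px, ppi, sensitivity_deg_per_inch, camera):
    """Steps 302 to 304: coordinate distance -> physical distance -> rotation."""
    dx = end_px[0] - start_px[0]              # x component of the first coordinate distance
    dy = end_px[1] - start_px[1]              # y component of the first coordinate distance
    coord_distance = math.hypot(dx, dy)       # modulus of the first coordinate distance
    physical_distance = coord_distance / ppi  # first physical distance, in inches
    angle = physical_distance * sensitivity_deg_per_inch
    camera.rotate(angle, direction=(dx, dy))  # rotate along the sliding direction
```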
In summary, in the virtual object control method provided by the present application, the user performs a sliding operation on the terminal displaying the virtual scene picture; the terminal obtains a distance parameter corresponding to the sliding operation, determines from it the physical distance over which the user's sliding operation slid on the terminal screen, and rotates the target virtual object based on that physical distance. In this scheme, the terminal calculates, from the coordinate distance value on the virtual scene picture, the physical distance of the user's manual operation corresponding to that coordinate distance value, and determines the rotation angle of the target virtual object according to the physical distance. The rotation angle of the target virtual object therefore corresponds to the physical distance of the user's manual operation, so the user can estimate, from the physical distance of an operation, the angle through which the viewing angle of the target virtual object will rotate, which improves the accuracy of the viewing-angle rotation corresponding to the target virtual object.
Fig. 4 illustrates a flowchart of a virtual object control method according to an exemplary embodiment of the present application. The virtual object control method may be performed by a computer device, which may be a terminal or a server, or may include both the terminal and the server. As shown in fig. 4, the virtual object control method includes:
step 401, displaying a virtual scene picture corresponding to the virtual scene.
In a possible implementation manner, the virtual scene interface further comprises a virtual firearm held by a virtual object corresponding to the virtual scene interface.
When the virtual scene interface includes a virtual scene picture of the virtual scene captured from the first-person viewing angle of the virtual character, a virtual camera is generally arranged at the shoulder position of the virtual character in the virtual scene, and the three-dimensional picture captured by the virtual camera is used as the virtual scene picture captured from the virtual character's first-person viewing angle.
In one possible implementation, the center region of the virtual scene interface includes a crosshair corresponding to the virtual firearm.
The crosshair is displayed in the central area of the virtual scene picture captured from the virtual character's first-person viewing angle, so that when the user controls the virtual character without opening the scope, firing accuracy can be improved to a certain extent, which improves the user experience.
In one possible implementation, the virtual scene picture further includes a first region and a second region. Wherein the first region is a right half region of the virtual scene and the second region is a left half region of the virtual scene.
The first area is used for triggering the visual angle rotation corresponding to the virtual role according to the sliding operation of the user.
In one possible implementation, a viewing-angle control is displayed in the first area. When the user performs a sliding operation on the viewing-angle control, the terminal obtains the sliding direction and sliding magnitude of the sliding operation corresponding to the control, determines the angular rotation amplitude of the virtual camera corresponding to the virtual character according to that direction and magnitude, and thereby obtains the virtual scene picture captured after the viewing angle of the virtual camera rotates.
In another possible implementation, a hidden viewing-angle control is placed in the first area. When the user performs a sliding operation at a position in the first area where no other control is present, the viewing-angle control captures the sliding direction and sliding magnitude of the user's sliding operation, determines the angular rotation amplitude of the virtual camera corresponding to the virtual character according to that direction and magnitude, and thereby obtains the virtual scene picture captured after the viewing angle of the virtual camera rotates.
The second area is used for triggering the movement operation corresponding to the virtual character according to the sliding operation.
In one possible implementation manner, a mobile control is displayed in the second area, and according to a trigger position of the continuous trigger operation of the user on the mobile control, the virtual character is controlled to execute the mobile operation in the direction corresponding to the trigger position.
In one possible implementation manner, a movement control is displayed in the second area, and according to a sliding direction corresponding to a sliding operation of the movement control by the user and an operation time of continuous pressing operation after the sliding operation, a movement direction of the virtual character for executing the movement operation and a duration of executing the movement operation are respectively determined.
In one possible implementation, the first region is a right half region of the virtual scene interface.
Step 402, obtaining a rendering resolution of the virtual scene picture, and a physical resolution of a display screen displaying the virtual scene picture.
The physical resolution of the terminal's display screen, also called the native resolution, is an inherent parameter of the display screen and cannot be adjusted. The physical resolution is the maximum number of pixel points (i.e., minimum display units) the display screen can display; that is, at the physical resolution the terminal displays an image using each minimum pixel point (minimum display unit) of the display screen as one display unit.
The rendering resolution of the virtual scene picture is the actual resolution at which the virtual scene picture is output on the terminal's display screen after the data corresponding to the virtual scene picture has been processed by the terminal's graphics processing unit (GPU). That is, the rendering resolution is the resolution of the virtual scene picture output to the terminal's display screen; in general, the rendering resolution of the virtual scene picture is smaller than the physical resolution of the terminal's display screen.
In one possible implementation, when the rendering resolution of the virtual scene is the same as the physical resolution of the display screen of the terminal, the display of the virtual scene is realized by taking the minimum display unit of the terminal as a pixel point.
In another possible implementation, the rendering resolution of the virtual scene picture is smaller than the physical resolution of the terminal's display screen. In this case, the number of pixels of the virtual scene picture output by the GPU is smaller than the number of minimum display units of the terminal, and if each minimum display unit of the terminal's display screen were directly used as one pixel of the virtual scene picture, the picture could only be displayed in a partial area of the display screen. Therefore, in applications that need to display a virtual scene, especially game applications, after the GPU outputs the virtual scene picture it is usually stretched for display according to the rendering resolution of the picture and the physical resolution of the display screen, with each pixel point of the virtual scene picture displayed by several minimum display units of the terminal, so that a virtual scene picture with a smaller rendering resolution can be displayed normally on a display screen with a larger physical resolution.
Step 403, in response to the sliding operation on the virtual scene, acquiring the first coordinate distance based on the rendering resolution and the physical resolution.
In one possible implementation, the rendering resolution is the same as the physical resolution; in response to receiving the sliding operation, and the rendering resolution is the same as the physical resolution, the first coordinate distance is obtained based on the physical resolution.
When the rendering resolution is the same as the physical resolution, the image display of the virtual scene picture uses each minimum display unit of the terminal's display screen as one pixel point; that is, the number of pixel points of the virtual scene picture equals the number of minimum display units of the display screen. In this case, the pixel coordinate distance of the sliding operation performed by the user on the terminal's display screen is both the pixel coordinate distance at the rendering resolution and the coordinate distance in minimum display units at the physical resolution.
In one possible implementation, a first position coordinate and a second position coordinate of the sliding operation at the physical resolution are obtained; the first position coordinates are physical pixel coordinates of a start point of the sliding operation; the second position coordinates are physical pixel coordinates of an end point of the sliding operation; and acquiring the pixel distance between the first position coordinate and the second position coordinate as the first coordinate distance.
When the rendering resolution is the same as the physical resolution, the first position coordinate and the second position coordinate of the virtual scene image corresponding to the sliding operation can be directly obtained.
In one possible implementation, in response to receiving the sliding operation, and the rendering resolution is less than the physical resolution, obtaining a second coordinate distance corresponding to the sliding operation based on the rendering resolution; obtaining a scaling value based on the rendering resolution and the physical resolution; and acquiring the first coordinate distance based on the second coordinate distance and the scaling value.
When the rendering resolution is smaller than the physical resolution, that is, when the two differ, the coordinate distance obtained through the API interface corresponding to the virtual scene picture is the second coordinate distance at the rendering resolution. The stretch factor (i.e., the scaling value) of the virtual scene picture on the display screen with the larger physical resolution can be obtained from the ratio between the rendering resolution and the physical resolution, and the first coordinate distance actually corresponding to the physical resolution can then be obtained from the scaling value and the second coordinate distance.
In one possible implementation, a third position coordinate and a fourth position coordinate of the sliding operation at the rendering resolution are obtained; the third position coordinates are rendering pixel coordinates of a start point of the sliding operation; the fourth position coordinates are rendering pixel coordinates of the end point of the sliding operation; and obtaining the pixel distance between the third position coordinate and the fourth position coordinate as the second coordinate distance.
When the rendering resolution is smaller than the physical resolution, the third position coordinate and the fourth position coordinate corresponding to the sliding operation can be acquired through the API interface of the virtual scene picture, and the second coordinate distance at the rendering resolution is obtained from the relationship between the third and fourth position coordinates.
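The conversion from the second coordinate distance (at the rendering resolution) to the first coordinate distance (at the physical resolution) can be sketched as follows (Python; scaling each axis by its own ratio is an assumption consistent with the width and height ratios used in the formulas below):

```python
def to_physical_pixels(delta_render, render_res, physical_res):
    """Scale a (dx, dy) coordinate distance at rendering resolution up to
    the physical resolution; resolution tuples are (width, height)."""
    scale_x = physical_res[0] / render_res[0]  # horizontal scaling value
    scale_y = physical_res[1] / render_res[1]  # vertical scaling value
    return (delta_render[0] * scale_x, delta_render[1] * scale_y)
```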
Step 404, obtaining pixel density of a display screen displaying the virtual scene picture; the pixel density is used to indicate the correspondence of the pixel value of the terminal to the size.
Pixel density (pixels per inch, PPI) is the number of pixels per inch of the terminal's display screen. The higher the PPI value of the display screen, the greater the density at which the display screen can display images.
Since the PPI is the number of pixels per inch of the display screen, and the physical resolution is the number of minimum display units of the display screen (so that, when each minimum display unit is one pixel, the physical resolution gives the number of pixels of the display screen), the value of any one of the PPI, the physical resolution of the display screen, and the size of the terminal's display screen can be obtained from the other two.
Step 405, obtaining the ratio of the first coordinate distance to the pixel density as the first physical distance corresponding to the first coordinate distance.
In one possible implementation, the converted rotation amplitudes are approximately equal for the same physical movement distance. The physical distance can be calculated from the PPI of the device and the physical size information of the device. For example, when the physical resolution is the same as the rendering resolution, suppose the touch sliding operation moves by $x$ (the first coordinate distance, in pixels), the resolution of the screen is $Screen.width \times Screen.height$, and the PPI of the screen is known, where $Screen.height$ is the pixel height of the screen and $Screen.width$ is the pixel width of the screen. The distance moved (in inches) is:

$$distance = \frac{x}{PPI}$$

The device size (the screen diagonal, in inches) is calculated as follows:

$$size = \frac{\sqrt{Screen.width^2 + Screen.height^2}}{PPI}$$

Substituting the formula for the device size into the formula for the moving distance gives:

$$distance = \frac{x \cdot size}{\sqrt{Screen.width^2 + Screen.height^2}}$$

That is, the physical distance moved can be obtained from the PPI of the device and the number of device pixels moved.
In one possible implementation, when the physical resolution is different from the rendering resolution, the above formula does not apply directly. Here, Screen.width and Screen.height are both given at the screen (physical) resolution, while Touch.position (the pixel coordinate values corresponding to the touch sliding operation, i.e., the third position coordinate and the fourth position coordinate) is a value at the rendering resolution. The rendering resolution may change with different image-quality settings, so it is not necessarily equal to the resolution of the device. To ensure the same rotation value is obtained for the same physical movement distance, the movement distance must be found from the device PPI via Touch.position, Screen.width, and Screen.height. The difference of Touch.position between two frames is DeltaPosition (the pixel coordinate distance at the rendering resolution corresponding to the touch sliding operation, i.e., the second coordinate distance). That is, before acquiring the physical movement distance, the first coordinate distance at the physical resolution must be obtained from the second coordinate distance at the rendering resolution, as follows:

$$movedPixel = \left( DeltaPosition_x \cdot \frac{Screen.width}{RenderScreen.width},\ DeltaPosition_y \cdot \frac{Screen.height}{RenderScreen.height} \right)$$

where $RenderScreen.height$ is the pixel height of the terminal at the rendering resolution and $RenderScreen.width$ is the pixel width of the terminal at the rendering resolution. This formula converts the second coordinate distance $DeltaPosition$, measured at the rendering resolution, into the first coordinate distance (i.e., the moved device pixels $movedPixel$) through the pixel-point ratio (i.e., the scaling value) between the rendering resolution and the physical resolution.

The physical distance moved is then:

$$distance = \frac{\lVert movedPixel \rVert}{PPI}$$
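Combining both cases, a sketch of the physical-distance computation (Python; resolution tuples are (width, height), and the componentwise scaling mirrors the formulas above) might be:

```python
import math

def physical_distance_inches(delta_render, render_res, physical_res, ppi):
    """Convert a per-frame DeltaPosition measured at the rendering resolution
    into the physical sliding distance in inches."""
    if render_res == physical_res:
        moved_px = delta_render  # the coordinate distance is already physical
    else:
        moved_px = (delta_render[0] * physical_res[0] / render_res[0],
                    delta_render[1] * physical_res[1] / render_res[1])
    return math.hypot(moved_px[0], moved_px[1]) / ppi  # distance = movedPixel / PPI
```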
in one possible implementation, the first physical distance corresponding to the first coordinate distance is obtained in response to a relationship between the first coordinate distance and a coordinate distance threshold meeting a specified condition.
When the first coordinate distance at the physical resolution has been obtained, the first coordinate distance can first be checked against the coordinate distance threshold to determine whether it satisfies the specified condition; when the specified condition is satisfied, the physical distance corresponding to the first coordinate distance is acquired.
When the first coordinate distance does not satisfy the specified condition, the touch sliding operation corresponding to the first coordinate distance may be invalid; in this case, the first coordinate distance is set to zero, the touch sliding operation is determined to be an invalid operation, and the terminal ignores it.
In one possible implementation, in response to the modulus of the first coordinate distance being greater than the first coordinate distance threshold, it is determined that a relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition.
When the modulus of the first coordinate distance is not greater than the first coordinate distance threshold, the touch sliding distance corresponding to the first coordinate distance is too small. In that case the first coordinate distance may not have been produced by a touch sliding operation actively triggered by the user, but instead by hand jitter or terminal measurement error. The specified condition is then not satisfied: the terminal ignores the first coordinate distance and determines the corresponding operation to be an invalid operation.
In one possible implementation, in response to the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold, it is determined that a relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition.
The first abscissa distance may be the X-axis coordinate distance value in the first coordinate distance, and the first ordinate distance may be the Y-axis coordinate distance value in the first coordinate distance. When both the X-axis and Y-axis coordinate distance values are smaller than the second coordinate distance threshold, the touch sliding operation was actively triggered by the user. When at least one of the two values is not smaller than the second coordinate distance threshold, the first coordinate distance exceeds, in at least one axis direction, the coordinate distance value corresponding to a normal touch sliding operation; that is, the first coordinate distance was generated by a touch sliding operation performed while the user was subject to external interference. The specified condition is then not satisfied, and the touch sliding operation (i.e., the sliding operation) corresponding to the first coordinate distance is determined to be an invalid operation.
In one possible implementation, the first coordinate distance is obtained as the valid coordinate distance in response to a modulus of the first coordinate distance being greater than the first coordinate distance threshold and the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold.
When the modulus of the first coordinate distance corresponding to the sliding operation is greater than the first coordinate distance threshold, and the first abscissa distance and the first ordinate distance are both smaller than the second coordinate distance threshold, the sliding operation is considered a valid operation, the specified condition is met, and the corresponding physical distance is acquired according to the first coordinate distance.
Referring to fig. 5, a flowchart of valid-operation screening according to an embodiment of the present application is shown. As shown in fig. 5, taking as an example a first coordinate distance obtained by the terminal from the user's key positions in two consecutive frames, the terminal records the touch position acquired in each frame as the current key position currentTouchPosition, and in each frame updates the key position of the previous frame as preTouchPosition. Since the value of touch.position is a two-dimensional vector, it can be expressed as (x, y), where x is the abscissa and y is the ordinate. Whether the current key input is a valid input can then be determined from the vectors of the two frames.
S501, the terminal records the key position of the user's previous frame through an API (Application Programming Interface) of the game. S502, the terminal records the key position of the user's current frame through the same API. S503, the terminal compares the vector modulus between the two frames' key positions: when the modulus is too small (smaller than a threshold), the slide between the two consecutive key positions is not a normal user operation slide; it is determined to be invalid input, and the position of this frame is updated to be the key position of the previous frame. When the vector modulus between the two consecutive key positions is large enough (greater than the threshold), the slide may belong to a normal user operation slide, and the next determination is performed. S504, the terminal compares the X-axis difference between the two frames' key positions: when the X-axis coordinate difference is too large (greater than a threshold), the slide is not a normal user operation slide, and the excessive sliding speed may have been caused by an external impact on the user; the slide is determined to be invalid input, and the frame position is updated to be the key position of the previous frame. When the X-axis difference between the two consecutive key positions is not large (smaller than the threshold), the slide data in the X-axis direction is normal and may belong to a normal user sliding operation, and the next determination is performed.
S505, the terminal compares the Y-axis coordinate difference between the two frames' key positions: when the Y-axis coordinate difference is too large (greater than a threshold), the slide is not a normal user operation slide, and the excessive sliding speed may have been caused by an external impact on the user; the slide is determined to be invalid input, and the frame position is updated to be the key position of the previous frame. When the Y-axis coordinate difference is not large (smaller than the threshold), the slide data in the Y-axis direction is also normal, so all the slide data between the two consecutive key positions is normal. The valid first coordinate distance corresponding to the slide is then obtained from the slide data of the two consecutive key positions, the physical distance corresponding to the user's movement is obtained from the first coordinate distance, and the key position of the current frame is updated to be the key position of the previous frame, ready for judging the key position of the next frame.
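The S501-S505 screening can be summarized as a per-frame filter. The sketch below is a minimal illustration of that flow; the class name, method names, and threshold parameters are assumptions, not taken from the original:

```python
class SlideFilter:
    """Per-frame validity screening of touch deltas (after fig. 5)."""

    def __init__(self, min_modulus: float, max_axis_delta: float):
        self.min_modulus = min_modulus        # first coordinate distance threshold
        self.max_axis_delta = max_axis_delta  # second coordinate distance threshold
        self.prev = None                      # key position of the previous frame

    def update(self, current):
        """Return a valid (dx, dy) for this frame, or None if the input is invalid."""
        if self.prev is None:
            self.prev = current               # S501: first recorded key position
            return None
        dx = current[0] - self.prev[0]        # S502: delta to the current frame
        dy = current[1] - self.prev[1]
        self.prev = current                   # current frame becomes the previous frame
        if (dx * dx + dy * dy) ** 0.5 <= self.min_modulus:
            return None                       # S503: too small, jitter or sensor error
        if abs(dx) >= self.max_axis_delta:
            return None                       # S504: X delta too large, external impact
        if abs(dy) >= self.max_axis_delta:
            return None                       # S505: Y delta too large, external impact
        return (dx, dy)                       # valid first coordinate distance (pixels)
```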
Step 406, controlling the target virtual object to rotate in the virtual scene based on the first physical distance and the sliding direction of the sliding operation.
In one possible implementation, a rotation angle is obtained based on the first physical distance and the distance rotation parameter, and rotation of the target virtual object in the virtual scene is controlled based on the rotation angle.
The first physical distance is the physical distance that the user slides on the display screen of the terminal; from this physical distance and the distance rotation parameter, the terminal can determine the rotation angle corresponding to the physical distance.
In one possible implementation, the rotation parameter may be preset by an administrator of the application program corresponding to the virtual scene.
In another possible implementation manner, the user corresponding to the terminal may set the rotation parameter through a setting interface.
For example, in a game application, the user can open a settings interface from the virtual scene interface and set the magnitude of the rotation parameter through a sensitivity control in that settings interface.
The rotation angle is determined based on the physical distance the user slides on the display screen of the terminal, so the rotation angle of the target virtual object is likewise tied to that physical distance. Therefore, for users with the same setting (i.e., the same rotation coefficient), when the target virtual object is rotated on screens of different physical resolutions and different rendering resolutions, the same sliding physical distance should produce the same rotation angle of the virtual scene picture observed by the target virtual object.
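Under these settings the mapping from distance to angle is linear. A minimal hedged sketch, where the sensitivity value stands in for an assumed distance rotation parameter:

```python
def rotation_angle_degrees(physical_distance_inches: float,
                           sensitivity_deg_per_inch: float) -> float:
    """Rotation angle as the product of the slid physical distance and the
    distance rotation parameter, so equal physical slides rotate the view
    by equal angles on any device."""
    return physical_distance_inches * sensitivity_deg_per_inch

# Example: at 90 degrees per inch, a 0.5-inch slide rotates the view 45 degrees.
print(rotation_angle_degrees(0.5, 90.0))  # 45.0
```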
In one possible implementation, a first position coordinate and a second position coordinate of the sliding operation at the physical resolution are obtained, and the sliding direction of the sliding operation is determined according to the transverse coordinate distance and the longitudinal coordinate distance between the first position coordinate and the second position coordinate.
In one possible implementation manner, the sliding direction may be obtained from the two position coordinates (the first position coordinate and the second position coordinate) of the user's sliding operation. From these two position coordinates and the time difference between them, the trend of the velocity vector of the sliding operation per unit time can be obtained, and the rotation angle of the target virtual object can be determined according to that velocity vector trend.
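For illustration, the direction and speed of a slide between two sampled positions could be computed as follows; this is a sketch under the assumption that positions are physical-pixel coordinates and dt is the time between the two samples:

```python
import math

def slide_direction_and_speed(p1, p2, dt):
    """Unit direction vector and speed (pixels per second) of a slide
    from position p1 to position p2 sampled dt seconds apart."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    if length == 0.0 or dt <= 0.0:
        return (0.0, 0.0), 0.0
    return (dx / length, dy / length), length / dt

# Example: a rightward slide of 300 pixels over 0.1 s.
print(slide_direction_and_speed((100, 200), (400, 200), 0.1))  # ((1.0, 0.0), 3000.0)
```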
Referring to fig. 6, a rotation diagram for a high-resolution terminal screen according to an embodiment of the present application is shown. As shown in fig. 6, 600 is the virtual scene displayed by the high-resolution terminal (a terminal whose display screen has a higher physical resolution) before the target virtual object rotates, with a crosshair scale 601 superimposed in the central area of the virtual scene. In this virtual scene, the user starts a touch sliding operation from touch sliding operation position 602, so that the virtual scene 600 becomes the virtual scene 610 displayed by the high-resolution terminal after the target virtual object rotates.

In the virtual scene 610 displayed after the rotation of the target virtual object, a crosshair scale 611 is likewise superimposed in the central area of the virtual scene. As the user's touch sliding operation reaches touch sliding operation position 612, the target virtual object rotates by the amount corresponding to the slid distance, and the crosshair scale 611 in the central area moves accordingly.
Fig. 7 is a schematic diagram illustrating rotation of the target virtual object on a low-resolution terminal screen according to an embodiment of the present application. As shown in fig. 7, 700 is the virtual scene displayed by the low-resolution terminal (i.e., a terminal whose display screen has a lower physical resolution) before the target virtual object rotates. It shows the same position and orientation in the same virtual scene as 600 in fig. 6; however, because fig. 7 corresponds to a lower screen resolution, and the rendering resolution of the virtual scene is also lower at that screen resolution, the virtual scene contains fewer pixels than the one displayed in fig. 6, and thus less picture content is displayed.

In fig. 7, a crosshair scale 701 is superimposed in the central area of the virtual scene 700. In the virtual scene displayed before the target virtual object rotates, the user's touch sliding operation position is 702; the user starts a touch sliding operation from touch sliding operation position 702, so that the target virtual object rotates and the virtual scene 700 becomes the virtual scene 710 displayed by the low-resolution terminal after the target virtual object rotates.

In the virtual scene 710 displayed after the rotation of the target virtual object, a crosshair scale 711 is likewise superimposed in the central area of the virtual scene. As the user's touch sliding operation reaches touch sliding operation position 712, the target virtual object rotates by the amount corresponding to the slid distance, and the crosshair scale 711 in the central area moves accordingly.
In fig. 6 and fig. 7, the distance moved by the user on screens with different physical resolutions is the same, and as can be seen from the relative positions of the crosshair scale and the surrounding buildings in the two figures, the picture rotates by the same angle. That is, fig. 6 and fig. 7 show that, as required by the embodiments of the present application, moving the same physical distance on screens with different physical resolutions produces the same angle of picture rotation. In a first-person shooter game, aiming is the core of the game, and aiming accurately is a core skill that is difficult for all shooting players to master. On the computer side, the player can operate the mouse within a sufficiently wide area; on the mobile phone side, the limited operation area greatly constrains the player's aiming experience.
Rotating the viewing angle is the only way to aim on the mobile phone side. The series of algorithm designs previously made for mapping the sliding-screen speed, intended to address the difficulty of rotating the viewing angle in each scene on the mobile side, are affected by devices of various sizes and resolutions, so a consistent screen-sliding effect cannot be achieved by algorithm design of that kind alone. Through the scheme shown in the embodiments of the present application, the picture rotates by the same angle when the same physical distance is moved on screens of different physical resolutions and different sizes. This unifies lens rotation across different devices, improves user experience, allows the user to accurately judge the lens rotation angle from the sliding distance on the terminal's display screen, and improves the accuracy of interaction between the user and the terminal.
The scheme shown in the embodiments of the present application satisfies the following points: invalid screen-sliding input is screened out, the screen-sliding experience is consistent across different devices, and the screen-sliding steering meets the player's operation expectations.

The meaning of screening out invalid input is that the player's hand on the screen is not in a stable state and is continuously accompanied by jitter of a certain frequency; if such input is not screened out, the player's lens will jitter strongly, ultimately affecting the player's game experience. Ensuring the screen-sliding experience across different devices means that the scheme shown in the embodiments of the present application takes effect correctly on different devices: if a lens can easily be rotated 90 degrees on one device, moving the same physical distance on a small-screen mobile phone or on a large-screen tablet computer should likewise rotate it 90 degrees. The screen-sliding steering meeting the player's operation expectations means that the rotation of the lens viewing angle corresponds one-to-one with the player's operation behavior, providing accurate operation feedback.
In summary, according to the virtual object control method provided by the present application, a user performs a sliding operation on the terminal corresponding to the virtual scene picture; the terminal obtains the distance parameter corresponding to the sliding operation, determines from that parameter the physical distance the user slid on the terminal screen, and rotates the target virtual object based on that physical distance. With this scheme, the terminal calculates, from a coordinate distance value on the virtual scene picture, the physical distance of the user's manual operation corresponding to that value, and determines the rotation angle of the target virtual object from the physical distance, so that the rotation angle corresponds to the physical distance of the user's manual operation. The user can thus estimate, from the physical distance available for operation, the angle through which the viewing angle of the target virtual object will rotate, improving the accuracy of the viewing-angle rotation corresponding to the target virtual object.
Referring to fig. 8, a schematic diagram of a virtual object control flow is shown according to an exemplary embodiment. As shown in fig. 8, the example is a game application client on a mobile terminal device. S801, the coordinate distance between two frames is acquired: the terminal obtains, through the API, the coordinates corresponding to the user's touch sliding operation in two consecutive frames of the virtual scene picture, derives from them the coordinate distance corresponding to the touch sliding operation between the two frames, and determines the operation to be valid through the valid-operation judgment shown in fig. 5. S802, after the touch sliding operation between the two consecutive frames is determined to be valid, the display pixel information of the device is acquired, including the physical resolution of the device and the rendering resolution corresponding to the virtual scene picture. S803, the device PPI corresponding to the terminal is acquired. S804, from the physical resolution of the device, the rendering resolution, the device PPI, and the coordinate distance (i.e., the pixel distance) corresponding to the touch sliding operation across the two consecutive frames, the corresponding physical distance (i.e., the actual distance the user slid on the screen) is obtained. S805, the rotation angle of the virtual scene picture corresponding to the physical distance is determined as the product of that physical distance and the sliding coefficient, and the target virtual object is rotated by that angle, changing the virtual scene picture observed by the target virtual object.
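Putting S801-S805 together, one per-frame pipeline might look like the sketch below, which reuses the illustrative SlideFilter and slide_distance_inches helpers sketched earlier; the composition and all names are assumptions, not the original implementation:

```python
def on_frame(slide_filter, render_res, screen_res, ppi, sensitivity, touch_pos):
    """One frame of the S801-S805 flow: screen the input, rescale it to
    physical pixels, convert to inches, and return the rotation angle."""
    delta = slide_filter.update(touch_pos)   # S801: two-frame delta plus validity check
    if delta is None:
        return 0.0                           # invalid input: no rotation this frame
    render_w, render_h = render_res          # S802: display pixel information
    screen_w, screen_h = screen_res
    inches = slide_distance_inches(delta[0], delta[1],
                                   render_w, render_h,
                                   screen_w, screen_h, ppi)  # S803-S804: use device PPI
    return inches * sensitivity              # S805: angle = physical distance x coefficient
```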
Fig. 9 shows a flowchart of a virtual object control method according to an exemplary embodiment of the present application. The virtual object control method may be performed by a computer device, which may be a terminal, a server, or the computer device may include the terminal and the server. As shown in fig. 9, the virtual object control method includes:
step 901, displaying a virtual scene picture corresponding to a virtual scene; the virtual scene contains a target virtual object.
Step 902, in response to a sliding operation on a virtual scene screen, controlling the target virtual object to rotate in the virtual scene based on a physical distance of sliding of the sliding operation and a sliding direction of the sliding operation;
wherein, in terminals with different screen sizes and resolutions, the angles of rotation of the target virtual object in the virtual scene are the same under the control of the sliding operation of the same physical distance.
In summary, according to the virtual object control method provided by the present application, a user performs a sliding operation on the terminal corresponding to the virtual scene picture; the terminal obtains the distance parameter corresponding to the sliding operation, determines from that parameter the physical distance the user slid on the terminal screen, and rotates the target virtual object based on that physical distance. With this scheme, the terminal calculates, from a coordinate distance value on the virtual scene picture, the physical distance of the user's manual operation corresponding to that value, and determines the rotation angle of the target virtual object from the physical distance, so that the rotation angle corresponds to the physical distance of the user's manual operation. The user can thus estimate, from the physical distance available for operation, the angle through which the viewing angle of the target virtual object will rotate, improving the accuracy of the viewing-angle rotation corresponding to the target virtual object.
Fig. 10 shows a block diagram of a virtual object control apparatus according to an exemplary embodiment of the present application. The virtual object control device may be applied to a computer device, and the computer device may be a terminal, or may be a terminal and a server, where the terminal may be a terminal shown in fig. 1. As shown in fig. 10, the virtual object control apparatus includes:
the virtual scene picture display module 1001 is configured to display a virtual scene picture corresponding to a virtual scene; the virtual scene comprises a target virtual object;
a first coordinate distance obtaining module 1002, configured to obtain a first coordinate distance corresponding to a sliding operation on the virtual scene screen in response to the sliding operation; the first coordinate distance is a distance between pixel coordinates between a start point and an end point of the sliding operation;
a first physical distance obtaining module 1003, configured to obtain a first physical distance corresponding to the first coordinate distance; the first physical distance is a physical distance at which the sliding operation slides;
and a virtual object rotation module 1004, configured to control the target virtual object to rotate in the virtual scene based on the first physical distance and the sliding direction of the sliding operation.
In one possible implementation, the apparatus further includes:
the resolution obtaining module is used for obtaining the rendering resolution of the virtual scene picture and displaying the physical resolution of the display screen of the virtual scene picture;
the first coordinate distance acquisition module 1002 is configured to,
and responding to the sliding operation on the virtual scene picture, and acquiring the first coordinate distance based on the rendering resolution and the physical resolution.
In one possible implementation, the rendering resolution is the same as the physical resolution;
the first coordinate distance acquisition module 1002 is configured to,
in response to receiving the sliding operation, and the rendering resolution is the same as the physical resolution, the first coordinate distance is acquired based on the physical resolution.
In one possible implementation manner, the first coordinate distance obtaining module 1002 includes:
a position coordinate acquisition unit configured to acquire a first position coordinate and a second position coordinate of the sliding operation at the physical resolution; the first position coordinates are physical pixel coordinates of a start point of the sliding operation; the second position coordinates are physical pixel coordinates of an end point of the sliding operation;
And the first coordinate distance acquisition unit is used for acquiring the pixel distance between the first position coordinate and the second position coordinate as the first coordinate distance.
In one possible implementation manner, the first coordinate distance obtaining module 1002 includes:
a second coordinate distance obtaining unit, configured to obtain a second coordinate distance corresponding to the sliding operation based on the rendering resolution in response to receiving the sliding operation, where the rendering resolution is smaller than the physical resolution;
a scaling value obtaining unit, configured to obtain a scaling value based on the rendering resolution and the physical resolution;
and the first coordinate distance acquisition unit is used for acquiring the first coordinate distance based on the second coordinate distance and the scaling value.
In one possible implementation manner, the second coordinate distance acquiring unit includes:
a position coordinate acquisition subunit, configured to acquire a third position coordinate and a fourth position coordinate of the sliding operation under the rendering resolution; the third position coordinates are rendering pixel coordinates of a start point of the sliding operation; the fourth position coordinates are rendering pixel coordinates of an end point of the sliding operation;
And the coordinate distance acquisition subunit is used for acquiring the pixel distance between the third position coordinate and the fourth position coordinate as the second coordinate distance.
In one possible implementation, the apparatus further includes:
the pixel density acquisition module is used for acquiring the pixel density of a display screen for displaying the virtual scene picture; the pixel density is used for indicating the corresponding relation between the pixel value and the size of the terminal;
a first physical distance obtaining module 1003, configured to obtain the ratio of the first coordinate distance to the pixel density as the first physical distance corresponding to the first coordinate distance.
In one possible implementation manner, the first physical distance acquiring module is configured to:
and acquiring the first physical distance corresponding to the first coordinate distance in response to the relation between the first coordinate distance and a coordinate distance threshold meeting a specified condition.
In one possible implementation, the coordinate distance threshold includes a first coordinate distance threshold; the apparatus further comprises:
and the first specified condition determining module is used for determining that the relation between the first coordinate distance and the coordinate distance threshold meets the specified condition in response to the fact that the modulus of the first coordinate distance is larger than the first coordinate distance threshold.
In one possible implementation, the coordinate distance threshold includes a second coordinate distance threshold; the first coordinate distance comprises a first abscissa distance and a first ordinate distance; the apparatus further comprises:
and a second specified condition determining module configured to determine that a relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition in response to the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold.
In one possible implementation, the coordinate distance threshold includes a first coordinate distance threshold and a second coordinate distance threshold; the first coordinate distance comprises a first abscissa distance and a first ordinate distance;
the apparatus further comprises:
and a third specified condition determining module, configured to determine that a relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition in response to a modulus of the first coordinate distance being greater than the first coordinate distance threshold and the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold.
In summary, according to the virtual object control method provided by the present application, a user performs a sliding operation on the terminal corresponding to the virtual scene picture; the terminal obtains the distance parameter corresponding to the sliding operation, determines from that parameter the physical distance the user slid on the terminal screen, and rotates the target virtual object based on that physical distance. With this scheme, the terminal calculates, from a coordinate distance value on the virtual scene picture, the physical distance of the user's manual operation corresponding to that value, and determines the rotation angle of the target virtual object from the physical distance, so that the rotation angle corresponds to the physical distance of the user's manual operation. The user can thus estimate, from the physical distance available for operation, the angle through which the viewing angle of the target virtual object will rotate, improving the accuracy of the viewing-angle rotation corresponding to the target virtual object.
Fig. 11 shows a block diagram of a virtual object control apparatus according to an exemplary embodiment of the present application. The virtual object control device may be applied to a computer device, and the computer device may be a terminal, or may be a terminal and a server, where the terminal may be a terminal shown in fig. 1. As shown in fig. 11, the virtual object control apparatus includes:
the scene picture display module 1101 is configured to display a virtual scene picture corresponding to a virtual scene; the virtual scene comprises a target virtual object;
a virtual object control module 1102, configured to control, in response to a sliding operation on a virtual scene screen, rotation of the target virtual object in the virtual scene based on a physical distance of the sliding operation sliding and a sliding direction of the sliding operation;
wherein, in terminals with different screen sizes and resolutions, the angles of rotation of the target virtual objects in the virtual scene are the same under the control of the sliding operation of the same physical distance.
In summary, according to the virtual object control method provided by the present application, a user performs a sliding operation on the terminal corresponding to the virtual scene picture; the terminal obtains the distance parameter corresponding to the sliding operation, determines from that parameter the physical distance the user slid on the terminal screen, and rotates the target virtual object based on that physical distance. With this scheme, the terminal calculates, from a coordinate distance value on the virtual scene picture, the physical distance of the user's manual operation corresponding to that value, and determines the rotation angle of the target virtual object from the physical distance, so that the rotation angle corresponds to the physical distance of the user's manual operation. The user can thus estimate, from the physical distance available for operation, the angle through which the viewing angle of the target virtual object will rotate, improving the accuracy of the viewing-angle rotation corresponding to the target virtual object.
Fig. 12 is a block diagram illustrating a computer device 1200, according to an example embodiment. The computer device may be implemented as a server in the above-described aspects of the present application.
The computer apparatus 1200 includes a central processing unit (Central Processing Unit, CPU) 1201, a system Memory 1204 including a random access Memory (Random Access Memory, RAM) 1202 and a Read-Only Memory (ROM) 1203, and a system bus 1205 connecting the system Memory 1204 and the central processing unit 1201. The computer device 1200 also includes a basic Input/Output system (I/O) 1206, which helps to transfer information between various devices within the computer, and a mass storage device 1207, which stores an operating system 1213, application programs 1214, and other program modules 1215.
The basic input/output system 1206 includes a display 1208 for displaying information and an input device 1209, such as a mouse, keyboard, etc., for user input of information. Wherein the display 1208 and the input device 1209 are coupled to the central processing unit 1201 via an input-output controller 1210 coupled to a system bus 1205. The basic input/output system 1206 may also include an input/output controller 1210 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input output controller 1210 also provides output to a display screen, a printer, or other type of output device.
The mass storage device 1207 is connected to the central processing unit 1201 through a mass storage controller (not shown) connected to the system bus 1205. The mass storage device 1207 and its associated computer-readable media provide non-volatile storage for the computer device 1200. That is, the mass storage device 1207 may include a computer-readable medium (not shown), such as a hard disk or a Compact Disc Read-Only Memory (CD-ROM) drive.
The computer-readable medium may include computer storage media and communication media without loss of generality. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other solid-state memory technology, CD-ROM, Digital Versatile Disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will recognize that computer storage media are not limited to those described above. The system memory 1204 and the mass storage device 1207 described above may be collectively referred to as memory.
According to various embodiments of the present disclosure, the computer device 1200 may also operate by connecting, through a network such as the Internet, to a remote computer on the network. That is, the computer device 1200 may be connected to the network 1212 through a network interface unit 1211 coupled to the system bus 1205, or the network interface unit 1211 may be used to connect to other types of networks or remote computer systems (not shown).
The memory further includes at least one instruction, at least one program, a code set, or an instruction set, where the at least one instruction, the at least one program, the code set, or the instruction set is stored in the memory, and the central processing unit 1201 implements all or part of the steps in the flowcharts of the virtual object control method shown in the above embodiments by executing the at least one instruction, the at least one program, the code set, or the instruction set.
Fig. 13 is a block diagram of a computer device 1300, shown in accordance with an exemplary embodiment. The computer device 1300 may be a terminal such as a smart phone, tablet, MP3 player (Moving Picture Experts Group Audio Layer III, mpeg 3), MP4 (Moving Picture Experts Group Audio Layer IV, mpeg 4) player, notebook or desktop. The computer device 1300 may also be referred to by other names of user devices, portable terminals, laptop terminals, desktop terminals, and the like.
In general, the computer device 1300 includes: a processor 1301, and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor or an 8-core processor. Processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). Processor 1301 may also include a main processor and a coprocessor; the main processor, also called a CPU (Central Processing Unit), is a processor for processing data in an awake state, while the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, processor 1301 may be integrated with a GPU (Graphics Processing Unit) for rendering and drawing the content that the display screen needs to display. In some embodiments, processor 1301 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. Memory 1302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the virtual object control method provided by the method embodiments herein.
In some embodiments, the computer device 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. The processor 1301, the memory 1302, and the peripheral interface 1303 may be connected by a bus or signal lines. The respective peripheral devices may be connected to the peripheral device interface 1303 through a bus, a signal line, or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, a display screen 1305, a camera assembly 1306, audio circuitry 1307, and a power supply 1308.
The peripheral interface 1303 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 1301 and the memory 1302. In some embodiments, the processor 1301, the memory 1302, and the peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1304 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of each generation (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1305 is a touch display, the display 1305 also has the ability to capture touch signals at or above the surface of the display 1305. The touch signal may be input to the processor 1301 as a control signal for processing. At this point, the display 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 1305 may be one, providing a front panel of the computer apparatus 1300; in other embodiments, the display screen 1305 may be at least two, disposed on different surfaces of the computer apparatus 1300 or in a folded design; in still other embodiments, the display 1305 may be a flexible display disposed on a curved surface or a folded surface of the computer apparatus 1300. Even more, the display screen 1305 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display screen 1305 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, or the main camera and the wide-angle camera can be fused to realize panoramic shooting and Virtual Reality (VR) shooting functions or other fused shooting functions. In some embodiments, camera assembly 1306 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for voice communication. For purposes of stereo acquisition or noise reduction, the microphone may be multiple, each disposed at a different location of the computer device 1300. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is then used to convert electrical signals from the processor 1301 or the radio frequency circuit 1304 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 1307 may also comprise a headphone jack.
The power supply 1308 is used to power the various components in the computer device 1300. The power source 1308 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power source 1308 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, computer device 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyroscope sensor 1312, pressure sensor 1313, optical sensor 1314, and proximity sensor 1315.
The acceleration sensor 1311 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the computer apparatus 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. Processor 1301 may control display screen 1305 to display a user interface in either a landscape view or a portrait view based on gravitational acceleration signals acquired by acceleration sensor 1311. The acceleration sensor 1311 may also be used for the acquisition of motion data of a game or user.
The gyro sensor 1312 may detect a body direction and a rotation angle of the computer apparatus 1300, and the gyro sensor 1312 may collect a 3D motion of the user on the computer apparatus 1300 in cooperation with the acceleration sensor 1311. Processor 1301 can implement the following functions based on the data collected by gyro sensor 1312: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
Pressure sensor 1313 may be disposed on a side frame of computer device 1300 and/or below display screen 1305. When the pressure sensor 1313 is disposed on the side frame of the computer apparatus 1300, a grip signal of the computer apparatus 1300 by the user may be detected, and the processor 1301 may perform left-right hand recognition or shortcut operation according to the grip signal collected by the pressure sensor 1313. When the pressure sensor 1313 is disposed at the lower layer of the display screen 1305, the processor 1301 realizes control of the operability control on the UI interface according to the pressure operation of the user on the display screen 1305. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1314 is used to collect ambient light intensity. In one embodiment, processor 1301 may control the display brightness of display screen 1305 based on the intensity of ambient light collected by optical sensor 1314. Specifically, when the intensity of the ambient light is high, the display brightness of the display screen 1305 is turned up; when the ambient light intensity is low, the display brightness of the display screen 1305 is turned down. In another embodiment, processor 1301 may also dynamically adjust the shooting parameters of camera assembly 1306 based on the intensity of ambient light collected by optical sensor 1314.
The proximity sensor 1315, also referred to as a distance sensor, is typically disposed on the front panel of the computer device 1300. The proximity sensor 1315 is used to collect the distance between the user and the front of the computer device 1300. In one embodiment, when proximity sensor 1315 detects that the distance between the user and the front of computer device 1300 gradually decreases, processor 1301 controls display screen 1305 to switch from the bright-screen state to the off-screen state; when proximity sensor 1315 detects that the distance between the user and the front of computer device 1300 gradually increases, processor 1301 controls display screen 1305 to switch from the off-screen state to the bright-screen state.
Those skilled in the art will appreciate that the architecture shown in fig. 13 is not limiting as to the computer device 1300, and may include more or fewer components than shown, or may combine certain components, or employ a different arrangement of components.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, for example, a memory including at least one instruction, at least one program, a code set, or an instruction set, which is executable by a processor to perform all or part of the steps of the methods shown in the embodiments corresponding to fig. 3, fig. 4, or fig. 9. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product or a computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium and executes the computer instructions to cause the computer device to perform all or part of the steps of the methods described above with respect to the corresponding embodiments of fig. 3, 4, or 9.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (17)

1. A virtual object control method, the method comprising:
displaying a virtual scene picture corresponding to the virtual scene; the virtual scene comprises a target virtual object;
acquiring the rendering resolution of the virtual scene picture and displaying the physical resolution of a display screen of the virtual scene picture;
in response to receiving a sliding operation, recording a key position of a previous frame and a key position of a current frame;
updating the key position of the current frame to be the key position of the previous frame in response to the vector modulus value between the key position of the current frame and the key position of the previous frame being not greater than a first coordinate distance threshold; determining an X-axis coordinate difference between the current key position and the key position of the previous frame in response to the vector modulus value being greater than the first coordinate distance threshold;

updating the key position of the current frame to be the key position of the previous frame in response to the X-axis coordinate difference being greater than a second coordinate distance threshold; determining a Y-axis coordinate difference between the current key position and the key position of the previous frame in response to the X-axis coordinate difference being not greater than the second coordinate distance threshold;

updating the key position of the current frame to be the key position of the previous frame in response to the Y-axis coordinate difference being greater than the second coordinate distance threshold; determining that a relationship between a coordinate distance and a coordinate distance threshold satisfies a specified condition in response to the Y-axis coordinate difference being not greater than the second coordinate distance threshold, the coordinate distance threshold comprising the first coordinate distance threshold and the second coordinate distance threshold;
in response to the relationship between the coordinate distance and the coordinate distance threshold satisfying the specified condition, and the rendering resolution being smaller than the physical resolution, acquiring a second coordinate distance corresponding to the sliding operation, based on the rendering resolution, according to the sliding data of the key positions of the two consecutive frames;
acquiring a scaling value based on the rendering resolution and the physical resolution; acquiring a first coordinate distance based on the second coordinate distance and the scaling value; the first coordinate distance is a distance between pixel coordinates between a start point and an end point of the sliding operation;
acquiring a first physical distance corresponding to the first coordinate distance, the first physical distance being the physical distance that the user needs to move, on the terminal device corresponding to the virtual scene picture, when performing the sliding operation in order to move the distance between the pixel coordinates;
determining a rotation angle of the target virtual object based on the first physical distance and a distance rotation parameter, wherein the first physical distance corresponds to the rotation angle of the target virtual object; in the terminals with different screen sizes and different physical resolutions, under the control of sliding operation of the same physical distance, the rotation angles of the target virtual objects in the virtual scene are the same; or determining a first position coordinate and a second position coordinate of the sliding operation under the physical resolution, and acquiring a speed vector trend of the sliding operation in unit time according to a transverse coordinate distance and a longitudinal coordinate distance of the first position coordinate and the second position coordinate and a time difference of the first position coordinate and the second position coordinate, wherein the first position coordinate is a physical pixel coordinate of a starting point of the sliding operation, and the second position coordinate is a physical pixel coordinate of an ending point of the sliding operation; determining the rotation angle according to the speed vector trend;
And controlling the target virtual object to rotate in the virtual scene according to the rotation angle and the sliding direction of the sliding operation, wherein the sliding direction is determined according to the transverse coordinate distance and the longitudinal coordinate distance of the first position coordinate and the second position coordinate.
2. The method of claim 1, wherein the rendering resolution is the same as the physical resolution;
the method further comprises the steps of:
in response to receiving the sliding operation, and the rendering resolution is the same as the physical resolution, the first coordinate distance is acquired based on the physical resolution.
3. The method of claim 2, wherein the obtaining the first coordinate distance based on the physical resolution in response to receiving the sliding operation and the rendering resolution being the same as the physical resolution comprises:
in response to receiving the sliding operation, acquiring the first position coordinate and the second position coordinate of the sliding operation under the physical resolution;
and acquiring the pixel distance between the first position coordinate and the second position coordinate as the first coordinate distance.
4. The method of claim 1, wherein, prior to acquiring the first physical distance corresponding to the first coordinate distance, the method further comprises:
acquiring the pixel density of the display screen showing the virtual scene picture, wherein the pixel density indicates the correspondence between pixel values and the physical size of the terminal device;
and acquiring the first physical distance corresponding to the first coordinate distance comprises:
acquiring the ratio of the first coordinate distance to the pixel density as the first physical distance corresponding to the first coordinate distance.
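The ratio in claim 4 is what makes the control device-independent: dividing a pixel distance by the screen's pixel density yields a length in physical units, so different screens agree on how far the finger actually travelled. A minimal sketch with invented numbers:

```python
import math

# Illustrative only: the numbers below are invented, not taken from the patent.
def physical_distance(first_coordinate_distance, pixel_density):
    # Claim 4: the first physical distance is the ratio of the first
    # coordinate distance to the pixel density of the display screen.
    return first_coordinate_distance / pixel_density

# A 300 px slide on a 450 ppi screen and a 200 px slide on a 300 ppi screen
# are the same physical movement (two thirds of an inch in each case).
assert math.isclose(physical_distance(300, 450), physical_distance(200, 300))
```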
5. The method of claim 1, wherein, prior to acquiring the first physical distance corresponding to the first coordinate distance, the method further comprises:
in response to the modulus of the first coordinate distance being greater than the first coordinate distance threshold, determining that the relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition.
6. The method of claim 1, wherein the first coordinate distance comprises a first abscissa distance and a first ordinate distance;
and prior to acquiring the first physical distance corresponding to the first coordinate distance, the method further comprises:
in response to the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold, determining that the relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition.
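Claims 5 and 6 together amount to a band-pass check on the slide vector: its modulus must exceed the first threshold (rejecting finger jitter) while each axis component stays below the second threshold (rejecting implausible jumps). A sketch under the assumption that both thresholds are expressed in pixels, with hypothetical values:

```python
import math

FIRST_THRESHOLD = 2.0     # hypothetical: minimum modulus for an intentional move
SECOND_THRESHOLD = 120.0  # hypothetical: maximum plausible per-axis jump per frame

def satisfies_specified_condition(dx, dy):
    # Claim 5: the modulus of the first coordinate distance must exceed
    # the first coordinate distance threshold (filters out finger jitter).
    if math.hypot(dx, dy) <= FIRST_THRESHOLD:
        return False
    # Claim 6: the abscissa and ordinate distances must each stay below
    # the second coordinate distance threshold (filters out sudden jumps).
    return abs(dx) < SECOND_THRESHOLD and abs(dy) < SECOND_THRESHOLD
```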
7. A virtual object control method, applied to a terminal, the method comprising:
displaying a virtual scene picture corresponding to a virtual scene, wherein the virtual scene comprises a target virtual object;
acquiring the rendering resolution of the virtual scene picture and the physical resolution of the display screen displaying the virtual scene picture; in response to a sliding operation on the virtual scene picture, recording the key position of the previous frame and the key position of the current frame;
in response to the vector modulus between the key positions of the current frame and the previous frame being not greater than a first coordinate distance threshold, updating the key position of the current frame to the key position of the previous frame; in response to the vector modulus being greater than the first coordinate distance threshold, determining the lateral axis difference between the key positions of the current frame and the previous frame;
in response to the lateral axis difference being greater than a second coordinate distance threshold, updating the key position of the current frame to the key position of the previous frame; in response to the lateral axis difference being not greater than the second coordinate distance threshold, determining the longitudinal axis difference between the key positions of the current frame and the previous frame;
in response to the longitudinal axis difference being greater than the second coordinate distance threshold, updating the key position of the current frame to the key position of the previous frame; in response to the longitudinal axis difference being not greater than the second coordinate distance threshold, determining that the relationship between the coordinate distance and the coordinate distance threshold satisfies a specified condition, the coordinate distance threshold comprising the first coordinate distance threshold and the second coordinate distance threshold;
in response to the relationship between the coordinate distance and the coordinate distance threshold satisfying the specified condition, and the rendering resolution being smaller than the physical resolution, acquiring a second coordinate distance corresponding to the sliding operation based on the rendering resolution according to the sliding data of the key positions of two consecutive frames;
acquiring a scaling value based on the rendering resolution and the physical resolution, and acquiring a first coordinate distance based on the second coordinate distance and the scaling value, wherein the first coordinate distance is the distance in pixel coordinates between the start point and the end point of the sliding operation;
acquiring a first physical distance corresponding to the first coordinate distance, wherein the first physical distance is the physical distance a user must move on the terminal device displaying the virtual scene picture in order to perform the sliding operation across that pixel-coordinate distance;
determining a rotation angle of the target virtual object based on the first physical distance and a distance rotation parameter, wherein the first physical distance corresponds to the rotation angle of the target virtual object, so that on terminals with different screen sizes and different physical resolutions, sliding operations over the same physical distance produce the same rotation angle of the target virtual object in the virtual scene; or determining a first position coordinate and a second position coordinate of the sliding operation at the physical resolution, wherein the first position coordinate is the physical pixel coordinate of the start point of the sliding operation and the second position coordinate is the physical pixel coordinate of the end point, acquiring a velocity vector trend of the sliding operation per unit time according to the lateral and longitudinal coordinate distances between the first and second position coordinates and the time difference between them, and determining the rotation angle according to the velocity vector trend;
and controlling the target virtual object to rotate in the virtual scene according to the rotation angle and the sliding direction of the sliding operation, wherein the sliding direction is determined according to the lateral and longitudinal coordinate distances between the first position coordinate and the second position coordinate.
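The per-frame chain in claim 7 can be read as a small input filter. The sketch below is one plausible reading, in which a rejected frame leaves the previous key position in place as the anchor; the threshold semantics follow the claim, but the carry-forward behaviour and all values are assumptions.

```python
def filter_frame(prev, cur, first_threshold, second_threshold):
    """One plausible reading of the claim-7 filter.

    Returns (accepted, anchor): whether this frame's slide data is used,
    and the key position to treat as the previous frame next time.
    """
    dx = cur[0] - prev[0]  # lateral axis difference
    dy = cur[1] - prev[1]  # longitudinal axis difference
    modulus = (dx * dx + dy * dy) ** 0.5

    if modulus <= first_threshold:
        # Vector modulus not greater than the first threshold: jitter;
        # the current key position is replaced by the previous one.
        return False, prev
    if abs(dx) > second_threshold or abs(dy) > second_threshold:
        # An axis difference exceeds the second threshold: outlier; discard.
        return False, prev
    # The relationship between the coordinate distance and the thresholds
    # satisfies the specified condition: use this frame's slide data.
    return True, cur
```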
8. A virtual object control apparatus, the apparatus comprising:
a virtual scene picture display module, configured to display a virtual scene picture corresponding to a virtual scene, wherein the virtual scene comprises a target virtual object;
a resolution acquisition module, configured to acquire the rendering resolution of the virtual scene picture and the physical resolution of the display screen displaying the virtual scene picture;
a first coordinate distance acquisition module comprising a second coordinate distance acquisition unit, a scaling value acquisition unit and a first coordinate distance acquisition unit, wherein the second coordinate distance acquisition unit is configured to: in response to receiving a sliding operation, record the key position of the previous frame and the key position of the current frame; in response to the vector modulus between the key positions of the current frame and the previous frame being not greater than a first coordinate distance threshold, update the key position of the current frame to the key position of the previous frame; in response to the vector modulus being greater than the first coordinate distance threshold, determine the lateral axis difference between the key positions of the current frame and the previous frame; in response to the lateral axis difference being greater than a second coordinate distance threshold, update the key position of the current frame to the key position of the previous frame; in response to the lateral axis difference being not greater than the second coordinate distance threshold, determine the longitudinal axis difference between the key positions of the current frame and the previous frame; in response to the longitudinal axis difference being greater than the second coordinate distance threshold, update the key position of the current frame to the key position of the previous frame; in response to the longitudinal axis difference being not greater than the second coordinate distance threshold, determine that the relationship between the coordinate distance and the coordinate distance threshold satisfies a specified condition, the coordinate distance threshold comprising the first coordinate distance threshold and the second coordinate distance threshold; and in response to the relationship between the coordinate distance and the coordinate distance threshold satisfying the specified condition, and the rendering resolution being smaller than the physical resolution, acquire a second coordinate distance corresponding to the sliding operation based on the rendering resolution according to the sliding data of the key positions of two consecutive frames;
the scaling value acquisition unit is configured to acquire a scaling value based on the rendering resolution and the physical resolution;
the first coordinate distance acquisition unit is configured to acquire a first coordinate distance based on the second coordinate distance and the scaling value, wherein the first coordinate distance is the distance in pixel coordinates between the start point and the end point of the sliding operation;
a first physical distance acquisition module, configured to acquire a first physical distance corresponding to the first coordinate distance, wherein the first physical distance is the physical distance a user must move on the terminal device displaying the virtual scene picture in order to perform the sliding operation across that pixel-coordinate distance;
and a virtual object rotation module, configured to determine a rotation angle of the target virtual object based on the first physical distance and a distance rotation parameter, wherein the first physical distance corresponds to the rotation angle of the target virtual object, so that on terminals with different screen sizes and different physical resolutions, sliding operations over the same physical distance produce the same rotation angle of the target virtual object in the virtual scene; or determine a first position coordinate and a second position coordinate of the sliding operation at the physical resolution, wherein the first position coordinate is the physical pixel coordinate of the start point of the sliding operation and the second position coordinate is the physical pixel coordinate of the end point, acquire a velocity vector trend of the sliding operation per unit time according to the lateral and longitudinal coordinate distances between the first and second position coordinates and the time difference between them, and determine the rotation angle according to the velocity vector trend; and control the target virtual object to rotate in the virtual scene according to the rotation angle and the sliding direction of the sliding operation, wherein the sliding direction is determined according to the lateral and longitudinal coordinate distances between the first position coordinate and the second position coordinate.
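For the rotation module's alternative branch, the claims state only that a velocity vector trend is obtained from the per-axis coordinate distances and the time difference, and that the angle is determined "according to" that trend. The mapping below (`angle_from_trend` with a `gain`) is therefore an assumed placeholder rather than the patented formula:

```python
def velocity_vector_trend(p1, p2, t1, t2):
    """Velocity of the slide per unit time between two physical pixel
    coordinates: p1 sampled at time t1 (start), p2 at time t2 (end)."""
    dt = t2 - t1
    if dt <= 0:
        return (0.0, 0.0)
    dx = p2[0] - p1[0]  # lateral coordinate distance
    dy = p2[1] - p1[1]  # longitudinal coordinate distance
    return (dx / dt, dy / dt)

def angle_from_trend(trend, gain=0.05):
    # Assumed placeholder: faster slides rotate further. The claims do not
    # specify this mapping, only that the angle follows the trend.
    vx, vy = trend
    return gain * (vx * vx + vy * vy) ** 0.5
```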
9. The apparatus of claim 8, wherein the rendering resolution is the same as the physical resolution, and the first coordinate distance acquisition module is configured to:
in response to receiving the sliding operation and the rendering resolution being the same as the physical resolution, acquire the first coordinate distance based on the physical resolution.
10. The apparatus of claim 9, wherein the first coordinate distance acquisition module comprises:
a position coordinate acquisition unit configured to acquire the first position coordinate and the second position coordinate of the sliding operation at the physical resolution in response to receiving the sliding operation;
and the first coordinate distance acquisition unit is configured to acquire the pixel distance between the first position coordinate and the second position coordinate as the first coordinate distance.
11. The apparatus of claim 8, wherein the apparatus further comprises:
a pixel density acquisition module, configured to acquire the pixel density of the display screen displaying the virtual scene picture, wherein the pixel density indicates the correspondence between pixel values and the physical size of the terminal device;
and the first physical distance acquisition module is configured to acquire the ratio of the first coordinate distance to the pixel density as the first physical distance corresponding to the first coordinate distance.
12. The apparatus of claim 8, wherein the apparatus further comprises:
and a first specified condition determining module, configured to determine, in response to the modulus of the first coordinate distance being greater than the first coordinate distance threshold, that the relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition.
13. The apparatus of claim 8, wherein the first coordinate distance comprises a first abscissa distance and a first ordinate distance; the apparatus further comprises:
and a second specified condition determining module, configured to determine, in response to the first abscissa distance and the first ordinate distance being less than the second coordinate distance threshold, that the relationship between the first coordinate distance and the coordinate distance threshold satisfies the specified condition.
14. A virtual object control apparatus, applied to a terminal, the apparatus comprising:
a virtual scene picture display module, configured to display a virtual scene picture corresponding to a virtual scene, wherein the virtual scene comprises a target virtual object;
a resolution acquisition module, configured to acquire the rendering resolution of the virtual scene picture and the physical resolution of the display screen displaying the virtual scene picture;
a first coordinate distance acquisition module comprising a second coordinate distance acquisition unit, a scaling value acquisition unit and a first coordinate distance acquisition unit, wherein the second coordinate distance acquisition unit is configured to: in response to a sliding operation on the virtual scene picture, record the key position of the previous frame and the key position of the current frame; in response to the vector modulus between the key positions of the current frame and the previous frame being not greater than a first coordinate distance threshold, update the key position of the current frame to the key position of the previous frame; in response to the vector modulus being greater than the first coordinate distance threshold, determine the lateral axis difference between the key positions of the current frame and the previous frame; in response to the lateral axis difference being greater than a second coordinate distance threshold, update the key position of the current frame to the key position of the previous frame; in response to the lateral axis difference being not greater than the second coordinate distance threshold, determine the longitudinal axis difference between the key positions of the current frame and the previous frame; in response to the longitudinal axis difference being greater than the second coordinate distance threshold, update the key position of the current frame to the key position of the previous frame; in response to the longitudinal axis difference being not greater than the second coordinate distance threshold, determine that the relationship between the coordinate distance and the coordinate distance threshold satisfies a specified condition, the coordinate distance threshold comprising the first coordinate distance threshold and the second coordinate distance threshold; and in response to the relationship between the coordinate distance and the coordinate distance threshold satisfying the specified condition, and the rendering resolution being smaller than the physical resolution, acquire a second coordinate distance corresponding to the sliding operation based on the rendering resolution according to the sliding data of the key positions of two consecutive frames;
the scaling value acquisition unit is configured to acquire a scaling value based on the rendering resolution and the physical resolution;
the first coordinate distance acquisition unit is configured to acquire a first coordinate distance based on the second coordinate distance and the scaling value, wherein the first coordinate distance is the distance in pixel coordinates between the start point and the end point of the sliding operation;
a first physical distance acquisition module, configured to acquire a first physical distance corresponding to the first coordinate distance, wherein the first physical distance is the physical distance a user must move on the terminal device displaying the virtual scene picture in order to perform the sliding operation across that pixel-coordinate distance;
and a virtual object rotation module, configured to determine a rotation angle of the target virtual object based on the first physical distance and a distance rotation parameter, wherein the first physical distance corresponds to the rotation angle of the target virtual object, so that on terminals with different screen sizes and different physical resolutions, sliding operations over the same physical distance produce the same rotation angle of the target virtual object in the virtual scene; or determine a first position coordinate and a second position coordinate of the sliding operation at the physical resolution, wherein the first position coordinate is the physical pixel coordinate of the start point of the sliding operation and the second position coordinate is the physical pixel coordinate of the end point, acquire a velocity vector trend of the sliding operation per unit time according to the lateral and longitudinal coordinate distances between the first and second position coordinates and the time difference between them, and determine the rotation angle according to the velocity vector trend; and control the target virtual object to rotate in the virtual scene according to the rotation angle and the sliding direction of the sliding operation, wherein the sliding direction is determined according to the lateral and longitudinal coordinate distances between the first position coordinate and the second position coordinate.
15. A computer device comprising a processor and a memory, wherein the memory has stored therein at least one program that is loaded and executed by the processor to implement the virtual object control method of any one of claims 1 to 7.
16. A computer-readable storage medium, wherein at least one program is stored in the storage medium, the at least one program being loaded and executed by a processor to implement the virtual object control method according to any one of claims 1 to 7.
17. A computer program product comprising computer instructions which, when executed by a processor, implement a virtual object control method as claimed in any one of claims 1 to 7.
CN202011413036.4A 2020-12-04 2020-12-04 Virtual object control method, device, computer equipment and storage medium Active CN112451969B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011413036.4A CN112451969B (en) 2020-12-04 2020-12-04 Virtual object control method, device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112451969A CN112451969A (en) 2021-03-09
CN112451969B (en) 2023-04-21

Family

ID=74800113

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011413036.4A Active CN112451969B (en) 2020-12-04 2020-12-04 Virtual object control method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112451969B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113633976B (en) * 2021-08-16 2023-06-20 腾讯科技(深圳)有限公司 Operation control method, device, equipment and computer readable storage medium
CN113633975B (en) * 2021-08-19 2023-10-20 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, terminal and storage medium
CN117679745B (en) * 2024-02-01 2024-04-12 南京维赛客网络科技有限公司 Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101763234A (en) * 2009-12-29 2010-06-30 广东威创视讯科技股份有限公司 Method and device for simulating various screen resolutions
US20200316470A1 (en) * 2018-04-27 2020-10-08 Tencent Technology (Shenzhen) Company Limited Method and terminal for displaying distance information in virtual scene

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN110652725A (en) * 2019-09-29 2020-01-07 网易(杭州)网络有限公司 Method and device for controlling aiming direction in game, electronic equipment and storage medium
CN111679739B (en) * 2020-06-04 2024-04-09 京东方科技集团股份有限公司 Readable storage medium, virtual reality device, control method and control device thereof
CN111970454B (en) * 2020-09-10 2023-01-13 青岛鳍源创新科技有限公司 Shot picture display method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN112451969A (en) 2021-03-09

Similar Documents

Publication Publication Date Title
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
CN108619721B (en) Distance information display method and device in virtual scene and computer equipment
CN109529319B (en) Display method and device of interface control and storage medium
CN108815851B (en) Interface display method, equipment and storage medium for shooting in virtual environment
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
CN111035918B (en) Reconnaissance interface display method and device based on virtual environment and readable storage medium
CN112451969B (en) Virtual object control method, device, computer equipment and storage medium
CN111921197B (en) Method, device, terminal and storage medium for displaying game playback picture
JP2024045184A (en) Method, apparatus and medium for controlling virtual object to mark virtual item
CN108786110B (en) Method, device and storage medium for displaying sighting telescope in virtual environment
CN111659117B (en) Virtual object display method and device, computer equipment and storage medium
CN111399639B (en) Method, device and equipment for controlling motion state in virtual environment and readable medium
CN112169325B (en) Virtual prop control method and device, computer equipment and storage medium
WO2021031765A1 (en) Application method and related apparatus of sighting telescope in virtual environment
CN111420402A (en) Virtual environment picture display method, device, terminal and storage medium
CN113398571A (en) Virtual item switching method, device, terminal and storage medium
CN108744511B (en) Method, device and storage medium for displaying sighting telescope in virtual environment
CN112330823B (en) Virtual prop display method, device, equipment and readable storage medium
CN111589141B (en) Virtual environment picture display method, device, equipment and medium
CN111013137B (en) Movement control method, device, equipment and storage medium in virtual scene
CN113134232B (en) Virtual object control method, device, equipment and computer readable storage medium
CN114404972A (en) Method, device and equipment for displaying visual field picture
CN112755526B (en) Virtual item control method, device, terminal and storage medium
CN111013136B (en) Movement control method, device, equipment and storage medium in virtual scene
CN112354181B (en) Open mirror picture display method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40041007
Country of ref document: HK

GR01 Patent grant