CN113546407A - Game picture processing method and device, computer equipment and storage medium - Google Patents

Game picture processing method and device, computer equipment and storage medium Download PDF

Info

Publication number
CN113546407A
CN113546407A (application CN202110871656.0A)
Authority
CN
China
Prior art keywords
target
scene
game
initial
target virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110871656.0A
Other languages
Chinese (zh)
Inventor
杨璐昊
胡博皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110871656.0A priority Critical patent/CN113546407A/en
Publication of CN113546407A publication Critical patent/CN113546407A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • A63F13/5258Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/837Shooting of targets
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076Shooting

Abstract

The embodiments of the application disclose a game picture processing method and apparatus, a computer device, and a storage medium. In the scheme, the initial space coordinates of a target virtual prop in the game world scene are projected onto a display screen showing the game picture from a first-person perspective, yielding screen coordinates; the screen coordinates are matched and adjusted according to the target weapon field angle of the target virtual prop and the target display size of the display screen; the adjusted screen coordinates are then back-projected into the game world scene to obtain transformed space coordinates; and the initial space coordinates are adjusted using the transformed space coordinates, giving the target space coordinates of the target virtual prop in the game world scene. Adjusting the position coordinates of the target virtual prop in the game world scene in this way improves the display effect of the target virtual prop in the game picture and thereby the user's game experience.

Description

Game picture processing method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a game screen processing method and apparatus, a computer device, and a storage medium.
Background
In some shooting games (e.g., first-person shooters, or FPS games — shooting games played from a first-person perspective), a virtual camera is attached to the virtual character, and the range of the image visible from the first-person perspective is captured by this virtual camera. That is, in an FPS game, the field angle is the range of the game world scene observable on the terminal display screen at any instant, typically given as the horizontal or vertical component of the field angle; a larger angle represents a larger field of view.
However, the display screens of different terminals have different aspect ratios, and this difference directly changes the horizontal or vertical field angle of a battle in the game, so that the same game picture is displayed inconsistently on different display screens, which affects the players' game experience.
Disclosure of Invention
The embodiment of the application provides a game picture processing method and device, computer equipment and a storage medium, which can improve the efficiency of processing game pictures in games.
The embodiment of the application provides a game picture processing method, which comprises the following steps:
determining an initial position of a target virtual item in the game picture;
determining a target weapon field angle of the target virtual prop in the game picture;
adjusting the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position;
determining the target scene position of the target virtual prop in the game scene according to the adjusted position;
and updating the game picture based on the target scene position and the target virtual prop.
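Taken together, the claimed steps can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the function name, the aspect-ratio matching, and the tangent-based focal-length ratio are assumptions inferred from the subunits described later in the disclosure.

```python
import math

def process_game_picture(initial_pos, scene_fov_deg, weapon_fov_deg,
                         preset_size, target_size):
    """Illustrative sketch of the claimed adjustment steps (all names assumed)."""
    # Match the preset display proportion to the target display size
    preset_scale = preset_size[0] / preset_size[1]
    target_scale = target_size[0] / target_size[1]
    match = target_scale / preset_scale
    current = (initial_pos[0] * match, initial_pos[1] * match)
    # Position change coefficient = scene focal length / weapon focal length,
    # which reduces to a tangent ratio under a pinhole-camera model
    coeff = math.tan(math.radians(weapon_fov_deg) / 2) / \
            math.tan(math.radians(scene_fov_deg) / 2)
    adjusted = (current[0] * coeff, current[1] * coeff)
    return adjusted  # used to derive the target scene position in the next step
```

When the target display matches the preset picture and the weapon field angle equals the scene field angle, the adjustment leaves the initial position unchanged, as expected.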
Correspondingly, the embodiment of the present application further provides a game picture processing apparatus, including:
the first determining unit is used for determining the initial position of the target virtual item in a preset game picture;
the second determination unit is used for determining the target weapon field angle of the target virtual item in the current game picture;
the adjusting unit is used for adjusting the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position;
a third determining unit, configured to determine a target scene position of the target virtual item in the game scene according to the adjusted position;
and the updating unit is used for updating the current game picture based on the target scene position and the target virtual prop.
In some embodiments, the adjusting unit comprises:
the first acquisition subunit is used for acquiring a preset display proportion of the preset game picture and a preset scene field angle;
the first processing subunit is configured to process the initial position according to the preset display scale and the target display size, so as to obtain a current position of the target virtual prop in the current game picture;
the first determining subunit is used for determining a position change coefficient according to the preset scene field angle and the target weapon field angle;
and the second determining subunit is configured to determine, based on the current position and the position change coefficient, an adjusted position of the target virtual item on the current game screen.
In some embodiments, the first processing subunit is specifically configured to:
calculating a target display scale of the display screen according to the target display size;
calculating the ratio of the target display proportion to the preset display proportion to obtain a display proportion matching coefficient;
and calculating a multiplication value of the initial position and the display scale matching coefficient to obtain the current position.
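A minimal sketch of these three operations, assuming positions are 2-D screen coordinates and "display proportion" means the width-to-height aspect ratio (both are assumptions, not stated verbatim in the claims):

```python
def current_position(initial_pos, preset_size, target_size):
    """Scale a screen position from the preset picture to the target screen."""
    target_scale = target_size[0] / target_size[1]   # target display proportion
    preset_scale = preset_size[0] / preset_size[1]   # preset display proportion
    match = target_scale / preset_scale              # display-proportion matching coefficient
    # Multiplication of the initial position by the matching coefficient
    return (initial_pos[0] * match, initial_pos[1] * match)
```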
In some embodiments, the first determining subunit is specifically configured to:
determining a scene focal length of the preset game picture based on the preset scene field angle;
determining a weapon focal length of the target virtual prop based on the target weapon field angle;
and calculating the ratio of the scene focal length to the weapon focal length to obtain the position change coefficient.
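Under a pinhole-camera model this ratio reduces to a tangent ratio. The sketch below assumes the focal length is computed from the field angle as f = (w/2) / tan(fov/2) for a notional sensor width w, which cancels in the ratio; this formula is our illustration, not quoted from the disclosure.

```python
import math

def focal_length(fov_deg, sensor_width=1.0):
    """Pinhole model: f = (w / 2) / tan(fov / 2)."""
    return (sensor_width / 2) / math.tan(math.radians(fov_deg) / 2)

def position_change_coefficient(scene_fov_deg, weapon_fov_deg):
    """Ratio of the scene focal length to the weapon focal length."""
    return focal_length(scene_fov_deg) / focal_length(weapon_fov_deg)
```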
In some embodiments, the second determining subunit is specifically configured to:
and calculating a multiplication value of the current position and the position change coefficient to obtain the adjusted position.
In some embodiments, the third determination unit comprises:
the second obtaining subunit is configured to obtain an initial scene position of the target virtual item in the game scene;
and the second processing subunit is used for processing the initial scene position according to the adjusted position to obtain the target scene position.
In some embodiments, the second processing subunit is specifically configured to:
determining position offset information for the initial scene position based on the adjusted position;
and superposing the initial scene position and the position offset information to obtain the target scene position.
In some embodiments, the second processing subunit is further specifically configured to:
acquiring a first coordinate system to which the adjusted position belongs and a second coordinate system to which the initial scene position belongs;
projecting the adjusted position to the second coordinate system based on the relation between the first coordinate system and the second coordinate system to obtain a target position of the adjusted position in the second coordinate system;
calculating the offset distance between the initial scene position and the target position to obtain the position offset information;
and superposing the initial scene position and the position offset information to obtain the target scene position.
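A sketch of this projection-and-superposition, assuming the engine supplies a 4x4 screen-to-scene transform and that the adjusted position carries a depth component (both assumptions — real engines expose this through their own deprojection APIs):

```python
def mat_vec(m, v):
    """Multiply a 4x4 row-major matrix by a homogeneous 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def target_scene_position(initial_scene_pos, adjusted_pos, screen_to_scene):
    """Project the adjusted position into the scene coordinate system, then
    superpose the resulting offset on the initial scene position."""
    x, y, z = adjusted_pos
    wx, wy, wz, ww = mat_vec(screen_to_scene, [x, y, z, 1.0])
    target = (wx / ww, wy / ww, wz / ww)  # adjusted position in scene space
    # Position offset information = offset distance between the two positions
    offset = tuple(t - i for t, i in zip(target, initial_scene_pos))
    return tuple(i + o for i, o in zip(initial_scene_pos, offset))
```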
In some embodiments, the first determination unit comprises:
the third obtaining subunit is configured to obtain an initial scene position of the target virtual item in the game scene and a preset display size;
a calculating subunit, configured to calculate the initial position based on the initial scene position and the preset display size.
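A simplified pinhole projection can stand in for this world-to-screen calculation; the camera-space convention (x right, y up, z forward) and the explicit field-angle parameter are assumptions for illustration, not details from the claims:

```python
import math

def world_to_screen(scene_pos, fov_deg, width, height):
    """Pinhole projection of a camera-space point onto a screen of the given
    preset display size (origin at the top-left corner). A simplified
    stand-in for the engine's own projection."""
    x, y, z = scene_pos
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)  # horizontal focal length in pixels
    sx = width / 2 + f * x / z
    sy = height / 2 - f * y / z  # screen y grows downward
    return sx, sy
```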
In some embodiments, the second determination unit comprises:
the fourth acquisition subunit is used for acquiring the weapon type of the target virtual prop;
a third determining subunit for determining the target weapon field of view from a plurality of preset weapon field of view angles based on the weapon type.
In some embodiments, the update unit comprises:
an adjusting subunit, configured to adjust the target virtual item from an initial scene position in the game scene to the target scene position;
and the display subunit is used for updating and displaying the target virtual prop after the adjustment position on the current game picture.
In some embodiments, the apparatus further comprises:
the acquisition unit is used for acquiring the field angle of a target scene of a virtual camera in the horizontal direction, and the virtual camera is used for acquiring the game scene;
the generating unit is used for generating the preset game picture based on the game scenes in the field angle range of the target scene.
Correspondingly, the embodiment of the present application further provides a computer device, which includes a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor executes the game picture processing method provided in any one of the embodiments of the present application.
Correspondingly, the embodiment of the application also provides a storage medium, wherein the storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by the processor to execute the game picture processing method.
According to the game picture processing method and apparatus provided by the embodiments of the application, the initial position of the target virtual prop in the game picture is determined, the initial position is adjusted based on the target weapon field angle of the target virtual prop and the target display size of the display screen, the target scene position of the target virtual prop in the game scene is determined according to the adjusted position, and the game picture is updated based on the target scene position and the target virtual prop. By adjusting the position coordinates of the target virtual prop in this way, the display effect of the target virtual prop remains consistent across display screens of different sizes, so that the efficiency of processing game pictures in the game can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a scene schematic diagram of a game picture processing system according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a game screen processing method according to an embodiment of the present application.
Fig. 3 is a flowchart illustrating another game screen processing method according to an embodiment of the present application.
Fig. 4 is a schematic view of an application scenario of the game screen processing method according to the embodiment of the present application.
Fig. 5 is a schematic view of another application scenario of the game screen processing method according to the embodiment of the present application.
Fig. 6 is a schematic view of another application scenario of the game screen processing method according to the embodiment of the present application.
Fig. 7 is a block diagram of a game screen processing device according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the application provide a game picture processing method and apparatus, a storage medium, and a computer device. Specifically, the game screen processing method according to the embodiments of the present application may be executed by a computer device, where the computer device may be a terminal, a server, or another device. The terminal may be a device such as a smartphone, tablet computer, notebook computer, touch screen, game machine, personal computer (PC), or personal digital assistant (PDA), and may further run a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
For example, when the game screen processing method is operated on the terminal, the terminal device stores a game application program and is used for presenting a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the game screen processing method is executed on a server, the game screen processing method may be a cloud game. Cloud gaming refers to a gaming regime based on cloud computing. In the running mode of the cloud game, the running main body of the game application program and the game picture presenting main body are separated, and the storage and the running of the game picture processing method are finished on the cloud game server. The game screen presentation is performed at a cloud game client, which is mainly used for receiving and sending game data and presenting the game screen, for example, the cloud game client may be a display device with a data transmission function near a user side, such as a mobile terminal, a television, a computer, a palm computer, a personal digital assistant, and the like, but a terminal device for performing game data processing is a cloud game server at the cloud end. When a game is played, a user operates the cloud game client to send an operation instruction to the cloud game server, the cloud game server runs the game according to the operation instruction, data such as game pictures and the like are encoded and compressed, the data are returned to the cloud game client through a network, and finally the data are decoded through the cloud game client and the game pictures are output.
Referring to fig. 1, fig. 1 is a schematic view of a scene of a game screen processing system according to an embodiment of the present disclosure. The system may include at least one terminal, at least one server, at least one database, and a network. The terminal held by the user can be connected to servers of different games through a network. A terminal is any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, the terminal has one or more multi-touch-sensitive screens for sensing and obtaining input of a user through touch or slide operations performed at a plurality of points of one or more touch display screens. When the system includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks and through different servers. The network may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, different terminals may be connected to other terminals or to a server using their own Bluetooth network or hotspot network. For example, multiple users may be online through different terminals and connect and synchronize with each other over a suitable network to support multiplayer gaming. Additionally, the system may include a plurality of databases coupled to different servers, in which information relating to the gaming environment may be stored continuously as different users play the multiplayer game online.
The embodiment of the application provides a game picture processing method, which can be executed by a terminal or a server. The embodiment of the present application will be described with an example in which the game screen processing method is executed by a terminal. The terminal comprises a touch display screen and a processor, wherein the touch display screen is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operation instruction generated by the user acting on the graphical user interface comprises an instruction for starting a game application, and the processor is configured to start the game application after receiving the instruction provided by the user for starting the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed at a plurality of points on the screen at the same time. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role-playing game, a strategy game, a sports game, a game of chance, and the like. 
Wherein the game may include a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Additionally, one or more obstacles, such as railings, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual objects, e.g., to limit movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as an enemy character). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve the goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
It should be noted that the scene schematic diagram of the game screen processing system shown in fig. 1 is merely an example. The game screen processing system and scene described in the embodiment of the present application are intended to illustrate the technical solution of the embodiment more clearly and do not limit the technical solution provided herein. As a person of ordinary skill in the art knows, with the evolution of game screen processing systems and the emergence of new service scenes, the technical solution provided in the embodiment of the present application is equally applicable to similar technical problems.
In view of the foregoing problems, embodiments of the present application provide a game screen processing method, device, computer device, and storage medium, which can improve efficiency of processing a game screen in a game. The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
The present application provides a game screen processing method, which can be executed by a terminal or a server, and the game screen processing method is described as an example executed by a terminal in the present application.
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating a game screen processing method according to an embodiment of the present application. The specific flow of the game picture processing method can be as follows:
101. and determining the initial position of the target virtual item in a preset game picture.
The target virtual prop refers to a weapon prop used by a current game player in the game to attack other virtual characters, for example, the target virtual prop may be a virtual gun, a virtual dagger, a virtual bomb, or the like.
In the embodiment of the application, the game may be a shooting game from a first-person perspective, that is, an FPS game: a shooting-type electronic game played with the player's first-person perspective as the main perspective. In the game picture of such a game, the view of the current virtual character is displayed while the current virtual character itself does not appear in the picture, so that the player can personally experience the visual impact brought by the game.
The game picture comprises part of game scenes, target virtual props and part of other virtual characters of the game. The current game player can observe the change of the game scene through the game picture to observe whether other virtual characters are close to the game scene, so as to determine the attack target. The preset game picture refers to a game picture generated by displaying the collected game scene visual field in a preset display size. For example, the preset display size may be: 1600x900 (width x height).
The initial position refers to the position of the target virtual item in a preset game picture.
In some embodiments, in order to quickly determine the initial position of the target virtual item, the step "determining the initial position of the target virtual item on the preset game screen" may include the following operations:
acquiring an initial scene position of the target virtual prop in a game scene and a preset display size;
an initial position is calculated based on the initial scene position and a preset display size.
The game scene refers to a virtual character, a virtual item in the game and a 3D (3 Dimensions) game space in which a virtual environment exists. The initial scene position refers to the 3D coordinates of the target virtual item in the game scene. For example, the initial scene position may be (a, b, c), where a, b, c respectively represent a value on each coordinate axis of the three-dimensional space.
And obtaining an initial position for a preset display size according to the initial scene position, namely converting the three-dimensional coordinate position of the target virtual prop in the game scene into a two-dimensional coordinate position in the game picture.
In the embodiment of the present application, any three-dimensional coordinate position in the game scene may be converted to a two-dimensional coordinate position in the game screen using the official vector operation expressions Transform and TransformPosition of the UE4 (Unreal Engine 4) engine.
Specifically, the Transform expression transforms three-channel vector values from one reference coordinate system to another. By default, all shader computations for a texture are done in tangential space. Vector constants, camera vectors, and ray vectors are transformed into tangential space before they are used in the material. The Transform expression allows the transformation of these vectors from the tangent space to a global space, local space or view space coordinate system. In addition, it allows the global and local spatial vectors to be converted to any other reference coordinate system. The TransformPosition expression can convert any location in the screen space to the target space specified by the TransformType variable of the expression.
Further, based on the Transform expression and the TransformPosition expression, the initial position of the target virtual item at the initial scene position in the game scene in the game picture with the preset display size can be calculated.
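As an illustration of the underlying idea only, the 3D-to-2D mapping can be sketched with a generic pinhole projection. This is not the UE4 TransformPosition implementation; the function name, axis conventions, and coordinate layout below are all assumptions.

```python
import math

def project_to_screen(scene_pos, fov_deg, screen_w, screen_h):
    """Project a camera-space 3D point to 2D screen coordinates.

    A minimal pinhole-camera sketch (not the engine's implementation):
    x is right, y is up, z is the viewing depth; fov_deg is the
    horizontal field angle.
    """
    x, y, z = scene_pos
    focal = 1.0 / math.tan(math.radians(fov_deg) / 2.0)  # horizontal focal length
    # Normalized device coordinates in [-1, 1], then mapped to pixels.
    ndc_x = focal * x / z
    ndc_y = focal * y / z * (screen_w / screen_h)  # keep square pixels
    px = (ndc_x * 0.5 + 0.5) * screen_w
    py = (0.5 - ndc_y * 0.5) * screen_h  # screen y grows downward
    return px, py

# A point straight ahead of the camera lands at the screen center.
print(project_to_screen((0.0, 0.0, 100.0), 90.0, 1600, 900))  # (800.0, 450.0)
```

A point offset to the right in camera space lands right of center, which matches the screen coordinate system used in the embodiments below.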
In some embodiments, to ensure display consistency of the view of the game scene, before the step "determining an initial position of the target virtual item in the preset game screen", the following steps may be further included:
acquiring a target scene field angle of the virtual camera in the horizontal direction;
and generating a preset game picture based on the game scene in the field angle range of the target scene.
Wherein, the virtual camera is used for capturing the game scene. The target scene field angle refers to the horizontal field angle that captures the widest horizontal field of view.
Since different display devices may have different screen aspect ratios, once the field angle in one direction (horizontal or vertical) is locked, the other direction cannot match on every device. In an FPS game, the map scene is mostly horizontal terrain, and enemies usually appear in the horizontal direction. Moreover, most mobile FPS games use landscape-orientation controls, and players are more used to operating in the left and right directions. In terms of view fairness, therefore, the horizontal direction is far more important than the vertical direction. On the premise that the game picture can be locked in only one direction, locking the horizontal field angle ensures game fairness.
Furthermore, the game scenes in the field angle range of the target scene are acquired, and the preset game pictures are generated according to the acquired game scene contents, so that the consistency of the field in the horizontal direction can be ensured, and the fair fight among different game players is ensured.
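The dependence of the vertical field angle on the screen ratio, for a locked horizontal field angle, can be sketched as follows (standard perspective-projection math; the function name is illustrative):

```python
import math

def vertical_fov(horizontal_fov_deg, width, height):
    """Vertical field angle implied by a locked horizontal field angle."""
    half_h = math.radians(horizontal_fov_deg) / 2.0
    half_v = math.atan(math.tan(half_h) * height / width)
    return math.degrees(2.0 * half_v)

# The wider the screen, the narrower the vertical view for the same
# 90-degree horizontal field angle -- which is why content near the bottom
# of the picture, such as the weapon model, can be cropped on wide devices.
print(round(vertical_fov(90.0, 1600, 900), 1))  # 16:9 screen, approx 58.7
print(round(vertical_fov(90.0, 2100, 900), 1))  # 21:9 screen, approx 46.4
```

This is why the embodiments below assign the target virtual prop its own field angle instead of leaving it inside the cropped scene view.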
102. And determining the target weapon field angle of the target virtual prop in the current game picture.
The target weapon field angle refers to a field angle set for the target virtual prop alone. After the horizontal field of view of the game scene is locked, the vertical field of view may differ between display devices of different display sizes: the wider the screen, the narrower the vertical view, and the game picture must be cropped in the vertical direction. Because the target virtual prop is positioned in the lower part of the game picture, it may be cut off when the picture is cropped, so that its display is incomplete and the game player's operation is affected.
In the embodiment of the application, the field angle of the game scene and the field angle of the target virtual prop are divided, and the field angle is set for the target virtual prop independently, so that the display integrity of the target virtual prop can be ensured.
In some embodiments, in order to ensure the display effect of the target virtual item, the step "determining the target weapon field angle of the target virtual item in the current game screen" may include the following operations:
acquiring weapon types of the target virtual props;
a target weapon field of view is determined from a plurality of preset weapon field of view angles based on the weapon type.
Specifically, virtual props of different weapon types may have different appearance shapes and sizes, so different virtual props may render differently under the same weapon field angle. Therefore, a different weapon field angle, that is, a preset weapon field angle, may be set for the virtual props of each weapon type, giving each weapon type an optimal display field angle.
For example, weapon types may include: a first weapon type, a second weapon type, a third weapon type, a fourth weapon type, etc. The first weapon type corresponds to a first preset weapon field angle, the second weapon type corresponds to a second preset weapon field angle, the third weapon type corresponds to a third preset weapon field angle, and the fourth weapon type corresponds to a fourth preset weapon field angle. The weapon type acquired to the target virtual item may be a first weapon type, and then it may be further determined that the target weapon field angle of the target virtual item may be: a first preset weapon field of view.
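As an illustration only (the type names and angle values below are hypothetical, not specified by the embodiment), the lookup can be sketched as:

```python
# Hypothetical preset table; the weapon types and angles are illustrative.
PRESET_WEAPON_FOV = {
    "first_weapon_type": 60.0,
    "second_weapon_type": 55.0,
    "third_weapon_type": 65.0,
    "fourth_weapon_type": 58.0,
}

def target_weapon_fov(weapon_type, default_fov=60.0):
    """Pick the preset weapon field angle for a weapon type."""
    return PRESET_WEAPON_FOV.get(weapon_type, default_fov)

print(target_weapon_fov("first_weapon_type"))  # 60.0
```

Falling back to a default angle for unknown types is a design choice of this sketch, not part of the embodiment.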
103. And adjusting the initial position based on the field angle of the target weapon and the target display size of the display screen to obtain the adjusted position.
The display screen refers to the display screen of the current terminal, and the target display size refers to the display size of that screen; for example, the target display size may be 2000x900. After the target weapon field angle and the target display size are determined, the initial position can be adjusted to obtain the adjusted position.
In some embodiments, to improve the position adjustment efficiency, the step of "adjusting the initial position based on the target weapon field angle and the target display size of the display screen, resulting in the adjusted position" may include the following operations:
acquiring a preset display proportion of a preset game picture and a preset scene field angle;
processing the initial position according to a preset display proportion and a target display size to obtain the current position of the target virtual prop in the current game picture;
determining a position change coefficient according to a preset scene field angle and a target weapon field angle;
and determining the adjusted position of the target virtual prop in the current game picture based on the current position and the position change coefficient.
The preset game picture refers to a game picture displayed on a default terminal display screen in a game design process, and the preset display ratio refers to a display ratio of the default terminal display screen, for example, the preset display ratio may be 16/9 (i.e., screen width/height). The preset scene view angle refers to a scene view angle for acquiring a game scene to obtain a preset game picture. For example, the preset scene angle may be: 90 degrees, etc.
In some embodiments, in order to improve the position calculation accuracy, the step "processing the initial position according to the preset display scale and the target display size to obtain the current position of the target virtual item in the current game screen" may include the following operations:
calculating a target display scale of the display screen according to the target display size;
calculating the ratio of the target display proportion to a preset display proportion to obtain a display proportion matching coefficient;
and calculating the multiplication value of the initial position and the display scale matching coefficient to obtain the current position.
Specifically, the target display scale refers to a ratio of a width to a height in the target display size.
For example, the target display size may be 2000x900, and the target display scale is calculated as: 2000/900 = 20/9.
Further, a ratio of the target display scale to the preset display scale is calculated, and the ratio can be used as a display scale matching coefficient.
For example, the target display scale may be 20/9 and the preset display scale may be 16/9; the ratio of the target display scale to the preset display scale is then: (20/9)/(16/9) = 1.25.
Calculating the multiplication value of the initial position and the display scale matching coefficient means multiplying the coordinates of the initial position by the display scale matching coefficient, that is, multiplying each coordinate value of the initial position by the coefficient to obtain new coordinates, which give the current position. The current position refers to the adjusted position of the target virtual item relative to the preset game picture, that is, its position when the current game picture is displayed.
For example, the initial position may be (1000, 500) and the display scale matching coefficient may be 1.25. The product of the initial position and the display scale matching coefficient is then: (1000x1.25, 500x1.25) = (1250, 625), that is, the coordinates of the current position are (1250, 625).
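The display-scale matching step can be sketched as follows (illustrative Python; the function name and argument layout are assumptions, the numbers are the example values from the text):

```python
def match_display_scale(initial_pos, target_size, preset_ratio=16 / 9):
    """Scale a screen position by the ratio of the target display scale
    to the preset display scale (the display scale matching coefficient)."""
    w, h = target_size
    coeff = (w / h) / preset_ratio
    return tuple(v * coeff for v in initial_pos), coeff

current, coeff = match_display_scale((1000, 500), (2000, 900))
print(round(coeff, 4))                      # 1.25
print(tuple(round(v, 4) for v in current))  # (1250.0, 625.0)
```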
In some embodiments, in order to improve the position calculation accuracy, the step "determining the position change coefficient according to the preset scene angle and the target weapon angle" may include the following operations:
determining a scene focal length of a preset game picture based on a preset scene field angle;
determining a weapon focal length of the target virtual prop based on the target weapon field angle;
and calculating the ratio of the scene focal length to the weapon focal length to obtain a position change coefficient.
Wherein, the scene focal length is calculated from the preset scene field angle as follows. First, the radian value of half the preset scene field angle is calculated; the radian formula may be: preset scene field angle x PI (circumference ratio)/360. For example, if the preset scene field angle is 90 degrees, the radian value is 90 x PI/360 ≈ 0.7854.
Further, after calculating the radian value, the scene focal length is calculated according to the focal length formula: focal length = 1/tan(radian value), where tan denotes the tangent function.
For example, if the radian value is 0.7854, the scene focal length is calculated as: 1/tan(0.7854) = 1.
Similarly, the weapon focal length is calculated by the same method. For example, if the target weapon field angle is 60 degrees, the radian value is first calculated as: 60 x PI/360 ≈ 0.5236, and the weapon focal length is then 1/tan(0.5236) ≈ 1.7321.
The position change coefficient refers to the ratio of the scene focal length to the weapon focal length. For example, if the scene focal length is 1 and the weapon focal length is 1.7321, the position change coefficient is calculated as 1/1.7321 ≈ 0.5773.
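A minimal sketch of this focal-length arithmetic (illustrative Python; the focal length of a field angle FOV is taken here as 1/tan(FOV x PI/360), which reproduces the example values above):

```python
import math

def focal_length(fov_deg):
    """Focal length for a field angle: 1 / tan(FOV * pi / 360)."""
    return 1.0 / math.tan(fov_deg * math.pi / 360.0)

def position_change_coefficient(scene_fov_deg, weapon_fov_deg):
    """Ratio of the scene focal length to the weapon focal length."""
    return focal_length(scene_fov_deg) / focal_length(weapon_fov_deg)

print(round(focal_length(90.0), 4))                       # 1.0
print(round(focal_length(60.0), 4))                       # 1.7321
print(round(position_change_coefficient(90.0, 60.0), 4))  # 0.5774
```

Note that 1/1.7321 rounds to 0.5774; the 0.5773 used in the text is the same value truncated after four digits.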
In some embodiments, to improve the position calculation accuracy, the step "determining an adjusted position of the target virtual item on the current game screen based on the current position and the position change coefficient" may include the following operations:
and calculating the multiplication value of the current position and the position change coefficient to obtain the adjusted position.
For example, if the current position is (1250, 625) and the position change coefficient is 0.5773, the multiplication value of the current position and the position change coefficient is (1250x0.5773, 625x0.5773) = (721.625, 360.8125). The calculated position coordinates may contain decimal fractions; to reduce the amount of computation, the decimals may be rounded off, so that (721.625, 360.8125) becomes (722, 361) and the adjusted position is (722, 361). Alternatively, to preserve the accuracy of the display position, the decimals may be kept as-is, depending on the actual situation.
104. And determining the target scene position of the target virtual prop in the game scene according to the adjusted position.
The target scene position refers to a position of the target virtual item in a game scene after the target virtual item is adjusted according to the field angle of the target weapon.
In some embodiments, in order to improve the display effect of the virtual item, the step "determining the target scene position of the target virtual item in the game scene according to the adjusted position" may include the following operations:
acquiring an initial scene position of a target virtual prop in a game scene;
and processing the initial scene position according to the adjusted position to obtain the target scene position.
The initial scene position refers to the position of the target virtual item in the game scene when the game picture is generated at the preset display size. For example, the initial scene position may be (a, b, c).
Further, the initial scene position is processed according to the adjusted position, that is, the adjusted position is converted to a position in the game scene, so that the target scene position can be obtained.
In some embodiments, in order to ensure the accuracy of the position adjustment, the step "processing the initial scene position according to the adjusted position to obtain the target scene position" may include the following operations:
determining position offset information of the initial scene position based on the adjusted position;
and superposing the initial scene position and the position deviation information to obtain a target scene position.
The position offset information refers to a position distance difference between a position obtained after the adjusted position is converted into a game scene and an initial scene position of the target virtual item in the game scene.
In some embodiments, to improve the position calculation efficiency, the step "determining the position offset information of the initial scene position based on the adjusted position" may include the following operations:
acquiring a first coordinate system to which the adjusted position belongs and a second coordinate system to which the initial scene position belongs;
projecting the adjusted position to a second coordinate system based on the relation between the first coordinate system and the second coordinate system to obtain a target position of the adjusted position in the second coordinate system;
and calculating the offset distance between the initial scene position and the target position to obtain position offset information.
The first coordinate system refers to a coordinate system corresponding to the game screen, and may be a two-dimensional coordinate system, that is, a coordinate system formed by two coordinate axis directions. The second coordinate system refers to a coordinate system corresponding to the game scene, and may be a three-dimensional coordinate system, that is, a coordinate system formed by three coordinate axis directions.
Specifically, to project the adjusted position in the first coordinate system into the second coordinate system, the official vector operation expression TransformPosition of the UE4 (Unreal Engine 4) engine may be used. The TransformPosition expression can convert a three-channel vector value from one reference coordinate system to another, that is, it can convert the adjusted position in the first coordinate system to a position in the second coordinate system, so as to obtain the target position.
The initial scene position and the target position are both positions in the game scene and therefore lie in the same coordinate system, so the offset distance between them can be calculated from the initial scene position coordinates and the target position coordinates.
For example, the initial scene position coordinates may be (50, 100, 120) and the target position coordinates may be (60, 150, 80). The offset distance of the target position from the initial scene position is then: (60, 150, 80) - (50, 100, 120) = (10, 50, -40), that is, the position offset information is (10, 50, -40).
Further, after the position offset information is obtained through calculation, the initial scene position and the position offset information may be superimposed to obtain the target scene position.
For example, the positional offset information is: (10, 50, -40), the initial scene position may be: (50, 100, 120), the initial scene position and the position offset information are superimposed, and the obtained target scene position may be: (60, 150, 80).
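The offset calculation and superposition described above can be sketched as follows (illustrative Python, using the example coordinates from the text; the function names are assumptions):

```python
def position_offset(initial_scene_pos, target_pos):
    """Per-axis offset of the projected target position from the
    initial scene position."""
    return tuple(t - i for i, t in zip(initial_scene_pos, target_pos))

def apply_offset(initial_scene_pos, offset):
    """Superimpose the position offset on the initial scene position."""
    return tuple(i + o for i, o in zip(initial_scene_pos, offset))

offset = position_offset((50, 100, 120), (60, 150, 80))
print(offset)                                # (10, 50, -40)
print(apply_offset((50, 100, 120), offset))  # (60, 150, 80)
```

Applying the offset to the initial scene position recovers the target scene position, which is the correction step the method relies on.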
105. And updating the current game picture based on the target scene position and the target virtual prop.
In some embodiments, to enhance the player's gaming experience, the step "update the game screen based on the target scene location and the target virtual item" may include the following operations:
adjusting the target virtual prop from an initial scene position in a game scene to a target scene position;
and updating and displaying the target virtual prop after the position is adjusted on the current game picture.
The target scene position is the position of the target virtual item after being adjusted in the game scene, and the initial scene position of the target virtual item in the game scene is adjusted to the target scene position, so that the adjustment of the target virtual item in the game picture display process can be realized.
Furthermore, to improve the display effect of the target virtual prop, the position-adjusted target virtual prop can be displayed in the current game picture. In this way, the field angles of the scene picture and the target virtual prop are set separately, which guarantees the horizontal field of view of the scene while improving the display effect of the virtual prop.
The embodiment of the application discloses a game picture processing method: the world space coordinates of a virtual prop under a first-person perspective in a game scene are transformed into coordinates in the game picture displayed on the terminal screen; the coordinates in the game picture are matched to the screen ratio of the terminal screen; the matched coordinates of the new game picture are transformed (projected) back into world space coordinates; the offset between the world space coordinates before and after matching is calculated; and the offset is applied to the world space coordinates as a correction, so that the position of the virtual prop can be adjusted independently. In this way, all pictures of the game scene are captured by a single virtual camera, and the position of the virtual prop in the scene picture is then adjusted by the above game picture processing method, which avoids the increased performance consumption of capturing the game scene and the virtual prop with two separate virtual cameras. Meanwhile, the virtual prop and the scene field of view are split and controlled separately, achieving a fair and competitive field of view while preserving the display effect and performance quality of the virtual prop.
Based on the above description, the game screen processing method of the present application will be further described below by way of example. Referring to fig. 3, fig. 3 is a schematic flow chart of another game screen processing method according to an embodiment of the present application. Taking as an example the game screen processing method applied by a server to the game screens of a first-person-perspective shooting game, the specific flow may be as follows:
201. the server obtains an initial scene space position of the target virtual gun in a game scene of the first-person perspective shooting game.
In the embodiment of the application, to guarantee the player's game experience, the target virtual gun in the game picture is displayed fixed together with the hand and part of the arm of the current virtual character. The target virtual gun is bound to the display positions of the hand and arm, so the positions of the hand and arm are adjusted along with the position of the target virtual gun.
Wherein the target virtual gun is composed of a plurality of vertexes in the game scene, and the initial scene space position comprises the space coordinates of each vertex in the scene space. The game scene can be a three-dimensional scene, namely a coordinate system formed by three coordinate axes. The spatial coordinates may be three-dimensional spatial coordinates.
For example, a target virtual firearm includes a plurality of vertices: first vertex, second vertex, third vertex, fourth vertex, etc., then the initial scene space position includes: the spatial coordinates of the first vertex, the spatial coordinates of the second vertex, the spatial coordinates of the third vertex, the spatial coordinates of the fourth vertex, and the like.
202. And the server converts the scene space position into a game picture to obtain the initial game picture position of the target virtual gun in the game picture.
The game picture is displayed through the terminal display screen, and the game picture can be a two-dimensional picture, namely a coordinate system formed by two coordinate axes.
Specifically, converting the scene space position into the game picture means converting the space coordinates of each vertex of the target virtual gun in the game scene into coordinates in the game picture, that is, screen coordinates on the display screen. The conversion may use the vector operation expression TransformPosition of the UE4 engine, and the specific conversion manner may be as described in the above embodiments, which is not repeated here.
For example, referring to fig. 4, fig. 4 is a schematic view of an application scenario of a game screen processing method according to an embodiment of the present application. In fig. 4, a screen coordinate system is constructed with the horizontal direction of the display screen as the X axis and the vertical direction of the display screen as the Y axis. The upper-left vertex of the display screen serves as the origin O of the coordinate system, with the horizontal rightward direction as the positive X direction and the vertical downward direction as the positive Y direction.
For example, in fig. 4, the vertex S may be a vertex of the target virtual gun, and in the display screen, the coordinates of the vertex S may be (X1, Y1), which indicates that the distance value in the X-axis direction is X1 and the distance value in the Y-axis direction is Y1. Therefore, the coordinates of each vertex of the target virtual gun in the screen coordinate system are obtained, and the initial game picture position of the target virtual gun is obtained.
203. The server acquires the gun field angle and the screen ratio and determines the target game picture position according to the gun field angle, the screen ratio and the game picture position.
The screen proportion comprises a default screen proportion and a target screen proportion, wherein the default screen proportion is the length and width proportion of the default display screen, and the target screen proportion is the length and width proportion of the designated display screen.
The gun view angle is set for the target virtual gun independently and can be set according to the type of the target virtual gun.
In the embodiment of the present application, the calculation formula of the target game screen position is as follows:

NewViewSpaceVertexPosition.xy = ViewSpaceVertexPosition.xy * MatchingCoefficient * WeaponFOVScaler.

Further, the above formula can be refined as follows:

NewViewSpaceVertexPosition.xy = ViewSpaceVertexPosition.xy * (ViewSize.w/ViewSize.h)/(ReferenceViewSize.w/ReferenceViewSize.h) * ReferenceFocalLength * tan(WeaponFOV * PI/360.0),

where ReferenceFocalLength = 1/tan(SceneFOV * PI/360.0).
Wherein, the parameters in the above formula are defined as follows:
① NewViewSpaceVertexPosition.xy: the target game picture position of each vertex of the target virtual gun;
② ViewSpaceVertexPosition.xy: the initial game picture position of each vertex of the target virtual gun;
③ ViewSize.w and ViewSize.h: the width and height of the specified display screen, that is, of the specified terminal;
④ ReferenceViewSize.w and ReferenceViewSize.h: the width and height of the default terminal screen, that is, the default width and height of the virtual camera's screen space. These can be understood as the screen width and height before any conversion is performed: in the first step, the target virtual gun is converted from its world space position to screen space, which is actually the default terminal screen; in the second step, its width and height on the default terminal screen are matched to the width and height of the specified terminal screen;
⑤ ReferenceFocalLength: calculated from the default world scene FOV through the focal length formula;
⑥ WeaponFOV: the FOV value set for the virtual gun.
For example, ReferenceViewSize.w and ReferenceViewSize.h may be 1600 and 900, that is, the width and height of the default terminal screen (16:9 is a common default screen ratio). Suppose the default world scene FOV is 90 degrees, so that the reference focal length is 1; the specified terminal screen width and height, ViewSize.w and ViewSize.h, are 2100 and 900; and WeaponFOV is configured to be 60 degrees. The process of adjusting the initial game screen position to the target game screen position is then: each vertex of the target virtual firearm on the 1600:900 default screen is matched to the specified 2100:900 terminal screen, with the target virtual firearm FOV configured to be 60 degrees (1600, 900, and 2100 carry no specific units; they only demonstrate 16:9 and 21:9 screens).
Specifically, please refer to fig. 5, and fig. 5 is a schematic view of another application scenario of the game screen processing method according to the embodiment of the present application. In fig. 5, on the default screen with size (W: 1600, H: 900), a vertex Q on the target virtual gun is determined, and the screen space coordinates of the vertex Q may be: (1400, 700).
Further, taking vertex Q (1400, 700) as an example, the position transformation process of a vertex of the target virtual gun is demonstrated (the other vertices are handled in the same way), using the above calculation formula:

NewViewSpaceVertexPosition.xy = ViewSpaceVertexPosition.xy * (ViewSize.w/ViewSize.h)/(ReferenceViewSize.w/ReferenceViewSize.h) * ReferenceFocalLength * tan(WeaponFOV * PI/360.0).
First, ViewSpaceVertexPosition.xy * (ViewSize.w/ViewSize.h)/(ReferenceViewSize.w/ReferenceViewSize.h) is calculated. This step relates the vertex on the default terminal screen to the target terminal screen zoom ratio, that is, it transforms the vertex screen coordinates from the default terminal screen to the target terminal screen.
The specific calculation process may be: from the target terminal screen size (ViewSize.w, ViewSize.h) = (2100, 900), the aspect ratio of the target terminal screen is calculated as:

ViewSize.w/ViewSize.h = 2100/900 ≈ 2.33.
Then, from the default screen size (ReferenceViewSize.w, ReferenceViewSize.h) = (1600, 900), the default screen aspect ratio is calculated as:

ReferenceViewSize.w/ReferenceViewSize.h = 1600/900 ≈ 1.78.
Further, (ViewSize.w/ViewSize.h)/(ReferenceViewSize.w/ReferenceViewSize.h) = (2100/900)/(1600/900) = 1.3125 is calculated, which is the matching coefficient of the target terminal screen relative to the default terminal screen.

Multiplying the initial vertex coordinate position ViewSpaceVertexPosition.xy by the matching coefficient yields:

x = 1400 * 1.3125 = 1837.5 ≈ 1838.0;

y = 700 * 1.3125 = 918.75 ≈ 919.0.
Here, WeaponFOVScaler = ReferenceFocalLength/WeaponFocalLength. This step substitutes the configured WeaponFOV (the target virtual gun FOV) into the calculation formula, separating the virtual gun FOV from the world FOV so that each virtual gun can be given its own FOV configuration, that is, the presentation of the target virtual gun can be adjusted. The calculated result WeaponFOVScaler represents the zoom that the configured FOV applies to the vertex coordinates.
The specific calculation process may be: first, calculate the reference focal length. For example, the world scene FOV is 90 degrees. Half of the world scene FOV is converted into radians: 90.0 * PI/360.0 ≈ 0.7854; then the focal length is obtained from the focal length formula: 1/tan(0.7854) = 1.0. That is, when the world scene FOV is 90, the focal length is 1, so 1 is the reference focal length.
Further, with WeaponFOV configured to be 60 degrees, the weapon focal length is 1/tan(60.0 * PI/360.0) ≈ 1.7321, and WeaponFOVScaler = 1/1.7321 ≈ 0.5773. This is the WeaponFOVScaler when WeaponFOV is set to 60. The value of WeaponFOVScaler varies with the configured WeaponFOV, representing the effect of the configured WeaponFOV on the vertex coordinates.
Then, NewViewSpaceVertexPosition.xy = ViewSpaceVertexPosition.xy * (ViewSize.w/ViewSize.h)/(ReferenceViewSize.w/ReferenceViewSize.h) * WeaponFOVScaler is evaluated. The specific calculation may be as follows:
NewViewSpaceVertexPosition.x = 1838.0 * 0.5773 ≈ 1061.1;

NewViewSpaceVertexPosition.y = 919.0 * 0.5773 ≈ 530.5.
The final result, NewViewSpaceVertexPosition.xy = (1061.1, 530.5), is the transformed coordinate of vertex Q. The above is the position transformation process for vertex Q; the other vertices of the target virtual gun are processed in the same manner, so that the transformed coordinates of all vertices of the target virtual gun are obtained, yielding the target game picture position.
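The complete per-vertex transform can be sketched as follows (illustrative Python, not engine shader code; the weapon-FOV scaler is written as ReferenceFocalLength * tan(WeaponFOV * PI/360), which reproduces the 0.5773 zoom for a 60-degree weapon FOV against a 90-degree scene FOV):

```python
import math

def transform_vertex(pos, view_size, ref_size=(1600.0, 900.0),
                     scene_fov=90.0, weapon_fov=60.0):
    """Match a default-screen vertex to the target screen, then apply
    the weapon-FOV scaler (a sketch of the formula in the text)."""
    ref_focal = 1.0 / math.tan(scene_fov * math.pi / 360.0)
    match = (view_size[0] / view_size[1]) / (ref_size[0] / ref_size[1])
    fov_scaler = ref_focal * math.tan(weapon_fov * math.pi / 360.0)
    return tuple(v * match * fov_scaler for v in pos)

result = transform_vertex((1400.0, 700.0), (2100.0, 900.0))
print(tuple(round(v, 1) for v in result))  # (1060.9, 530.4)
```

At full precision this gives (1060.9, 530.4); the worked example in the text arrives at (1061.1, 530.5) because it rounds 1837.5 to 1838 and truncates the scaler to 0.5773 before the final multiplication.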
204. And the server converts the position of the target game picture into a game scene to obtain the space position of the converted scene.
Specifically, converting the target game picture position into the game scene means converting the position of the target virtual gun in the terminal screen coordinate system into a position in the scene space coordinate system. The conversion may use the vector operation expression TransformPosition, as described in the above embodiments, and is not repeated here.
205. And the server adjusts the initial scene space position based on the converted scene space position to obtain the target scene space position of the target virtual gun in the game scene.
After the converted scene space position is determined, a position offset value of the initial scene space position and the converted scene space position is calculated, the initial scene space position is adjusted based on the position offset value, so that the position of the target virtual gun in the game scene is adjusted, then the position of the target virtual gun in the game picture is updated based on the adjusted scene position, and the display effect of the target virtual gun can be improved.
For example, please refer to fig. 6, and fig. 6 is a schematic view of another application scenario of the game screen processing method according to the embodiment of the present application. In the game picture above fig. 6, the display of the target virtual gun is incomplete, and the position of the target virtual gun in the game scene is updated through the position adjustment mode, so that the target virtual gun is completely displayed in the game picture, thereby improving the operation hand feeling of the player and further improving the game experience of the player.
The embodiment of the application discloses a game picture processing method, which comprises the following steps: a server obtains an initial scene space position of a target virtual gun in a game scene of a first-person shooting game; converts the initial scene space position into the game picture to obtain an initial game picture position of the target virtual gun; obtains the gun visual angle and the screen proportion, and determines a target game picture position according to the gun visual angle, the screen proportion and the initial game picture position; converts the target game picture position into the game scene to obtain a converted scene space position; and adjusts the initial scene space position based on the converted scene space position to obtain a target scene space position of the target virtual gun in the game scene. In this way, the display position of the target virtual gun adapts to different screen proportions and gun visual angles, so that the target virtual gun is displayed completely in the game picture, which improves the operation hand feeling of the player and the game experience of the player.
In order to better implement the game picture processing method provided by the embodiment of the present application, the embodiment of the present application further provides a game picture processing device based on the game picture processing method. The terms used here have the same meanings as in the game picture processing method above, and specific implementation details can refer to the description in the method embodiments.
Referring to fig. 7, fig. 7 is a block diagram of a game picture processing device according to an embodiment of the present application, where the device includes:
a first determining unit 301, configured to determine an initial position of a target virtual item in a preset game screen;
a second determining unit 302, configured to determine a target weapon field angle of the target virtual item in the current game screen;
an adjusting unit 303, configured to adjust the initial position based on the target weapon field angle and the target display size of the display screen, to obtain an adjusted position;
a third determining unit 304, configured to determine a target scene position of the target virtual item in the game scene according to the adjusted position;
an updating unit 305, configured to update the current game screen based on the target scene position and the target virtual item.
In some embodiments, the adjusting unit 303 may include:
the first acquisition subunit is used for acquiring a preset display proportion of the preset game picture and a preset scene field angle;
the first processing subunit is configured to process the initial position according to the preset display scale and the target display size, so as to obtain a current position of the target virtual prop in the current game picture;
the first determining subunit is used for determining a position change coefficient according to the preset scene field angle and the target weapon field angle;
and the second determining subunit is configured to determine, based on the current position and the position change coefficient, an adjusted position of the target virtual item on the current game screen.
In some embodiments, the first processing subunit may be specifically configured to:
calculating a target display scale of the display screen according to the target display size;
calculating the ratio of the target display proportion to the preset display proportion to obtain a display proportion matching coefficient;
and calculating a multiplication value of the initial position and the display scale matching coefficient to obtain the current position.
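The three calculation steps above can be sketched as follows; treating the position as a 2-D screen coordinate and scaling both components by the matching coefficient is an assumption, since the embodiment does not fix the component-wise details:

```python
def current_picture_position(initial_pos, preset_ratio, target_size):
    """First processing subunit sketch: scale the initial position from the
    preset display proportion to the actual screen's display proportion."""
    width, height = target_size
    target_ratio = width / height             # target display scale of the screen
    k = target_ratio / preset_ratio           # display proportion matching coefficient
    return tuple(c * k for c in initial_pos)  # multiply the initial position by the coefficient
```

For example, a 2340x1080 screen measured against a preset 16:9 proportion gives a matching coefficient of 1.21875.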
In some embodiments, the first determining subunit may be specifically configured to:
determining a scene focal length of the preset game picture based on the preset scene field angle;
determining a weapon focal length of the target virtual prop based on the target weapon field angle;
and calculating the ratio of the scene focal length to the weapon focal length to obtain the position change coefficient.
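A sketch of the focal-length-based coefficient; the relation f = 1 / tan(FOV / 2) is the standard pinhole-camera assumption, since the embodiment does not give an exact focal-length formula:

```python
import math

def position_change_coefficient(scene_fov_deg, weapon_fov_deg):
    """First determining subunit sketch: ratio of the scene focal length to
    the weapon focal length, with each focal length derived from its field
    angle via the pinhole relation f = 1 / tan(fov / 2)."""
    scene_focal = 1.0 / math.tan(math.radians(scene_fov_deg) / 2.0)
    weapon_focal = 1.0 / math.tan(math.radians(weapon_fov_deg) / 2.0)
    return scene_focal / weapon_focal
```

Per the second determining subunit (and claim 5), multiplying the current position by this coefficient yields the adjusted position.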
In some embodiments, the second determining subunit may be specifically configured to:
and calculating a multiplication value of the current position and the position change coefficient to obtain the adjusted position.
In some embodiments, the third determining unit 304 may include:
the second obtaining subunit is configured to obtain an initial scene position of the target virtual item in the game scene;
and the second processing subunit is used for processing the initial scene position according to the adjusted position to obtain the target scene position.
In some embodiments, the second processing subunit may be specifically configured to:
determining position offset information for the initial scene position based on the adjusted position;
and superposing the initial scene position and the position offset information to obtain the target scene position.
In some embodiments, the second processing subunit may be further specifically configured to:
acquiring a first coordinate system to which the adjusted position belongs and a second coordinate system to which the initial scene position belongs;
projecting the adjusted position to the second coordinate system based on the relation between the first coordinate system and the second coordinate system to obtain a target position of the adjusted position in the second coordinate system;
calculating the offset distance between the initial scene position and the target position to obtain the position offset information;
and superposing the initial scene position and the position offset information to obtain the target scene position.
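The projection and offset steps above can be sketched with a homogeneous 4x4 transform; the matrix-based relation between the two coordinate systems is an assumption, as the embodiment leaves the concrete transform to the engine:

```python
def project_to_scene(adjusted_pos, screen_to_scene):
    """Second processing subunit sketch: project the adjusted position into
    the scene coordinate system. screen_to_scene is a hypothetical 4x4
    transform given as nested lists."""
    x, y, z = adjusted_pos
    p = (x, y, z, 1.0)  # homogeneous coordinates
    return tuple(sum(screen_to_scene[r][c] * p[c] for c in range(4))
                 for r in range(3))

def position_offset(initial_scene_pos, target_pos):
    """Per-component offset between the initial scene position and the
    projected target position (the position offset information)."""
    return tuple(t - i for t, i in zip(target_pos, initial_scene_pos))
```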
In some embodiments, the first determining unit 301 may include:
the third obtaining subunit is configured to obtain an initial scene position of the target virtual item in the game scene and a preset display size;
a calculating subunit, configured to calculate the initial position based on the initial scene position and the preset display size.
In some embodiments, the second determining unit 302 may include:
the fourth acquisition subunit is used for acquiring the weapon type of the target virtual prop;
a third determining subunit for determining the target weapon field of view from a plurality of preset weapon field of view angles based on the weapon type.
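A minimal lookup sketch for the two subunits above; the table contents and type names are hypothetical, since the embodiment only states that the target weapon field angle is selected from preset angles according to the weapon type:

```python
# Hypothetical mapping from weapon type to its preset field angle (degrees).
PRESET_WEAPON_FOV = {"sniper": 30.0, "rifle": 55.0, "pistol": 65.0}

def target_weapon_fov(weapon_type, default_fov=60.0):
    """Third determining subunit sketch: determine the target weapon field
    angle from the preset weapon field angles based on the weapon type."""
    return PRESET_WEAPON_FOV.get(weapon_type, default_fov)
```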
In some embodiments, the update unit comprises:
an adjusting subunit, configured to adjust the target virtual item from an initial scene position in the game scene to the target scene position;
and the display subunit is used for updating and displaying, on the current game picture, the target virtual prop after the position adjustment.
In some embodiments, the apparatus may further comprise:
the acquisition unit is used for acquiring the field angle of a target scene of a virtual camera in the horizontal direction, and the virtual camera is used for acquiring the game scene;
the generating unit is used for generating the preset game picture based on the game scenes in the field angle range of the target scene.
The embodiment of the application discloses a game picture processing device: the first determining unit 301 determines the initial position of the target virtual item in a preset game picture; the second determining unit 302 determines a target weapon field angle of the target virtual item in the current game picture; the adjusting unit 303 adjusts the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position; the third determining unit 304 determines a target scene position of the target virtual item in the game scene according to the adjusted position; and the updating unit 305 updates the current game picture based on the target scene position and the target virtual item. Thus, the efficiency of processing the game picture in the game can be improved.
Correspondingly, the embodiment of the application also provides a computer device, and the computer device can be a terminal. As shown in fig. 8, fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer apparatus 500 includes a processor 501 having one or more processing cores, a memory 502 having one or more computer-readable storage media, and a computer program stored on the memory 502 and executable on the processor. The processor 501 is electrically connected to the memory 502. Those skilled in the art will appreciate that the computer device configurations illustrated in the figures are not meant to be limiting of computer devices and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The processor 501 is a control center of the computer device 500, connects various parts of the entire computer device 500 using various interfaces and lines, performs various functions of the computer device 500 and processes data by running or loading software programs and/or modules stored in the memory 502, and calling data stored in the memory 502, thereby monitoring the computer device 500 as a whole.
In this embodiment of the application, the processor 501 in the computer device 500 loads instructions corresponding to processes of one or more applications into the memory 502, and the processor 501 runs the applications stored in the memory 502, so as to implement various functions as follows:
determining an initial position of the target virtual prop in a preset game picture; determining a target weapon field angle of the target virtual prop in the current game picture; adjusting the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position; determining the target scene position of the target virtual prop in the game scene according to the adjusted position; and updating the current game picture based on the target scene position and the target virtual prop.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 8, the computer device 500 further includes: touch-sensitive display screen 503, radio frequency circuit 504, audio circuit 505, input unit 506 and power 507. The processor 501 is electrically connected to the touch display screen 503, the radio frequency circuit 504, the audio circuit 505, the input unit 506, and the power supply 507, respectively. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 8 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 503 can be used for displaying a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. The touch display screen 503 may include a display panel and a touch panel. The display panel may be used, among other things, to display information entered by or provided to a user and various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may be used to collect touch operations of a user on or near the touch panel (for example, operations of the user on or near the touch panel using any suitable object or accessory such as a finger or a stylus pen) and generate corresponding operation instructions, according to which the corresponding program is executed. Alternatively, the touch panel may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by a touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the touch point coordinates to the processor 501, and can receive and execute commands sent by the processor 501. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the touch panel transmits the touch operation to the processor 501 to determine the type of the touch event, and the processor 501 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 503 to implement input and output functions. However, in some embodiments, the touch panel and the display panel can be implemented as two separate components to perform the input and output functions. That is, the touch display screen 503 can also be used as a part of the input unit 506 to implement an input function.
The radio frequency circuit 504 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or other computer devices, and to exchange signals with the network device or the other computer devices.
Audio circuitry 505 may be used to provide an audio interface between a user and a computer device through speakers and microphones. On one hand, the audio circuit 505 may transmit the electrical signal converted from the received audio data to a speaker, which converts the electrical signal into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electrical signal, which is received by the audio circuit 505 and converted into audio data; the audio data is then output to the processor 501 for processing, and transmitted to, for example, another computer device via the radio frequency circuit 504, or output to the memory 502 for further processing. The audio circuitry 505 may also include an earbud jack to provide communication of a peripheral headset with the computer device.
The input unit 506 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 507 is used to power the various components of the computer device 500. Optionally, the power supply 507 may be logically connected to the processor 501 through a power management system, so as to implement functions of managing charging, discharging, power consumption management, and the like through the power management system. The power supply 507 may also include any component including one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown in fig. 8, the computer device 500 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment determines an initial position of the target virtual item in the preset game screen; determines a target weapon field angle of the target virtual item in the current game picture; adjusts the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position; determines the target scene position of the target virtual item in the game scene according to the adjusted position; and updates the current game picture based on the target scene position and the target virtual item. Thus, the efficiency of processing the game picture in the game can be improved.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any one of the game picture processing methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
determining an initial position of the target virtual prop in a preset game picture;
determining a target weapon field angle of the target virtual prop in the current game picture;
adjusting the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position;
determining the target scene position of the target virtual prop in the game scene according to the adjusted position;
and updating the current game picture based on the target scene position and the target virtual prop.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer program stored in the storage medium can execute the steps in any game picture processing method provided in the embodiments of the present application, the beneficial effects that can be achieved by any game picture processing method provided in the embodiments of the present application can be achieved, and detailed descriptions are omitted here for the sake of detail in the foregoing embodiments.
The foregoing describes in detail a game screen processing method, device, storage medium, and computer apparatus provided in the embodiments of the present application, and specific examples are applied herein to explain the principles and implementations of the present application, and the description of the foregoing embodiments is only used to help understand the method and core ideas of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (15)

1. A game picture processing method is applied to a terminal, a current game picture of a first person visual angle is displayed through a display screen of the terminal, the current game picture at least comprises a part of game scenes and virtual props, and the method comprises the following steps:
determining an initial position of the target virtual prop in a preset game picture;
determining a target weapon field angle of the target virtual prop in the current game picture;
adjusting the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position;
determining the target scene position of the target virtual prop in the game scene according to the adjusted position;
and updating the current game picture based on the target scene position and the target virtual prop.
2. The method of claim 1, wherein the adjusting the initial position based on the target weapon field of view angle and a target display size of the display screen to obtain an adjusted position comprises:
acquiring a preset display proportion of the preset game picture and a preset scene field angle;
processing the initial position according to the preset display proportion and the target display size to obtain the current position of the target virtual prop in the current game picture;
determining a position change coefficient according to the preset scene field angle and the target weapon field angle;
and determining the adjusted position of the target virtual prop in the current game picture based on the current position and the position change coefficient.
3. The method according to claim 2, wherein the processing the initial position according to the preset display scale and the target display size to obtain the current position of the target virtual item in the current game screen comprises:
calculating a target display scale of the display screen according to the target display size;
calculating the ratio of the target display proportion to the preset display proportion to obtain a display proportion matching coefficient;
and calculating a multiplication value of the initial position and the display scale matching coefficient to obtain the current position.
4. The method of claim 2, wherein determining the position change coefficient according to the preset scene field angle and the target weapon field angle comprises:
determining a scene focal length of the preset game picture based on the preset scene field angle;
determining a weapon focal length of the target virtual prop based on the target weapon field angle;
and calculating the ratio of the scene focal length to the weapon focal length to obtain the position change coefficient.
5. The method of claim 2, wherein determining the adjusted position of the target virtual item at the current game screen based on the current position and the position change coefficient comprises:
and calculating a multiplication value of the current position and the position change coefficient to obtain the adjusted position.
6. The method of claim 1, wherein said determining a target scene location of the target virtual item in the game scene from the adjusted location comprises:
acquiring an initial scene position of the target virtual prop in the game scene;
and processing the initial scene position according to the adjusted position to obtain the target scene position.
7. The method of claim 6, wherein the processing the initial scene position according to the adjusted position to obtain the target scene position comprises:
determining position offset information for the initial scene position based on the adjusted position;
and superposing the initial scene position and the position offset information to obtain the target scene position.
8. The method of claim 7, wherein determining the position offset information of the initial scene position based on the adjusted position comprises:
acquiring a first coordinate system to which the adjusted position belongs and a second coordinate system to which the initial scene position belongs;
projecting the adjusted position to the second coordinate system based on the relation between the first coordinate system and the second coordinate system to obtain a target position of the adjusted position in the second coordinate system;
and calculating the offset distance between the initial scene position and the target position to obtain the position offset information.
9. The method of claim 1, wherein determining the initial position of the target virtual item in the preset game screen comprises:
acquiring an initial scene position of the target virtual prop in the game scene and a preset display size;
calculating the initial position based on the initial scene position and the preset display size.
10. The method of claim 1, wherein said determining a target weapon field of view of said target virtual item in said current game screen comprises:
acquiring the weapon type of the target virtual prop;
determining the target weapon field of view from a plurality of preset weapon field of view angles based on the weapon type.
11. The method of claim 1, wherein said updating the game screen based on the target scene location and the target virtual item comprises:
adjusting the target virtual item from an initial scene position in the game scene to the target scene position;
and updating and displaying the target virtual prop after the position is adjusted on the current game picture.
12. The method according to any one of claims 1 to 11, wherein prior to determining an initial position of the target virtual item in a preset game screen, further comprising:
acquiring a target scene field angle of a virtual camera in the horizontal direction, wherein the virtual camera is used for acquiring the game scene;
and generating the preset game picture based on the game scenes in the field angle range of the target scene.
13. A game screen processing apparatus, characterized in that the apparatus comprises:
the first determining unit is used for determining the initial position of the target virtual item in a preset game picture;
the second determination unit is used for determining the target weapon field angle of the target virtual item in the current game picture;
the adjusting unit is used for adjusting the initial position based on the target weapon field angle and the target display size of the display screen to obtain an adjusted position;
a third determining unit, configured to determine a target scene position of the target virtual item in the game scene according to the adjusted position;
and the updating unit is used for updating the current game picture based on the target scene position and the target virtual prop.
14. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the game picture processing method according to any one of claims 1 to 12 when executing the program.
15. A storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the game picture processing method according to any one of claims 1 to 12.
CN202110871656.0A 2021-07-30 2021-07-30 Game picture processing method and device, computer equipment and storage medium Pending CN113546407A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110871656.0A CN113546407A (en) 2021-07-30 2021-07-30 Game picture processing method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113546407A true CN113546407A (en) 2021-10-26

Family

ID=78105010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110871656.0A Pending CN113546407A (en) 2021-07-30 2021-07-30 Game picture processing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113546407A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10290886A (en) * 1997-02-18 1998-11-04 Sega Enterp Ltd Image processing device and image processing method
US20080102951A1 (en) * 2006-11-01 2008-05-01 Nintendo Co., Ltd. Storage medium storing a game program, game apparatus, and game control method
WO2018177170A1 (en) * 2017-03-27 2018-10-04 网易(杭州)网络有限公司 Display control method and apparatus for game picture, storage medium and electronic device
CN109908574A (en) * 2019-02-22 2019-06-21 网易(杭州)网络有限公司 Game role control method, device, equipment and storage medium
CN111135556A (en) * 2019-12-31 2020-05-12 网易(杭州)网络有限公司 Virtual camera control method and device, electronic equipment and storage medium
CN111420402A (en) * 2020-03-18 2020-07-17 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, terminal and storage medium
CN111729306A (en) * 2020-06-24 2020-10-02 网易(杭州)网络有限公司 Game character transmission method, device, electronic equipment and storage medium
CN112494929A (en) * 2020-12-04 2021-03-16 四三九九网络股份有限公司 Implementation method of game controller serving cloud game scenes

Similar Documents

Publication Publication Date Title
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
CN107982918B (en) Game game result display method and device and terminal
CN111589128A (en) Operation control display method and device based on virtual scene
CN112138386A (en) Volume rendering method and device, storage medium and computer equipment
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
JP7186901B2 (en) HOTSPOT MAP DISPLAY METHOD, DEVICE, COMPUTER DEVICE AND READABLE STORAGE MEDIUM
CN113082707A (en) Virtual object prompting method and device, storage medium and computer equipment
CN114522423A (en) Virtual object control method and device, storage medium and computer equipment
CN112206517A (en) Rendering method, device, storage medium and computer equipment
WO2024082753A1 (en) Game indicator generation method and apparatus, computer device, and storage medium
WO2024011894A1 (en) Virtual-object control method and apparatus, and storage medium and computer device
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
US20220274017A1 (en) Method and apparatus for displaying virtual scene, terminal, and storage medium
CN115382201A (en) Game control method and device, computer equipment and storage medium
CN113546407A (en) Game picture processing method and device, computer equipment and storage medium
CN114522429A (en) Virtual object control method and device, storage medium and computer equipment
CN114159788A (en) Information processing method, system, mobile terminal and storage medium in game
CN113101661A (en) Accessory assembling method and device, storage medium and computer equipment
CN112843697A (en) Image processing method and device, storage medium and computer equipment
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN115430150A (en) Game skill release method and device, computer equipment and storage medium
CN115970282A (en) Virtual lens control method and device, storage medium and computer equipment
CN115518375A (en) Game word skipping display method and device, computer equipment and storage medium
CN114037783A (en) Animation playing method, device, equipment, readable storage medium and program product
CN115970284A (en) Attack method and device of virtual weapon, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination