CN111494948B - Editing method of game lens, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111494948B
CN111494948B (publication) · CN202010317742.2A (application)
Authority
CN
China
Prior art keywords
waypoint
area
region
identifier
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010317742.2A
Other languages
Chinese (zh)
Other versions
CN111494948A (en
Inventor
杜博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010317742.2A priority Critical patent/CN111494948B/en
Publication of CN111494948A publication Critical patent/CN111494948A/en
Application granted granted Critical
Publication of CN111494948B publication Critical patent/CN111494948B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • A63F13/5252Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An embodiment of the present application provides a method for editing a game lens, in which a graphical user interface is provided through a first terminal device. The method includes: displaying a virtual scene to be edited through the graphical user interface; setting a waypoint region identifier in the virtual scene to be edited in response to a waypoint editing instruction; and determining, in response to a shot editing instruction, initial lens parameters of the virtual camera corresponding to the waypoint region identifier. The waypoint region identifier is configured to trigger the current lens parameters of the virtual character to change to target lens parameters corresponding to the initial lens parameters when the virtual character is within the region corresponding to the identifier. The method achieves fast and efficient three-dimensional lens control in the game: the lens direction is kept controllable, so art developers only need to build scene models and textures for specific directions, which saves rendering performance and achieves the best artistic presentation.

Description

Editing method of game lens, electronic equipment and storage medium
Technical Field
The present application relates to the field of computers, and in particular, to a game lens editing method, an electronic device, and a storage medium.
Background
With the development of computer network technology and mobile device platforms, games with virtual three-dimensional scenes are increasingly popular with players on mobile platforms. In a game, the behavior of the lens directly affects the player's experience. For example, when lens changes accompany the protagonist walking through the game world, the player gets a more realistic sense of that world.
Most three-dimensional games currently on the market control the lens with either a free viewing angle or a fixed viewing angle with a locked direction. If the camera uses a free viewing angle, the art scene must be built with models and textures in all directions, which is expensive in rendering performance. If a fixed viewing angle with a locked direction is used, the expressiveness of the three-dimensional scene is greatly reduced and the scene becomes monotonous.
Disclosure of Invention
In view of the foregoing, the present application provides an editing method for a game lens, an electronic device, and a storage medium that overcome, or at least partially solve, the above problems.
To solve the above problems, an embodiment of the present application discloses a method for editing a game lens, which provides a graphical user interface through a first terminal device. The method includes:
displaying a virtual scene to be edited through the graphical user interface;
setting a waypoint region identifier in the virtual scene to be edited in response to a waypoint editing instruction;
determining, in response to a shot editing instruction, initial lens parameters of the virtual camera corresponding to the waypoint region identifier; the waypoint region identifier is configured to trigger the current lens parameters of the virtual character to change to target lens parameters corresponding to the initial lens parameters when the virtual character is within the region corresponding to the identifier.
Preferably, the step of setting a waypoint region identifier in the virtual scene to be edited in response to the waypoint editing instruction includes:
determining the virtual scene to be edited in a preset waypoint region setting page; the waypoint region setting page comprises an add waypoint region identification control and a save control;
and adding the waypoint region identifier in the virtual scene to be edited in response to the waypoint editing instruction triggered through the add-waypoint-region-identifier control.
Preferably, the step of determining initial lens parameters of the virtual camera corresponding to the waypoint area identifier in response to the lens editing instruction includes:
responding to the shot editing instruction, and setting initial shot parameters for the virtual camera corresponding to the waypoint area identifier;
and responding to a trigger instruction of the storage control, and storing the waypoint area identifier and the initial lens parameters.
Preferably, after the step of saving the waypoint area identifier and the initial lens parameter in response to the trigger instruction for the save control, the method further includes:
responding to a selected operation acting on a virtual scene to be edited, and determining a virtual camera to be edited, wherein the virtual camera to be edited is a virtual camera corresponding to a waypoint area identifier corresponding to the selected operation;
and responding to a camera editing instruction, and adjusting initial lens parameters corresponding to the virtual camera to be edited.
Preferably, the waypoint region identifier comprises at least one of: a gradient waypoint region identifier and a fixed waypoint region identifier, wherein the waypoint region corresponding to the gradient waypoint region identifier is a gradient waypoint region, and the waypoint region corresponding to the fixed waypoint region identifier is a fixed waypoint region. The gradient waypoint region is configured to determine a first target lens parameter corresponding to a first initial lens parameter according to the position of the virtual character when the virtual character is within the gradient waypoint region; the fixed waypoint region is configured to determine a corresponding second target lens parameter according to a second initial lens parameter when the virtual character is within the fixed waypoint region.
Preferably, the gradient waypoint region comprises a first region and a second region: the first region is a circular region centered on the gradient waypoint region identifier, and the second region is the remainder of the gradient waypoint region outside the first region. The fixed waypoint region is a circular region centered on the fixed waypoint region identifier.
Preferably, when the virtual camera to be edited corresponds to the gradient waypoint area identifier, the step of responding to the camera editing instruction and adjusting the initial lens parameters corresponding to the virtual camera to be edited includes:
responding to a first selected operation on the first area in the virtual scene to be edited, and adjusting the circular area range corresponding to the first area;
and responding to a second selected operation on the second area in the virtual scene to be edited, and adjusting the range of the second area.
Preferably, the initial lens parameters include at least one of: the position and orientation of the lens.
Preferably, the waypoint region identifier is configured to determine the position of the virtual character when the virtual character is within the region corresponding to the waypoint region identifier, and to trigger the current lens parameters of the virtual character to change to a second target lens parameter equal to the second initial lens parameter when the virtual character's position is within the fixed waypoint region.
Preferably, the virtual scene to be edited is further provided with default lens parameters, used as the target lens parameters when the virtual character is not within any waypoint region; the waypoint region identifier is configured to trigger the current lens parameters of the virtual character to change to a first target lens parameter calculated from the first initial lens parameter and the default lens parameters if the virtual character's position is within the second region of the gradient waypoint region.
Preferably, the waypoint region identifier is configured to trigger the current lens parameters of the virtual character to change to a first target lens parameter equal to the first initial lens parameter if the virtual character's position is within the first region of the gradient waypoint region.
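The target-parameter rules above (first region: use the initial parameters directly; second region: compute from the initial and default parameters; outside: use the defaults) can be sketched as a distance-based blend. This is a minimal illustration, assuming circular regions in a 2D plane, scalar per-attribute linear interpolation, and dict-based parameter sets; none of these representation details are specified by the patent.

```python
import math

def lerp(a, b, t):
    """Linear interpolation between scalars a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def target_lens_params(char_pos, center, r_inner, r_outer,
                       initial_params, default_params):
    """Blend lens parameters for a gradient waypoint area.

    char_pos / center are (x, y) tuples; initial_params / default_params
    are dicts of scalar lens attributes (e.g. position components, angles).
    """
    d = math.dist(char_pos, center)
    if d <= r_inner:           # first region: first target == first initial
        return dict(initial_params)
    if d >= r_outer:           # outside the gradient area: default params
        return dict(default_params)
    # second region (ring): interpolate between initial and default params
    t = (d - r_inner) / (r_outer - r_inner)
    return {k: lerp(initial_params[k], default_params[k], t)
            for k in initial_params}
```

A character halfway through the ring would thus get parameters halfway between the region's initial parameters and the scene defaults, which matches the "gradual change" behavior the claims describe.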
An embodiment of the present application also provides an electronic device comprising a processor, a memory, and a computer program stored on the memory and runnable on the processor; when executed by the processor, the computer program implements the steps of the above editing method for a game lens.
An embodiment of the present application also provides a computer-readable storage medium storing a computer program that, when executed by a processor, implements the steps of the above editing method for a game lens.
The application has the following advantages:
In the embodiments of the present application, a virtual scene to be edited is displayed through a graphical user interface; a waypoint region identifier is set in the virtual scene in response to a waypoint editing instruction; and initial lens parameters of the virtual camera corresponding to the waypoint region identifier are determined in response to a shot editing instruction. The waypoint region identifier is configured to trigger the current lens parameters of the virtual character to change to target lens parameters corresponding to the initial lens parameters when the virtual character is within the corresponding region. This achieves fast and efficient three-dimensional lens control in the game: the lens direction is kept controllable, so art developers only need to build scene models and textures for specific directions, saving rendering performance; at the same time, the three-dimensional lens changes its pose according to the intended design, avoiding a monotonous effect and achieving the best artistic presentation.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings needed in the description are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the present application, and that other drawings may be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a flowchart of the steps of a method for editing a game lens according to an embodiment of the present application;
FIG. 2 is a flowchart of the steps of another method for editing a game lens according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a waypoint region identifier setting page according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a gradient waypoint region according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a fixed waypoint region according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an attribute information display page of a waypoint region identifier according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a gradient waypoint region intersecting a fixed waypoint region according to an embodiment of the present application;
FIG. 8 is a schematic diagram of two intersecting fixed waypoint regions according to an embodiment of the present application;
FIG. 9 is an editing schematic diagram of a game lens according to an embodiment of the present application.
Detailed Description
So that the above objects, features, and advantages of the present application can be more readily understood, the application is described in further detail below with reference to the accompanying drawings and specific embodiments. It is apparent that the described embodiments are some, but not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
The editing method of the game lens in one embodiment of the present disclosure may run on a terminal device or a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, which comprises the server and a client device.
In an alternative embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking cloud games as an example, a cloud game is a game mode based on cloud computing. In the cloud game mode, the entity that runs the game program is separated from the entity that presents the game picture: the storage and execution of the lens editing method are completed on the cloud game server, while the client device receives and sends data and presents the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer, while the terminal device that performs the information processing is the cloud game server. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to these instructions, encodes and compresses the game pictures and other data, and returns them to the client device over the network; finally, the client device decodes the data and outputs the game pictures.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game picture. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in various ways: for example, the interface may be rendered on the display screen of the terminal, or provided through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes game visuals, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
Referring to fig. 1, a flowchart of steps of an editing method of a game lens according to an embodiment of the present application is shown, and a graphical user interface is provided through a first terminal device. The first terminal device may be the aforementioned local terminal device or a client device. The method specifically comprises the following steps:
step 101, displaying a virtual scene to be edited through the graphical user interface;
the game editing application program is run on the local terminal device or the client device, and a graphical user interface is provided on the local terminal device or the client device, wherein the displayed content of the graphical user interface at least partially comprises a part or all of the virtual scene to be edited, and the specific form of the virtual scene to be edited can be square or other shapes (such as a circle and the like).
By setting the virtual camera in the virtual scene to be edited, the editing of the virtual scene to be edited can be completed, so that the virtual scene to be edited becomes a virtual scene. The virtual camera moves in the virtual scene, providing the user with different perspectives in the virtual scene.
Step 102, setting a waypoint region identifier in the virtual scene to be edited in response to a waypoint editing instruction;
After receiving a waypoint editing instruction, the terminal device can set a waypoint region identifier in the region of the virtual scene to be edited that corresponds to the instruction; each waypoint region identifier has a corresponding range.
Step 103, responding to the shot editing instruction, and determining initial shot parameters of the virtual camera corresponding to the waypoint area identifier; the waypoint region identifier is configured to trigger the current lens parameter corresponding to the virtual character to be changed into the target lens parameter corresponding to the initial lens parameter when the virtual character is in the region corresponding to the waypoint region identifier.
After the waypoint region identifiers are set, the terminal device receives a shot editing instruction. The shot editing instruction corresponds to a waypoint region identifier and carries the initial lens parameters of the corresponding virtual camera, such as the coordinates of the virtual camera in the virtual scene and its orientation. Once the initial lens parameters are set for a waypoint region identifier, whenever the virtual character is within the corresponding region of the virtual scene, the current lens parameters of the virtual character are automatically changed to target lens parameters equal to the initial lens parameters of that waypoint region identifier.
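The run-time trigger described in this step amounts to a per-frame containment check. The sketch below illustrates it under simplifying assumptions: circular waypoint regions in a 2D plane and dict-based area records whose field names (`center`, `radius`, `params`) are invented for this example, not taken from the patent.

```python
import math

def current_lens_params(character_pos, waypoint_areas, default_params):
    """Per-frame check: if the character stands inside a waypoint area,
    switch to that area's initial lens parameters; otherwise fall back
    to the scene's default lens parameters.

    waypoint_areas: list of dicts with 'center' (x, y), 'radius', and
    'params' keys (field names are assumptions for this sketch).
    """
    for area in waypoint_areas:
        if math.dist(character_pos, area['center']) <= area['radius']:
            return area['params']
    return default_params
```

In an engine this check would typically run in the camera update loop, with the returned parameters then applied (or smoothed toward) on the virtual camera.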
In the embodiments of the present application, a virtual scene to be edited is displayed through a graphical user interface; a waypoint region identifier is set in the virtual scene in response to a waypoint editing instruction; and initial lens parameters of the virtual camera corresponding to the waypoint region identifier are determined in response to a shot editing instruction. When the virtual character is within the region corresponding to the waypoint region identifier, the identifier triggers the current lens parameters of the virtual character to change to the target lens parameters corresponding to the initial lens parameters, so the viewing angle of the virtual scene changes with the character's position. This keeps the lens direction controllable, so art developers only need to build scene models and textures for specific directions, which saves rendering performance, avoids a monotonous effect, and achieves the best artistic presentation.
Referring to fig. 2, a flowchart illustrating steps of another method for editing a game lens according to an embodiment of the present application is provided, where a graphical user interface is provided by a first terminal device, which may be the aforementioned local terminal device or a client device. The method specifically comprises the following steps:
step 201, displaying a virtual scene to be edited through the graphical user interface;
Step 201 is similar to step 101; for details, refer to step 101, which are not repeated here.
Step 202, setting a waypoint region identifier in the virtual scene to be edited in response to a waypoint editing instruction;
in a preferred embodiment of the present application, the step 202 further comprises the sub-steps of:
determining the virtual scene to be edited in a preset waypoint region identifier setting page, the page comprising an add-waypoint-region-identifier control and a save control;
and adding the waypoint region identifier in the virtual scene to be edited in response to the waypoint editing instruction triggered through the add-waypoint-region-identifier control.
The developer may select the virtual scene to be edited through a waypoint region identifier setting page preset in the editor. As an example, FIG. 3 shows a waypoint region identifier setting page; the virtual scene for which a waypoint region identifier is to be set can be selected in the virtual scene selection region 301 at the top of the page. After selecting the virtual scene, the developer can input a waypoint editing instruction by triggering the add-waypoint-region-identifier control on the page. As shown in FIG. 3, the functional control region 302 at the bottom of the page provides an add-waypoint-region-identifier control 3021; as an example, the trigger may be a click, and the developer can add a waypoint region identifier to the virtual scene by clicking control 3021. During the addition, a virtual character model is generated in the virtual scene to represent the waypoint region identifier.
Waypoint region identifiers come in two types. The first is the gradient waypoint region identifier: as shown in FIG. 4, it corresponds to a gradient waypoint region consisting of two parts. The first region is the inner circular region centered on the gradient waypoint region identifier; the second region is the ring outside the first region. The other type is the fixed waypoint region identifier, shown in FIG. 5, which corresponds to a circular region centered on the fixed waypoint region identifier. When the virtual character is within a gradient waypoint region, a first target lens parameter is determined according to the first initial lens parameter of the virtual camera set for that region; when the virtual character is within a fixed waypoint region, a second target lens parameter may be determined according to the second initial lens parameter of the virtual camera set for that region.
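The two region types can be captured in simple data structures. This is only a sketch: the class and field names, the 2D center tuples, and the yaw/pitch decomposition of the orientation are illustrative assumptions, since the patent does not prescribe a data layout.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LensParams:
    x: float                      # lens position in the scene
    y: float
    z: float
    yaw: float                    # horizontal orientation angle
    pitch: float                  # vertical elevation angle

@dataclass
class FixedWaypointArea:
    center: Tuple[float, float]   # position of the area identifier
    radius: float                 # the whole circle uses these params
    initial_params: LensParams    # the "second initial lens parameters"

@dataclass
class GradientWaypointArea:
    center: Tuple[float, float]
    inner_radius: float           # first region: params applied directly
    outer_radius: float           # second region (ring): blended with defaults
    initial_params: LensParams    # the "first initial lens parameters"
```

Keeping the two types distinct mirrors the patent's distinction: a fixed area applies its parameters over its entire circle, while a gradient area only applies them directly in the inner circle.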
Step 203, determining initial lens parameters of the virtual camera corresponding to the waypoint area identifier in response to the lens editing instruction;
in a preferred embodiment of the application, said step 203 further comprises the sub-steps of:
responding to the shot editing instruction, and setting initial shot parameters for the virtual camera corresponding to the waypoint area identifier;
and responding to a trigger instruction of the storage control, and storing the waypoint area identifier and the initial lens parameters.
After a waypoint region identifier is added, the developer can set initial lens parameters for the corresponding virtual camera in the virtual scene, according to the viewing angles required when the virtual character is at different positions in the scene. The initial lens parameters may include the position and orientation of the lens. As an example, the position of the lens may be coordinates (x, y, z), and the orientation may be the lens's horizontal angle and vertical elevation angle; by setting different initial lens parameters, the position and angle of the lens can be changed, displaying the virtual scene from different viewing angles. Meanwhile, the range over which the initial lens parameters take effect differs between the two types of waypoint region identifiers: the second initial lens parameters, corresponding to a fixed waypoint region identifier, act over the whole fixed waypoint region, while the first initial lens parameters, corresponding to a gradient waypoint region identifier, act only within the first region of the gradient waypoint region.
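To make the position-plus-orientation parameterization concrete, the sketch below converts a horizontal angle and a vertical elevation angle into a camera forward vector. The Y-up axis convention is an assumption for illustration; the patent does not fix a coordinate convention.

```python
import math

def camera_forward(yaw_deg, pitch_deg):
    """Unit forward vector of a lens from its horizontal angle (yaw)
    and vertical elevation angle (pitch), using a Y-up convention
    (the convention is an assumption, not from the patent)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))
```

Together with the (x, y, z) position, such a vector is enough for an engine to place and aim the virtual camera for each waypoint region.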
After the initial lens parameters are set in the virtual scene, the waypoint region identifier setting page can display the attribute information of the waypoint region identifier. As shown in FIG. 6, the page displays each item of attribute information. It includes a default lens offset setting region 401 for setting default lens parameters: when the virtual character is not within any region corresponding to a waypoint region identifier, the default lens parameters are used as the character's current lens parameters and determine the displayed viewing angle. The waypoint region identifier list region 402 displays all currently added waypoint region identifiers, and the developer can select one from the list with a click. The waypoint region identifier attribute display region 403 shows the detailed data of the currently selected identifier, such as its coordinates and the name of the virtual scene being edited.
The waypoint region identifier setting page also includes a save control; the developer can save the added waypoint region identifiers and the configured initial lens parameters by triggering it. As shown in FIG. 3, the save control 3022 is located in the functional control region 302. As an example, the trigger may be a click: after finishing adding waypoint region identifiers and setting initial lens parameters, the developer clicks the save control to save them.
In addition, the waypoint region identifier setting page may include: a mode switching control for switching between an editing mode and a testing mode, where the editing mode is used for adding waypoint region identifiers, setting initial lens parameters, and similar operations, and the testing mode is used for testing the added identifiers and configured parameters to check the effect at run time; a delete control for deleting an added waypoint region identifier; a list refresh control for refreshing the waypoint region identifier list to show the latest entries; a hide/show control for hiding the waypoint region identifiers in the virtual scene; and a default lens parameter lock control for locking the default lens parameters in the virtual scene to avoid accidental changes. The developer can also copy, multi-select, and cut waypoint region identifiers and initial lens parameters in the setting page and the virtual scene through preset operations. Table 1 gives an example of keyboard-and-mouse operations for an embodiment of the application; those skilled in the art may of course adopt other operation schemes, and the application is not limited in this respect.
TABLE 1
Step 204, determining a virtual camera to be edited in response to a selected operation acting on a virtual scene to be edited, wherein the virtual camera to be edited is a virtual camera corresponding to a waypoint area identifier corresponding to the selected operation;
After the setting page has saved the added waypoint region identifiers and the initial lens parameters of the corresponding virtual cameras, the developer can also enter the virtual scene and select the virtual camera corresponding to an added waypoint region identifier.
Step 205, responding to a camera editing instruction, and adjusting initial lens parameters corresponding to the virtual camera to be edited;
Specifically, the developer may adjust the initial lens parameters of a virtual camera by operating on the added waypoint region identifier in the virtual scene; Table 2 shows example operations for adjusting the initial lens parameters. These operations are also suitable for adjusting the default lens parameters; those skilled in the art may likewise adopt other operation schemes as needed, and the application is not limited in this respect.
TABLE 2
In a preferred embodiment of the present application, when the virtual camera to be edited corresponds to the gradual change waypoint area identifier, the step 205 includes the following sub-steps:
responding to a first selected operation on the first area in the virtual scene to be edited, and adjusting the circular area range corresponding to the first area;
and responding to a second selected operation on the second area in the virtual scene to be edited, and adjusting the range of the second area.
When the region range corresponding to a gradual change waypoint area identifier needs to be adjusted, the developer may select, by a trigger operation such as clicking, the first area of the gradual change waypoint area corresponding to that identifier in the virtual scene to be edited, and then adjust the radius of the first area with the mouse wheel while holding the ctrl key. Similarly, the developer may select the second area of the gradual change waypoint area and then adjust the outer radius of the gradual change waypoint area with the mouse wheel while holding the ctrl key, thereby adjusting the range of the second area.
In addition, in order to make the visual angle change of the virtual scene smoother and reduce the complexity of determining the target lens parameters, the embodiment of the application sets the priority of the fixed waypoint area identifier as the highest priority. That is, when the virtual character is located at the intersection of a fixed waypoint area and a gradual change waypoint area, the initial lens parameters corresponding to the fixed waypoint area identifier with the higher priority are determined as the target lens parameters; as shown in fig. 7, the left side is a gradual change waypoint area, the right side is a fixed waypoint area, and there is an intersection region between the two. In addition, gradual change waypoint areas may not intersect one another, while fixed waypoint areas may intersect, provided that the initial lens parameters of the fixed waypoint area identifiers whose areas intersect are the same; fig. 8 shows two intersecting fixed waypoint area identifiers. It should be noted that the priority may be set according to the specific requirements of the developer; for example, the priority of the gradual change waypoint area identifier may instead be set as the highest priority, which is not limited by the present application.
In a preferred embodiment of the present application, the waypoint area identifier is configured to determine a position of the virtual character when the virtual character is in an area corresponding to the waypoint area identifier, and trigger to change a current lens parameter corresponding to the virtual character to a second target lens parameter equal to the second initial lens parameter when the position of the virtual character is in the fixed waypoint area.
The coordinates of the position of the virtual character in the virtual scene are determined, all the fixed waypoint areas in the virtual scene are traversed, and whether the coordinates are within a fixed waypoint area is judged; if so, the initial lens parameters of the corresponding fixed waypoint area are determined as the target lens parameters.
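The fixed waypoint lookup described above reduces to a point-in-circle test over all fixed waypoint areas. The following is a minimal sketch; names such as `FixedWaypoint` and `find_fixed_region` are illustrative assumptions, not taken from the application:

```python
import math
from dataclasses import dataclass

@dataclass
class FixedWaypoint:
    cx: float          # center x of the fixed waypoint area
    cy: float          # center y of the fixed waypoint area
    radius: float      # radius of the circular area
    lens_params: dict  # initial lens parameters set in the editor

def find_fixed_region(x, y, regions):
    """Return the first fixed waypoint area containing (x, y), else None."""
    for r in regions:
        # the character is inside a circular area when its distance
        # to the center does not exceed the radius
        if math.hypot(x - r.cx, y - r.cy) <= r.radius:
            return r
    return None

regions = [FixedWaypoint(0, 0, 5, {"fov": 60}),
           FixedWaypoint(20, 0, 3, {"fov": 40})]
hit = find_fixed_region(1, 2, regions)     # inside the first area
miss = find_fixed_region(10, 10, regions)  # in no fixed waypoint area
```

If intersecting fixed areas carry identical initial lens parameters, as the embodiment requires, returning the first containing area is sufficient.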
In a preferred embodiment of the present application, the virtual scene to be edited is further provided with a default lens parameter, where the default lens parameter is used to determine the default lens parameter as a target lens parameter when the virtual character is not located in the waypoint area, and the waypoint area identifier is configured to trigger to change the current lens parameter corresponding to the virtual character to the first target lens parameter calculated by using the first initial lens parameter and the default lens parameter if the position of the virtual character is located in the second area in the gradual change waypoint area.
When the coordinates of the virtual character are not within any fixed waypoint region, all the gradual change waypoint region identifiers in the virtual scene are traversed, and whether the coordinates are within the circular ring region of a corresponding gradual change waypoint region is judged; if so, the target lens parameters are calculated according to an interpolation algorithm using the initial lens parameters corresponding to the inner circle of that gradual change waypoint region and the default lens parameters. Specifically, assuming that the initial lens parameter corresponding to the inner circle is Pb, the default lens parameter is Pa, the distance between the virtual character and the center of the inner circle is h, the radius of the inner circle is R1, the outer radius of the gradual change waypoint region is R2, and the target lens parameter is P, the interpolation may take, for example, the linear form P = Pb + (Pa − Pb) × (h − R1) / (R2 − R1), so that P equals Pb at the boundary of the inner circle (h = R1) and equals Pa at the outer boundary (h = R2).
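Under the assumption that the interpolation is linear in the distance h (the application only states that an interpolation algorithm is used; the linear form and the function name below are assumptions), the ring-region calculation can be sketched as:

```python
def interpolate_lens_param(pb, pa, h, r1, r2):
    """Blend the inner-circle parameter pb toward the default parameter pa
    as the character moves outward through the ring region (r1 <= h <= r2)."""
    t = (h - r1) / (r2 - r1)
    # clamp so positions slightly outside the ring still yield valid values
    t = max(0.0, min(1.0, t))
    return pb + (pa - pb) * t

# at the inner-circle boundary the target equals the inner-circle parameter
print(interpolate_lens_param(30.0, 60.0, 1.0, 1.0, 5.0))  # 30.0
# halfway through the ring the target is halfway between the two
print(interpolate_lens_param(30.0, 60.0, 3.0, 1.0, 5.0))  # 45.0
# at the outer boundary the target equals the default parameter
print(interpolate_lens_param(30.0, 60.0, 5.0, 1.0, 5.0))  # 60.0
```

Applying the same blend independently to each lens parameter (position, orientation, field of view) yields the smooth visual-angle transition the embodiment describes.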
in a preferred embodiment of the present application, the waypoint area identifier is configured to trigger to change a current lens parameter corresponding to the virtual character to a first target lens parameter equal to the first initial lens parameter if the position of the virtual character is within the first area in the gradual waypoint area.
When the coordinates are not within the second area of a gradual change waypoint area, whether the coordinates are within the inner circle area of the gradual change waypoint area is judged; if so, the initial lens parameters corresponding to the inner circle area are determined as the target lens parameters. If not, the coordinates are judged not to be within any waypoint area, and the default lens parameters are determined as the target lens parameters.
In addition, when the developer sets the gradual change waypoint area identifier to the highest priority, the target lens parameters may be determined as follows:
determining the position of the virtual character;
traversing all the gradual change waypoint area identifiers in the virtual scene, and judging whether the position is within the circular ring area of a gradual change waypoint area;
if yes, calculating the target lens parameters based on an interpolation algorithm using the initial lens parameters corresponding to the inner circle area of the gradual change waypoint area and the default lens parameters;
if not, judging whether the position is within the inner circle area of the gradual change waypoint area;
if yes, determining the initial lens parameters corresponding to the inner circle area of the gradual change waypoint area as the target lens parameters;
if not, judging whether the position is within a fixed waypoint area;
if yes, determining the initial lens parameters corresponding to the fixed waypoint area as the target lens parameters;
if not, determining the default lens parameters as target lens parameters.
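The steps above (gradual change waypoint areas checked first, i.e. given the highest priority) can be sketched as a single lookup function. The dictionary layout, field names, and linear interpolation are illustrative assumptions:

```python
import math

def target_lens_params(pos, gradient_regions, fixed_regions, default_params):
    """Resolve the target lens parameters for a character position (x, y).

    gradient_regions: dicts with keys cx, cy, r1 (inner radius),
        r2 (outer radius), inner_params
    fixed_regions: dicts with keys cx, cy, radius, params
    Gradual change areas are checked before fixed areas in this variant.
    """
    x, y = pos
    for g in gradient_regions:
        h = math.hypot(x - g["cx"], y - g["cy"])
        if g["r1"] < h <= g["r2"]:
            # circular ring area: interpolate between inner and default
            t = (h - g["r1"]) / (g["r2"] - g["r1"])
            return {k: v + (default_params[k] - v) * t
                    for k, v in g["inner_params"].items()}
        if h <= g["r1"]:
            # inner circle area: use the inner-circle parameters directly
            return dict(g["inner_params"])
    for f in fixed_regions:
        if math.hypot(x - f["cx"], y - f["cy"]) <= f["radius"]:
            return dict(f["params"])
    # not within any waypoint area
    return dict(default_params)

gradient = [{"cx": 0, "cy": 0, "r1": 2, "r2": 4,
             "inner_params": {"fov": 30.0}}]
fixed = [{"cx": 10, "cy": 0, "radius": 2, "params": {"fov": 50.0}}]
defaults = {"fov": 60.0}
```

Swapping the two loops yields the fixed-highest-priority variant described earlier; the rest of the logic is unchanged.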
By applying the embodiment of the application, the setting page is used for setting the waypoint area identifiers and the initial lens parameters; the relation between the position of the virtual character in the virtual scene and the waypoint areas is judged, the target lens parameters are confirmed according to the judgment result, and the visual angle of the virtual scene is determined based on the target lens parameters. In this way, the visual angle of the virtual scene can change dynamically with the position of the virtual character, so that the virtual scene is better displayed while the consumption of rendering resources is reduced.
In order to enable those skilled in the art to better understand the effects achieved by the present application, the present application is illustrated by way of an example which sets the fixed waypoint region identifier to the highest priority, but it should be understood that the present application is not limited thereto.
As shown in fig. 9, for a virtual scene in which the waypoint area identifiers have been set, when the virtual character moves from outside the gradual change waypoint area identifier 501 toward its center, the visual angle of the virtual scene changes according to the change of the target lens parameters. When the virtual character is located in an area above the gradual change waypoint area identifier 501 that does not belong to any waypoint area, the target lens parameters are the default lens parameters. When the virtual character moves into the circular ring area of the gradual change waypoint area identifier 501 and is neither within the inner circle area nor within the fixed waypoint area identifiers 502 and 503, the target lens parameters are calculated through the interpolation algorithm according to the initial lens parameters corresponding to the inner circle area of the gradual change waypoint area identifier 501, the default lens parameters, and the distance from the center of the gradual change waypoint area identifier 501. When the virtual character enters the inner circle area from the circular ring area of the gradual change waypoint area identifier 501, the target lens parameters become the initial lens parameters corresponding to the inner circle area. If the virtual character then proceeds to the left and enters a position where the fixed waypoint area identifier 502 intersects the circular ring area of the gradual change waypoint area identifier 501, the target lens parameters become the initial lens parameters corresponding to the fixed waypoint area identifier 502, because the fixed waypoint area has the highest priority.
The virtual character then moves forward, and the target lens parameters remain unchanged until the virtual character leaves the fixed waypoint area identifier 502, at which point it is located in the gradual change waypoint area identifier 504. Since the radius of the inner circle of the gradual change waypoint area identifier 504 equals the radius of the waypoint area, this area has no circular ring area, and the target lens parameters are the initial lens parameters corresponding to the inner circle area of the gradual change waypoint area identifier 504. As the virtual character moves, the target lens parameters keep changing according to the waypoint areas, so that the visual angle of the virtual scene changes continuously with the movement of the virtual character, presenting an excellent visual effect.
An embodiment of the present application also provides an electronic device that may include a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the editing method of a game lens as described above.
An embodiment of the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the editing method of a game lens as described above.
In this specification, each embodiment is described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and identical or similar parts between the embodiments may be referred to one another.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the application may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the application.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article or terminal device comprising the element.
The above description of the editing method, electronic device and storage medium for game lens provided by the present application applies specific examples to illustrate the principles and embodiments of the present application, and the above examples are only used to help understand the method and core ideas of the present application; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in accordance with the ideas of the present application, the present description should not be construed as limiting the present application in view of the above.

Claims (13)

1. A method for editing a game shot by providing a graphical user interface through a first terminal device, the method comprising:
displaying a virtual scene to be edited through the graphical user interface;
responding to a waypoint editing instruction, and setting a waypoint region identifier in the virtual scene to be edited; each road point area identifier has a corresponding area range;
responding to a lens editing instruction, and determining initial lens parameters of the virtual camera corresponding to the waypoint area identifier; the waypoint region identifier is configured to trigger the current lens parameter corresponding to the virtual character to be changed into the target lens parameter corresponding to the initial lens parameter when the virtual character is in the region corresponding to the waypoint region identifier;
and after the preset waypoint region identifier setting page stores the added waypoint region identifier and the corresponding initial lens parameters of the virtual camera, operating the added waypoint region identifier in the virtual scene to be edited so as to adjust the corresponding initial lens parameters of the virtual camera.
2. The method according to claim 1, wherein the step of setting a waypoint region identifier in the virtual scene to be edited in response to a waypoint editing instruction comprises:
determining the virtual scene to be edited in a preset waypoint region setting page; the waypoint region setting page comprises an add waypoint region identification control and a save control;
and adding the waypoint region identification in the virtual scene to be edited in response to the waypoint editing instruction of the control for adding the waypoint region identification.
3. The method of claim 2, wherein the step of determining initial lens parameters of the virtual camera corresponding to the waypoint region identification in response to the lens editing instructions comprises:
responding to the shot editing instruction, and setting initial shot parameters for the virtual camera corresponding to the waypoint area identifier;
and responding to a trigger instruction of the storage control, and storing the waypoint area identifier and the initial lens parameters.
4. The method of claim 3, wherein after the step of saving the waypoint region identifier and the initial lens parameters in response to a trigger instruction to the save control, further comprising:
responding to a selected operation acting on a virtual scene to be edited, and determining a virtual camera to be edited, wherein the virtual camera to be edited is a virtual camera corresponding to a waypoint area identifier corresponding to the selected operation;
and responding to a camera editing instruction, and adjusting initial lens parameters corresponding to the virtual camera to be edited.
5. The method of claim 1 or 2 or 3 or 4, wherein the waypoint region identifier comprises at least one of: a gradual change waypoint region identifier and a fixed waypoint region identifier, wherein the waypoint region corresponding to the gradual change waypoint region identifier is a gradual change waypoint region, and the waypoint region corresponding to the fixed waypoint region identifier is a fixed waypoint region; the gradual change waypoint region is configured to determine a first target lens parameter corresponding to a first initial lens parameter according to the position of the virtual character when the virtual character is within the gradual change waypoint region; the fixed waypoint region is configured to determine a corresponding second target lens parameter according to a second initial lens parameter when the virtual character is within the fixed waypoint region.
6. The method of claim 5, wherein the gradual change waypoint region comprises a first region and a second region; the first region is a circular region centered on the gradual change waypoint region identifier; the second region is the remaining region of the gradual change waypoint region other than the first region; the fixed waypoint region is a circular region centered on the fixed waypoint region identifier.
7. The method of claim 4, wherein the waypoint region identifier comprises a gradual change waypoint region identifier, the gradual change waypoint region comprising a first region and a second region, the first region being a circular region centered on the gradual change waypoint region identifier; the second region is the remaining region of the gradual change waypoint region other than the first region; when the virtual camera to be edited corresponds to the gradual change waypoint region identifier, the step of adjusting, in response to a camera editing instruction, the initial lens parameters corresponding to the virtual camera to be edited comprises:
responding to a first selected operation on the first area in the virtual scene to be edited, and adjusting the circular area range corresponding to the first area;
and responding to a second selected operation on the second area in the virtual scene to be edited, and adjusting the range of the second area.
8. The method of claim 1 or 2, wherein the initial lens parameters include at least one of: the position and orientation of the lens.
9. The method of claim 6, wherein the waypoint region identifier is configured to determine a location of the virtual character when the virtual character is within a region to which the waypoint region identifier corresponds, and trigger a change of a current lens parameter corresponding to the virtual character to a second target lens parameter equal to the second initial lens parameter when the location of the virtual character is within the fixed waypoint region.
10. The method according to claim 9, wherein the virtual scene to be edited is further provided with a default lens parameter, the default lens parameter is used for determining the default lens parameter as a target lens parameter when the virtual character is not located in any waypoint region, and the waypoint region identifier is configured to trigger changing a current lens parameter corresponding to the virtual character to the first target lens parameter calculated using the first initial lens parameter and the default lens parameter if the position of the virtual character is within the second region in the gradual change waypoint region.
11. The method of claim 10, wherein the waypoint region identifier is configured to trigger changing a current lens parameter corresponding to the virtual character to a first target lens parameter equal to the first initial lens parameter if the position of the virtual character is within the first region in the gradual change waypoint region.
12. An electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program implementing the steps of the method of editing a game lens according to any one of claims 1 to 11 when executed by the processor.
13. A computer-readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the steps of the method of editing a game lens according to any one of claims 1 to 11.
CN202010317742.2A 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium Active CN111494948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010317742.2A CN111494948B (en) 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111494948A CN111494948A (en) 2020-08-07
CN111494948B true CN111494948B (en) 2023-11-17

Family

ID=71876273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010317742.2A Active CN111494948B (en) 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111494948B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112738393B (en) * 2020-12-25 2022-08-09 珠海西山居移动游戏科技有限公司 Focusing method and device
CN115866224A (en) * 2022-11-25 2023-03-28 中国联合网络通信集团有限公司 Scene switching method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110694271A (en) * 2019-10-21 2020-01-17 网易(杭州)网络有限公司 Camera attitude control method and device in game scene and electronic equipment



Similar Documents

Publication Publication Date Title
US11099654B2 (en) Facilitate user manipulation of a virtual reality environment view using a computing device with a touch sensitive surface
US11467721B2 (en) Augmented reality for the Internet of Things
US11684858B2 (en) Supplemental casting control with direction and magnitude
US8249263B2 (en) Method and apparatus for providing audio motion feedback in a simulated three-dimensional environment
CN111298431B (en) Construction method and device in game
JP2023517917A (en) VIRTUAL SCENE DISPLAY METHOD, APPARATUS, DEVICE, AND COMPUTER PROGRAM
CA2925906A1 (en) Three-dimensional (3d) browsing
CN111494948B (en) Editing method of game lens, electronic equipment and storage medium
EP2950274B1 (en) Method and system for generating motion sequence of animation, and computer-readable recording medium
US11893696B2 (en) Methods, systems, and computer readable media for extended reality user interface
CN114443945A (en) Display method of application icons in virtual user interface and three-dimensional display equipment
TW202217541A (en) Location adjusting method, device, equipment, storage medium, and program product for virtual buttons
CN112051956A (en) House source interaction method and device
CN109814867B (en) Virtual model building method and system
Jing Design and implementation of 3D virtual digital campus-based on unity3d
US10726621B2 (en) Traversal selection of components for a geometric model
CN116501209A (en) Editing view angle adjusting method and device, electronic equipment and readable storage medium
CN111467799A (en) Coordinate conversion method and device, electronic equipment and storage medium
CN112169313B (en) Game interface setting method and device, electronic equipment and storage medium
CN115129224A (en) Movement control method and device, storage medium and electronic equipment
KR102618644B1 (en) Method and apparatus for generating composite image using 3d model
CN117414584A (en) Editing method and device for scene component in game, electronic equipment and medium
CN115904192A (en) Interface display method and device, electronic equipment and readable storage medium
CN117959704A (en) Virtual model placement method and device, electronic equipment and readable storage medium
CN118113186A (en) Panoramic roaming method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant