CN111494948A - Game lens editing method, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111494948A
CN111494948A
Authority
CN
China
Prior art keywords
waypoint
area
virtual
region
edited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010317742.2A
Other languages
Chinese (zh)
Other versions
CN111494948B (en)
Inventor
杜博 (Du Bo)
Current Assignee (the listed assignees may be inaccurate)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (an assumption and not a legal conclusion)
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202010317742.2A
Publication of CN111494948A
Application granted
Publication of CN111494948B
Current legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525: Changing parameters of virtual cameras
    • A63F13/5252: Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An embodiment of the invention provides a game lens editing method in which a graphical user interface is provided through a first terminal device. The method comprises: displaying a virtual scene to be edited through the graphical user interface; in response to a waypoint editing instruction, setting a waypoint area identifier in the virtual scene to be edited; and in response to a lens editing instruction, determining initial lens parameters of the virtual camera corresponding to the waypoint area identifier. The waypoint area identifier is configured to trigger, when a virtual character is in the area corresponding to the identifier, a change of the character's current lens parameters to target lens parameters corresponding to the initial lens parameters. The method realizes fast, efficient, what-you-see-is-what-you-get control of the three-dimensional camera in the game: the lens direction remains controllable, art developers only need to make scene models and textures for specific directions, rendering performance is saved, and optimal artistic presentation is achieved.

Description

Game lens editing method, electronic equipment and storage medium
Technical Field
The present invention relates to the field of computers, and in particular to a game lens editing method, an electronic device, and a storage medium.
Background
With the development of computer network technology and of mobile device platforms, games with virtual three-dimensional scenes are increasingly popular with players on mobile platforms. In a game, the behavior of the camera directly affects the player's experience: if camera changes accompany the player character as it walks through the game world, the game gives players an experience closer to the real world.
Most three-dimensional games currently on the market control the camera with either a free view angle or a fixed, direction-locked view angle. With a free view angle, the art scene must be built with models and textures for every direction, which is costly in rendering performance. With a fixed, direction-locked view angle, the expressiveness of the three-dimensional scene is greatly reduced and the scene presentation becomes monotonous.
Disclosure of Invention
In view of the above problems, the present invention is proposed to provide a game lens editing method, an electronic device, and a storage medium that overcome or at least partially solve these problems.
To solve the above problems, an embodiment of the present invention discloses a game lens editing method in which a graphical user interface is provided through a first terminal device, the method comprising:
displaying a virtual scene to be edited through the graphical user interface;
responding to a waypoint editing instruction, and setting waypoint area identifications in the virtual scene to be edited;
responding to the lens editing instruction, and determining initial lens parameters of the virtual camera corresponding to the waypoint area identification; and the waypoint area identifier is configured to trigger the current lens parameters corresponding to the virtual character to be changed into the target lens parameters corresponding to the initial lens parameters when the virtual character is in the area corresponding to the waypoint area identifier.
Preferably, the step of setting a waypoint area identifier in the virtual scene to be edited in response to the waypoint editing instruction includes:
determining the virtual scene to be edited in a preset waypoint area setting page; the waypoint area setting page comprises a waypoint area identification adding control and a saving control;
and responding to a waypoint editing instruction of the waypoint area identification adding control, and adding waypoint area identifications in the virtual scene to be edited.
Preferably, the step of determining initial lens parameters of the virtual camera corresponding to the waypoint region identifier in response to the lens editing instruction includes:
responding to the lens editing instruction, and setting initial lens parameters for the virtual camera corresponding to the waypoint area identification;
and responding to a trigger instruction of the saving control, and saving the waypoint area identification and the initial lens parameter.
Preferably, after the step of saving the waypoint area identifier and the initial lens parameter in response to the trigger instruction to the saving control, the method further includes:
responding to a selected operation acting on a virtual scene to be edited, and determining a virtual camera to be edited, wherein the virtual camera to be edited is a virtual camera corresponding to a waypoint area identifier corresponding to the selected operation;
and responding to a camera editing instruction, and adjusting the initial lens parameters corresponding to the virtual camera to be edited.
Preferably, the waypoint area identifications include at least one of a gradual change waypoint area identification and a fixed waypoint area identification, wherein the waypoint area corresponding to the gradual change waypoint area identification is a gradual change waypoint area and the waypoint area corresponding to the fixed waypoint area identification is a fixed waypoint area; the gradual change waypoint area is configured to determine, when the virtual character is in it, a first target shot parameter corresponding to a first initial shot parameter according to the position of the virtual character; and the fixed waypoint area is configured to determine, when the virtual character is in it, a corresponding second target shot parameter according to the second initial shot parameter.
Preferably, the gradual change waypoint region comprises a first region and a second region; the first region is a circular region centered on the gradual change waypoint region identification; the second region is the remaining part of the gradual change waypoint region outside the first region; and the fixed waypoint region is a circular region centered on the fixed waypoint region identification.
Preferably, when the virtual camera to be edited corresponds to the gradual change waypoint area identifier, the step of adjusting the initial lens parameters corresponding to the virtual camera to be edited in response to the camera editing instruction includes:
responding to a first selected operation on the first area in the virtual scene to be edited, and adjusting the circular area range corresponding to the first area;
and responding to a second selected operation on the second area in the virtual scene to be edited, and adjusting the range of the second area.
Preferably, the initial lens parameters include at least one of: the position and orientation of the lens.
Preferably, the waypoint area identifier is configured to determine the position of the virtual character when the virtual character is located in the area corresponding to the waypoint area identifier, and trigger to change the current lens parameter corresponding to the virtual character into a second target lens parameter equal to the second initial lens parameter when the position of the virtual character is located in the fixed waypoint area.
Preferably, the virtual scene to be edited is further provided with default shot parameters, the default shot parameters are used to determine the default shot parameters as target shot parameters when the virtual character is not located in the waypoint region, and the waypoint region identifier is configured to trigger to change the current shot parameters corresponding to the virtual character into the first target shot parameters calculated by using the first initial shot parameters and the default shot parameters if the position of the virtual character is located in the second region in the gradual change waypoint region.
Preferably, the waypoint area identifier is configured to trigger to change the current shot parameter corresponding to the virtual character into a first target shot parameter equal to the first initial shot parameter if the position of the virtual character is in the first area in the gradual change waypoint area.
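The parameter selection described in the two preceding paragraphs can be sketched in Python as follows. The patent does not disclose the formula used to calculate the first target shot parameters in the second (ring) region; the linear blend by normalized radial distance below is an illustrative assumption, and all names are hypothetical:

```python
import math

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def fade_target_params(char_pos, center, r_inner, r_outer, initial, default):
    """Target lens parameters for a character inside a gradual change
    waypoint area. Inside the first (inner circle) area the target equals
    the first initial parameters; in the second (ring) area it is blended
    with the default parameters (assumed linear blend, not from the patent).
    initial/default are dicts such as {"x": ..., "yaw": ..., "pitch": ...}."""
    d = math.dist(char_pos, center)
    if d <= r_inner:
        return dict(initial)  # first area: target equals the initial parameters
    t = min((d - r_inner) / (r_outer - r_inner), 1.0)  # 0 at inner edge, 1 at outer
    return {k: lerp(initial[k], default[k], t) for k in initial}
```

With this sketch, a character midway through the ring receives parameters halfway between the first initial parameters and the default parameters.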
An embodiment of the invention also provides an electronic device comprising a processor, a memory, and a computer program stored in the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the game lens editing method described above.
An embodiment of the invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the game lens editing method described above.
The invention has the following advantages:
in the embodiment of the invention, a virtual scene to be edited is displayed through a graphical user interface; in response to a waypoint editing instruction, a waypoint area identifier is set in the virtual scene to be edited; and in response to a lens editing instruction, initial lens parameters of the virtual camera corresponding to the waypoint area identifier are determined. The waypoint area identifier is configured to trigger, when the virtual character is in the area corresponding to the identifier, a change of the character's current lens parameters to the target lens parameters corresponding to the initial lens parameters. This realizes fast, efficient, what-you-see-is-what-you-get control of the three-dimensional camera in the game: the lens direction is guaranteed to be controllable, art developers only need to make scene models and textures for specific directions, rendering performance is saved, the three-dimensional camera changes its posture according to the intended design, the drawback of a monotonous effect is avoided, and optimal artistic presentation is achieved.
Drawings
To more clearly illustrate the technical solution of the present invention, the drawings needed in its description are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flowchart illustrating the steps of a game lens editing method according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of another method for editing game shots according to an embodiment of the invention;
FIG. 3 is a schematic diagram of a waypoint area identifier setting page provided in an embodiment of the present invention;
FIG. 4 is a schematic diagram of a gradual waypoint region provided by an embodiment of the invention;
FIG. 5 is a schematic diagram of a fixed waypoint region provided by an embodiment of the invention;
FIG. 6 is a schematic diagram of an attribute information display page of a waypoint area identifier according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an intersection of a gradual waypoint region and a fixed waypoint region provided by an embodiment of the invention;
FIG. 8 is a schematic diagram of a fixed waypoint region intersecting a fixed waypoint region according to an embodiment of the invention;
FIG. 9 is a schematic diagram illustrating the editing of a game lens according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The game lens editing method in one embodiment of the present disclosure may be executed on a terminal device or a server. The terminal device may be a local terminal device. When the method runs on the server, it can be implemented and executed based on a cloud interactive system, which comprises the server and a client device.
In an optional embodiment, various cloud applications, for example cloud games, may run under the cloud interaction system. Taking a cloud game as an example: a cloud game is a game mode based on cloud computing. In this mode, the entity that runs the game program is separated from the entity that presents the game picture. The storage and execution of the game lens editing method are completed on a cloud game server, while the client device receives and sends data and presents the game picture; the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer, whereas the terminal device that performs the information processing is the cloud game server. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses data such as game pictures, and returns them over the network to the client device, which finally decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game picture, and interacts with the player through a graphical user interface; that is, the game program is downloaded, installed, and run on the electronic device in the conventional way. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, it may be rendered on the terminal's display screen, or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the screen.
Referring to FIG. 1, a flowchart of the steps of a game lens editing method provided by an embodiment of the present invention is shown, in which a graphical user interface is provided by a first terminal device. The first terminal device may be the aforementioned local terminal device or a client device. The method may specifically comprise the following steps:
step 101, displaying a virtual scene to be edited through the graphical user interface;
a game editing application program runs on the local terminal device or the client device, which provides the graphical user interface. The content displayed by the graphical user interface includes at least part or all of the virtual scene to be edited, and the virtual scene may be square or of another shape (e.g., circular).
By setting virtual cameras in the virtual scene to be edited, the editing of the scene can be completed, turning the virtual scene to be edited into the game's virtual scene. A virtual camera moves in the virtual scene and provides the user with different perspectives within it.
Step 102, responding to a waypoint editing instruction, and setting waypoint area identifications in the virtual scene to be edited;
after receiving a waypoint editing instruction, the terminal device may set a waypoint area identifier, according to the instruction, in the corresponding area of the virtual scene to be edited; each waypoint area identifier has a corresponding range.
Step 103, responding to the lens editing instruction, and determining initial lens parameters of the virtual camera corresponding to the waypoint area identification; and the waypoint area identifier is configured to trigger the current lens parameters corresponding to the virtual character to be changed into the target lens parameters corresponding to the initial lens parameters when the virtual character is in the area corresponding to the waypoint area identifier.
After the waypoint area identifiers are set, the terminal device receives a lens editing instruction, the lens editing instruction corresponds to each waypoint area identifier and carries initial lens parameters of a corresponding virtual camera, such as coordinates of the virtual camera in a virtual scene to be edited, the orientation of the virtual camera and the like. After the initial lens parameters of the virtual camera are set for the waypoint area identifications, when the virtual character is located in the area corresponding to the waypoint area in the virtual scene, the current lens parameters corresponding to the virtual character are automatically triggered to be changed into the target lens parameters, and the target lens parameters are equal to the initial lens parameters of the waypoint area identifications.
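As a minimal sketch of this trigger, assuming circular waypoint areas and dict-based lens parameters (the names and data layout below are illustrative, not from the patent):

```python
import math

def current_target_params(char_pos, waypoints, default_params):
    """Pick the target lens parameters for the character's position.
    waypoints: list of (center, radius, initial_params) tuples standing in
    for waypoint area identifiers and their corresponding ranges."""
    for center, radius, initial in waypoints:
        if math.dist(char_pos, center) <= radius:
            # Character is inside this waypoint area: the target parameters
            # equal the initial parameters set for its identifier.
            return initial
    return default_params  # outside every waypoint area
```

A game loop would call this each frame with the character's position and apply the returned parameters to the virtual camera.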
In the embodiment of the invention, a virtual scene to be edited is displayed through a graphical user interface; in response to a waypoint editing instruction, a waypoint area identifier is set in the virtual scene to be edited; and in response to a lens editing instruction, initial lens parameters of the virtual camera corresponding to the waypoint area identifier are determined. The waypoint area identifier is configured to trigger, when the virtual character is in the area corresponding to the identifier, a change of the character's current lens parameters to the target lens parameters corresponding to the initial lens parameters, so that the view angle of the virtual scene changes with the position of the virtual character. This keeps the lens direction controllable, lets art developers make scene models and textures only for specific directions, saves rendering performance, avoids the drawback of a monotonous effect, and achieves optimal artistic presentation.
Referring to FIG. 2, a flowchart of the steps of another game lens editing method according to an embodiment of the present invention is shown, in which a graphical user interface is provided through a first terminal device; the first terminal device may be the aforementioned local terminal device or a client device. The method specifically comprises the following steps:
step 201, displaying a virtual scene to be edited through the graphical user interface;
step 201 is similar to step 101, and the detailed description may refer to step 101, which is not described herein again.
Step 202, responding to a waypoint editing instruction, and setting waypoint area identifications in the virtual scene to be edited;
in a preferred embodiment of the present invention, the step 202 further comprises the following sub-steps:
determining the virtual scene to be edited in a preset waypoint area identification setting page; the waypoint area identification setting page comprises a waypoint area identification adding control and a saving control;
and responding to a waypoint editing instruction of the waypoint area identification adding control, and adding waypoint area identifications in the virtual scene to be edited.
As an example, FIG. 3 shows a waypoint area identifier setting page. A developer may determine the virtual scene to be edited through this setting page, preset in the editor: the virtual scene in which waypoint area identifiers are to be set is selected in the virtual scene selection area 301 at the top of the page. After selecting the virtual scene to be edited, the developer may input a waypoint editing instruction by triggering the add-waypoint-area-identifier control on the setting page; as shown in FIG. 3, an add-waypoint-area-identifier control 3021 is provided in the function control area 302 at the bottom of the page. As an example, the triggering mode may be a click operation: the developer adds waypoint area identifiers to the virtual scene to be edited by clicking control 3021. During adding, a virtual character model is generated in the virtual scene to be edited to represent the waypoint area identifier.
Waypoint area identifiers can be divided into two types. One is the gradual change waypoint area identifier: as shown in FIG. 4, it corresponds to a gradual change waypoint area consisting of two parts, where the first area is the inner circular area centered on the identifier and the second area is the ring area outside the first area. The other is the fixed waypoint area identifier, shown in FIG. 5, which corresponds to a circular area centered on the identifier. When the virtual character is in the gradual change waypoint area, a first target lens parameter can be determined according to the first initial lens parameter of the virtual camera set for that area; when the virtual character is in the fixed waypoint area, a second target lens parameter is determined according to the second initial lens parameter of the virtual camera set for that area.
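Both region shapes reduce to plain circle geometry, so membership can be tested as in the following sketch (function and label names are illustrative assumptions):

```python
import math

def classify_in_fade_region(char_pos, center, r_inner, r_outer):
    """Locate a character relative to a gradual change waypoint area:
    'first' = inner circle, 'second' = surrounding ring, None = outside."""
    d = math.dist(char_pos, center)
    if d <= r_inner:
        return "first"
    if d <= r_outer:
        return "second"
    return None

def in_fixed_region(char_pos, center, radius):
    """A fixed waypoint area is a single circle centered on its identifier."""
    return math.dist(char_pos, center) <= radius
```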
Step 203, responding to the lens editing instruction, and determining initial lens parameters of the virtual camera corresponding to the waypoint area identification;
in a preferred embodiment of the present invention, the step 203 further comprises the following sub-steps:
responding to the lens editing instruction, and setting initial lens parameters for the virtual camera corresponding to the waypoint area identification;
and responding to a trigger instruction of the saving control, and saving the waypoint area identification and the initial lens parameter.
After the waypoint area identifiers are added, developers can set initial lens parameters for the virtual camera corresponding to each added identifier in the virtual scene to be edited, according to the view angle desired when the virtual character is at different positions in the scene. The initial lens parameters may include the position and orientation of the lens; as an example, the position may be coordinates (x, y, z) and the orientation may be the lens's horizontal rotation angle and vertical elevation angle. Meanwhile, the range over which the set initial lens parameters act differs by waypoint area identifier type: the second initial lens parameters corresponding to a fixed waypoint area identifier apply over the entire fixed waypoint area, while for a gradual change waypoint area identifier the corresponding first initial lens parameters apply over the first area of the gradual change waypoint area.
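One possible data layout for these parameters and their saved form, sketched in Python. The field names and the JSON serialization are assumptions for illustration; the patent does not specify a storage format:

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class LensParams:
    x: float = 0.0      # lens position coordinates
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0    # horizontal rotation angle
    pitch: float = 0.0  # vertical elevation angle

@dataclass
class WaypointMarker:
    marker_id: str
    kind: str           # "fade" or "fixed" (illustrative labels)
    center: tuple       # position of the identifier in the scene
    radius: float       # range of the corresponding waypoint area
    params: LensParams = field(default_factory=LensParams)

def save_markers(markers, path):
    """Save added markers and their initial lens parameters, as the save
    control might do (JSON is chosen here purely for illustration)."""
    with open(path, "w") as f:
        json.dump([asdict(m) for m in markers], f, indent=2)
```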
After the initial lens parameters are set in the virtual scene to be edited, the attribute information of the waypoint area identifiers can be displayed on the setting page. As shown in FIG. 6, the page shows various attribute information of a waypoint area identifier: a default shot offset setting area 401 for setting default shot parameters, which are used as the virtual character's current shot parameters (and hence determine the scene view angle) when the character is not within any area corresponding to a waypoint area identifier; a waypoint area identifier list area 402 displaying all identifiers added so far, from which the developer can select an identifier by clicking; and a waypoint area identifier attribute display area 403 showing detailed data of the currently selected identifier, such as its coordinates and the name of the virtual scene to be edited.
The waypoint area identifier setting page further includes a save control; the developer can save the added waypoint area identifiers and the set initial lens parameters by triggering it. As shown in FIG. 3, the save control 3022 is located in the function control area 302. As an example, the triggering mode may be a click operation: after finishing adding the waypoint area identifiers and setting the initial lens parameters, the developer clicks the save control to save them.
In addition, the waypoint area identifier setting page may further include: a mode switching control for switching between an editing mode, used for operations such as adding waypoint area identifiers and setting initial lens parameters, and a testing mode, used for testing the added identifiers and set parameters and checking the effect in actual operation; a delete control for deleting added waypoint area identifiers; a list refresh control for refreshing the waypoint area identifier list to display its latest state; a hide/show control for hiding the waypoint area identifiers in the virtual scene to be edited; and a default lens parameter lock control for locking the default lens parameters in the scene against accidental modification. A developer may also copy, multi-select, and cut waypoint area identifiers and initial shot parameters in the setting page and the virtual scene to be edited through preset operation modes; Table 1 gives an example of performing these operations with a keyboard and mouse in the embodiment of the present invention.
(The contents of Table 1, the keyboard-and-mouse operation mappings, appear only as an image in the original publication and are not reproduced here.)
TABLE 1
Step 204, responding to a selected operation acting on a virtual scene to be edited, and determining a virtual camera to be edited, wherein the virtual camera to be edited is a virtual camera corresponding to a waypoint area identifier corresponding to the selected operation;
after the added waypoint area identifiers and the initial lens parameters of the corresponding virtual cameras are saved on the waypoint area identifier setting page, developers can enter the virtual scene to be edited and select the virtual camera corresponding to an added waypoint area identifier.
Step 205, responding to a camera editing instruction, and adjusting initial lens parameters corresponding to the virtual camera to be edited;
specifically, the developer may adjust the initial lens parameters corresponding to the virtual camera by operating on the added waypoint area identifier in the virtual scene to be edited; Table 2 shows an operation mode for adjusting the initial lens parameters. These operation modes are also suitable for adjusting the default lens parameters; those skilled in the art may also adjust the initial lens parameters of the virtual camera in other ways according to their own needs, which the present invention does not limit.
(The contents of Table 2, the operation modes for adjusting the initial lens parameters, appear only as an image in the original publication and are not reproduced here.)
TABLE 2
In a preferred embodiment of the present invention, when the virtual camera to be edited corresponds to the gradual change waypoint region identifier, the step 205 includes the following sub-steps:
responding to a first selected operation on the first area in the virtual scene to be edited, and adjusting the circular area range corresponding to the first area;
and responding to a second selected operation on the second area in the virtual scene to be edited, and adjusting the range of the second area.
When the region range corresponding to a gradual change waypoint region identifier is to be adjusted, a developer can select the first region in the gradual change waypoint region corresponding to that identifier through a triggering operation, such as clicking, in the virtual scene to be edited, and then adjust the radius of the first region by holding the Ctrl key and scrolling the mouse wheel. Similarly, the developer can select the second region in the gradual change waypoint region and then adjust the outer-circle radius of the gradual change waypoint region by holding the Ctrl key and scrolling the mouse wheel, thereby adjusting the range of the second region.
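As a minimal sketch, such a wheel-driven radius adjustment might be handled as follows. The step size, minimum radius, and function name are illustrative assumptions, not part of the patent's disclosure:

```python
def adjust_radius(radius: float, wheel_delta: int,
                  step: float = 0.5, min_radius: float = 0.1) -> float:
    """Grow or shrink a selected circle's radius by mouse-wheel input.

    Each wheel notch (wheel_delta of +1 or -1) changes the radius by
    `step`; the result is clamped so the circle never collapses.
    `step` and `min_radius` are hypothetical editor settings.
    """
    return max(min_radius, radius + wheel_delta * step)
```

In an editor, this function would be called only while the Ctrl key is held and a region circle is selected, as described above.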
In addition, in order to make changes of the viewing angle of the virtual scene smoother and to reduce the complexity of determining the target lens parameters, in the embodiment of the present invention the fixed waypoint region identifier is given the highest priority. That is, when the virtual character is located in the intersection of a fixed waypoint region and a gradual change waypoint region, as shown in fig. 7 (where the left side is the gradual change waypoint region, the right side is the fixed waypoint region, and an intersection region lies between them), the initial lens parameters corresponding to the higher-priority fixed waypoint region identifier are determined as the target lens parameters. Furthermore, gradual change waypoint regions are not allowed to intersect one another, while fixed waypoint regions may intersect provided that the fixed waypoint region identifiers sharing an intersection have the same initial lens parameters; fig. 8 shows two intersecting fixed waypoint region identifiers. It should be noted that the priorities may be set according to the specific requirements of the developer, for example by giving the gradual change waypoint region identifier the highest priority, which is not limited by the present invention.
In a preferred embodiment of the present invention, the waypoint area identifier is configured to determine a position of the virtual character when the virtual character is located in an area corresponding to the waypoint area identifier, and trigger to change the current lens parameter corresponding to the virtual character into a second target lens parameter equal to the second initial lens parameter when the position of the virtual character is located in the fixed waypoint area.
Specifically, the coordinates of the virtual character's position in the virtual scene are determined, all fixed waypoint regions in the virtual scene are traversed, and whether the coordinates fall within a fixed waypoint region is judged; if so, the initial lens parameters of the corresponding fixed waypoint region identifier are determined as the target lens parameters.
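The traversal described above can be sketched as follows. The region and parameter structures are illustrative assumptions; the patent does not specify an implementation:

```python
import math
from dataclasses import dataclass

@dataclass
class FixedWaypointRegion:
    # Hypothetical structure: a fixed waypoint region is a circle
    # centered on its identifier, carrying one set of initial lens
    # parameters (here an opaque dict).
    center: tuple
    radius: float
    initial_params: dict

def find_fixed_region_params(position, regions, default_params):
    """Traverse all fixed waypoint regions; if the character's
    coordinates fall inside one, return that region's initial lens
    parameters, otherwise fall back to the default lens parameters."""
    x, y = position
    for region in regions:
        cx, cy = region.center
        if math.hypot(x - cx, y - cy) <= region.radius:
            return region.initial_params
    return default_params
```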
In a preferred embodiment of the present invention, the virtual scene to be edited further has default shot parameters, where the default shot parameters are used to determine the default shot parameters as target shot parameters when the virtual character is not located in the waypoint region, and the waypoint region identifier is configured to trigger to change the current shot parameters corresponding to the virtual character into the first target shot parameters calculated by using the first initial shot parameters and the default shot parameters if the position of the virtual character is located in the second region in the gradual change waypoint region.
When the coordinates of the virtual character are not within any fixed waypoint region, all gradual change waypoint region identifiers in the virtual scene are traversed, and whether the coordinates lie in the circular ring region of the corresponding gradual change waypoint region is judged; if so, the target lens parameters are calculated by an interpolation algorithm from the initial lens parameters corresponding to the inner circle of the gradual change waypoint region and the default lens parameters. Specifically, assuming the initial lens parameter corresponding to the inner circle is Pb, the default lens parameter is Pa, the distance from the virtual character to the center of the inner circle is h, the radius of the inner circle is R1, the outer radius of the gradual change waypoint region is R2, and the target lens parameter is P, the interpolation is calculated as follows:
P = Pb + (Pa − Pb) × (h − R1) / (R2 − R1), where R1 ≤ h ≤ R2
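The interpolation above can be written as a short function. Variable names follow the text; the clamping behavior at the region boundaries is an assumption consistent with the surrounding description (inner circle uses the initial parameter, outside the region uses the default):

```python
def interpolate_lens_param(pb, pa, h, r1, r2):
    """Linearly interpolate a lens parameter across the ring region
    of a gradual change waypoint region.

    pb: initial lens parameter of the inner circle
    pa: default lens parameter of the scene
    h:  distance from the virtual character to the inner-circle center
    r1: inner-circle radius; r2: outer radius of the region
    """
    if h <= r1:        # inside the inner circle: initial parameter
        return pb
    if h >= r2:        # outside the region: default parameter
        return pa
    t = (h - r1) / (r2 - r1)  # 0 at the inner edge, 1 at the outer edge
    return pb + (pa - pb) * t
```

At h = R1 this yields Pb and at h = R2 it yields Pa, so the lens parameter varies continuously as the character crosses the ring.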
in a preferred embodiment of the present invention, the waypoint area identifier is configured to trigger to change the current lens parameter corresponding to the virtual character into a first target lens parameter equal to the first initial lens parameter if the position of the virtual character is in the first area in the gradual change waypoint area.
When the coordinates are not in the second region of the gradual change waypoint region, whether they are in the inner circle region of the gradual change waypoint region is judged; if so, the initial lens parameters corresponding to the inner circle region are determined as the target lens parameters. If not, it is concluded that the coordinates are not in any waypoint region, and the default lens parameters are determined as the target lens parameters.
Further, when the developer sets the gradual change waypoint region to the highest priority, the target lens parameters may be determined as follows:
determining the position of the virtual character;
traversing all the identifiers of the gradual change waypoint areas in the virtual scene, and judging whether the position is in a circular ring area in the gradual change waypoint area;
if so, calculating target lens parameters by adopting initial lens parameters and default lens parameters corresponding to the inner circle region in the gradual change waypoint region based on an interpolation algorithm;
if not, judging whether the position is in the inner circle area of the gradual change waypoint area or not;
if so, determining the initial lens parameters corresponding to the inner circle area in the gradual change waypoint area as target lens parameters;
if not, judging whether the position is in a fixed waypoint area or not;
if so, determining the initial lens parameters corresponding to the fixed waypoint area as target lens parameters;
if not, determining the default lens parameters as the target lens parameters.
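The decision flow above can be sketched as follows, with gradual change regions given the highest priority. The data structures and the inline interpolation are assumptions for illustration; a single numeric lens parameter stands in for the full parameter set:

```python
import math

def target_lens_params(pos, gradual_regions, fixed_regions, default_params):
    """Determine the target lens parameter following the steps above.

    Each gradual region is a dict with 'center', inner radius 'r1',
    outer radius 'r2', and inner-circle parameter 'params'; each fixed
    region has 'center', 'radius', and 'params' (all hypothetical keys).
    """
    x, y = pos
    # Steps 1-5: traverse gradual change regions (highest priority):
    # ring region first, then inner circle.
    for g in gradual_regions:
        h = math.hypot(x - g["center"][0], y - g["center"][1])
        if g["r1"] < h <= g["r2"]:        # in the ring: interpolate
            t = (h - g["r1"]) / (g["r2"] - g["r1"])
            return g["params"] + (default_params - g["params"]) * t
        if h <= g["r1"]:                  # in the inner circle
            return g["params"]
    # Steps 6-7: then check the fixed waypoint regions.
    for f in fixed_regions:
        if math.hypot(x - f["center"][0], y - f["center"][1]) <= f["radius"]:
            return f["params"]
    # Step 8: otherwise fall back to the default lens parameter.
    return default_params
```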
By applying the embodiment of the present invention, waypoint region identifiers and initial lens parameters are set through a preset waypoint region identifier setting page, the relationship between the position of the virtual character in the virtual scene and the waypoint regions is judged, the target lens parameters are determined according to the judgment result, and the viewing angle of the virtual scene is determined based on the target lens parameters. The viewing angle of the virtual scene can thus change dynamically with the position of the virtual character, presenting the virtual scene better while reducing rendering resource consumption.
In order to enable those skilled in the art to better understand the effect achieved by the solution of the present invention, the present application is exemplified below by an example that sets the fixed waypoint area identification as the highest priority, but it should be understood that the present application is not limited thereto.
As shown in fig. 9, for a virtual scene in which the waypoint region identifiers have already been set, when the virtual character moves from outside the gradual change waypoint region identifier 501 toward its center, the viewing angle of the virtual scene changes as the target lens parameters change. When the virtual character is located above the gradual change waypoint region identifier 501 and does not belong to any waypoint region identifier, the target lens parameters are the default lens parameters. When the virtual character moves into the circular ring region of identifier 501, touching neither the inner circle region nor the fixed waypoint region identifiers 502 and 503, the target lens parameters are calculated by the interpolation algorithm from the initial lens parameters corresponding to the inner circle region of identifier 501, the default lens parameters, and the distance from the center of identifier 501. When the virtual character enters the inner circle region from the ring region of identifier 501, the target lens parameters become the initial lens parameters corresponding to the inner circle region. The virtual character then continues moving to the left; when it enters the position where the fixed waypoint region identifier 502 intersects the circular ring region of identifier 501, the target lens parameters become the initial lens parameters corresponding to the fixed waypoint region identifier 502, because the fixed waypoint region has the highest priority.
Then, as the virtual character keeps moving forward, the target lens parameters remain unchanged until it leaves the fixed waypoint region identifier 502, at which point it is located in the gradual change waypoint region identifier 504. The inner-circle radius of identifier 504 equals the radius of the waypoint region, so there is no circular ring region, and the target lens parameters are the initial lens parameters corresponding to the inner circle region of identifier 504. During the virtual character's movement, the target lens parameters change continuously according to the waypoint region it occupies, so that the viewing angle of the virtual scene changes continuously with the character's movement, presenting an excellent visual effect.
An embodiment of the present invention further provides an electronic device, which may include a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the above game lens editing method.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the above game lens editing method.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The above detailed description is given to the editing method of the game lens, the electronic device and the storage medium, and the principle and the implementation of the present invention are explained by applying a specific example, and the description of the above embodiment is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (13)

1. A method for editing a game lens, wherein a graphical user interface is provided through a first terminal device, the method comprising:
displaying a virtual scene to be edited through the graphical user interface;
responding to a waypoint editing instruction, and setting waypoint area identifications in the virtual scene to be edited;
responding to the lens editing instruction, and determining initial lens parameters of the virtual camera corresponding to the waypoint area identification; and the waypoint area identifier is configured to trigger the current lens parameters corresponding to the virtual character to be changed into the target lens parameters corresponding to the initial lens parameters when the virtual character is in the area corresponding to the waypoint area identifier.
2. The method according to claim 1, wherein the step of setting a waypoint region identifier in the virtual scene to be edited in response to the waypoint editing instruction comprises:
determining the virtual scene to be edited in a preset waypoint area setting page; the waypoint region setting page comprises a waypoint region identification adding control and a storing control;
and responding to a waypoint editing instruction of the waypoint area identification adding control, and adding waypoint area identifications in the virtual scene to be edited.
3. The method of claim 2, wherein the step of determining initial lens parameters of the virtual camera corresponding to the waypoint region identifications in response to the lens editing instructions comprises:
responding to the lens editing instruction, and setting initial lens parameters for the virtual camera corresponding to the waypoint area identification;
and responding to a trigger instruction of the saving control, and saving the waypoint area identification and the initial lens parameter.
4. The method according to claim 3, wherein after the step of saving the waypoint region identifiers and the initial lens parameters in response to the triggering instruction for the saving control, the method further comprises:
responding to a selected operation acting on a virtual scene to be edited, and determining a virtual camera to be edited, wherein the virtual camera to be edited is a virtual camera corresponding to a waypoint area identifier corresponding to the selected operation;
and responding to a camera editing instruction, and adjusting the initial lens parameters corresponding to the virtual camera to be edited.
5. The method according to claim 1, 2, 3, or 4, wherein the waypoint area identifications comprise at least one of: a gradual change waypoint area identification and a fixed waypoint area identification, wherein the waypoint area corresponding to the gradual change waypoint area identification is a gradual change waypoint area, and the waypoint area corresponding to the fixed waypoint area identification is a fixed waypoint area; the gradual change waypoint area is configured to determine a first target shot parameter corresponding to a first initial shot parameter according to the position of the virtual character when the virtual character is in the gradual change waypoint area; and the fixed waypoint area is configured to determine a corresponding second target shot parameter according to a second initial shot parameter when the virtual character is in the fixed waypoint area.
6. The method of claim 5, wherein the fade waypoint region comprises a first region and a second region; the first area is a circular area with the gradual change waypoint area identification as the center of a circle; the second area is the rest area except the first area in the gradual change waypoint area; the fixed waypoint region is a circular region with the fixed waypoint region identification as the center of a circle.
7. The method according to claim 6, wherein when the virtual camera to be edited corresponds to the gradual waypoint region identification, the adjusting the initial lens parameters corresponding to the virtual camera to be edited in response to the camera editing instruction comprises:
responding to a first selected operation on the first area in the virtual scene to be edited, and adjusting the circular area range corresponding to the first area;
and responding to a second selected operation on the second area in the virtual scene to be edited, and adjusting the range of the second area.
8. The method according to claim 1 or 2, wherein the initial shot parameters comprise at least one of: the position and orientation of the lens.
9. The method according to claim 6, wherein the waypoint area identifier is configured to determine the position of the virtual character when the virtual character is located in the area corresponding to the waypoint area identifier, and to trigger the current lens parameter corresponding to the virtual character to be changed to a second target lens parameter equal to the second initial lens parameter when the position of the virtual character is located in the fixed waypoint area.
10. The method according to claim 9, wherein a default shot parameter is further set in the virtual scene to be edited, the default shot parameter is used to determine the default shot parameter as a target shot parameter when the virtual character is not located in the waypoint region, and the waypoint region identifier is configured to trigger changing of a current shot parameter corresponding to the virtual character into the first target shot parameter calculated by using the first initial shot parameter and the default shot parameter if the position of the virtual character is located in the second region in the gradual waypoint region.
11. The method according to claim 10, wherein the waypoint region identifier is configured to trigger a change of a current shot parameter corresponding to the virtual character to a first target shot parameter equal to the first initial shot parameter if the position of the virtual character is within the first region in the gradual change waypoint region.
12. An electronic device comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, implementing the steps of the method of editing a game shot as claimed in any one of claims 1 to 11.
13. A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the method of editing game footage as claimed in any one of claims 1 to 11.
CN202010317742.2A 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium Active CN111494948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010317742.2A CN111494948B (en) 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010317742.2A CN111494948B (en) 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111494948A true CN111494948A (en) 2020-08-07
CN111494948B CN111494948B (en) 2023-11-17

Family

ID=71876273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010317742.2A Active CN111494948B (en) 2020-04-21 2020-04-21 Editing method of game lens, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111494948B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112738393A (en) * 2020-12-25 2021-04-30 珠海西山居移动游戏科技有限公司 Focusing method and device
CN115866224A (en) * 2022-11-25 2023-03-28 中国联合网络通信集团有限公司 Scene switching method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110694271A (en) * 2019-10-21 2020-01-17 网易(杭州)网络有限公司 Camera attitude control method and device in game scene and electronic equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110694271A (en) * 2019-10-21 2020-01-17 网易(杭州)网络有限公司 Camera attitude control method and device in game scene and electronic equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112738393A (en) * 2020-12-25 2021-04-30 珠海西山居移动游戏科技有限公司 Focusing method and device
CN112738393B (en) * 2020-12-25 2022-08-09 珠海西山居移动游戏科技有限公司 Focusing method and device
CN115866224A (en) * 2022-11-25 2023-03-28 中国联合网络通信集团有限公司 Scene switching method and device

Also Published As

Publication number Publication date
CN111494948B (en) 2023-11-17

Similar Documents

Publication Publication Date Title
US11684858B2 (en) Supplemental casting control with direction and magnitude
CN111298431B (en) Construction method and device in game
JP7447299B2 (en) Adaptive display method and device for virtual scenes, electronic equipment, and computer program
CN112755516B (en) Interactive control method and device, electronic equipment and storage medium
CN112891943B (en) Lens processing method and device and readable storage medium
CN111494948B (en) Editing method of game lens, electronic equipment and storage medium
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN113069759A (en) Scene processing method and device in game and electronic equipment
CN112090073A (en) Game display method and device
JP2022532909A (en) Change anime character
CN105630160A (en) Virtual reality using interface system
CN105320410A (en) Method and device for touch control on touch terminal
US20210304632A1 (en) Dynamic scenario creation in virtual reality simulation systems
CN111766989B (en) Interface switching method and device
Jing Design and implementation of 3D virtual digital campus-based on unity3d
CN111330287B (en) Bullet screen display method and device in game, electronic equipment and storage medium
US9558578B1 (en) Animation environment
CN116501209A (en) Editing view angle adjusting method and device, electronic equipment and readable storage medium
CN116109737A (en) Animation generation method, animation generation device, computer equipment and computer readable storage medium
CN112169313A (en) Game interface setting method and device, electronic equipment and storage medium
Thorn Unity 2018 By Example: Learn about game and virtual reality development by creating five engaging projects
Thorn Unity 5. x by Example
WO2024146246A1 (en) Interaction processing method and apparatus for virtual scene, electronic device and computer storage medium
WO2024060924A1 (en) Interaction processing method and apparatus for virtual scene, and electronic device and storage medium
CN118203835A (en) Virtual model reloading method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant