WO2023005234A1 - Virtual resource placement control method, apparatus, computer device and storage medium

Virtual resource placement control method, apparatus, computer device and storage medium

Info

Publication number
WO2023005234A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
virtual scene
target
resource
resource indicator
Prior art date
Application number
PCT/CN2022/082121
Other languages
English (en)
French (fr)
Inventor
王鑫
刘双
Original Assignee
网易(杭州)网络有限公司
Priority date
Filing date
Publication date
Application filed by 网易(杭州)网络有限公司
Priority to JP2023552232A (JP2024507595A)
Publication of WO2023005234A1

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 - Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 - Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals involving aspects of the displayed game scene
    • A63F13/525 - Changing parameters of virtual cameras
    • A63F13/5258 - Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/53 - Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 - Additional visual information using indicators, e.g. showing the condition of a game character on screen
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 - Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 - Input arrangements specially adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F2300/30 - Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 - Details of the user interface
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/66 - Methods for processing data for rendering three dimensional images
    • A63F2300/6661 - Methods for rendering three dimensional images for changing the position of the virtual camera

Definitions

  • The present application relates to the technical field of games, and in particular to a virtual resource placement control method, device, computer device, and storage medium.
  • the embodiments of the present application provide a method, device, computer equipment, and storage medium for controlling placement of virtual resources, which can enable players in a game to accurately place target virtual resources at a certain position in a virtual scene.
  • An embodiment of the present application provides a method for controlling the placement of virtual resources, including:
  • displaying, through a graphical user interface, a first virtual scene and a virtual object located in the first virtual scene, wherein the virtual object is configured to perform a game behavior in response to a touch operation on the graphical user interface;
  • in response to a use trigger operation on a target virtual resource, displaying, through the graphical user interface, a second virtual scene and a resource indicator located in the second virtual scene, wherein the resource indicator is used to visually indicate the placement position of the target virtual resource;
  • in response to a moving operation on the resource indicator, controlling the resource indicator to move in the second virtual scene;
  • in response to a position confirmation instruction for the resource indicator, determining a reference placement position of the resource indicator in the second virtual scene; and
  • determining a target placement position of the target virtual resource in the first virtual scene according to the reference placement position, and placing the target virtual resource at the target placement position of the first virtual scene.
  • An embodiment of the present application further provides a virtual resource placement control device, including:
  • the first display unit is configured to display a first virtual scene and a virtual object located in the first virtual scene through a graphical user interface, and the virtual object is configured to perform a game behavior in response to a touch operation on the graphical user interface;
  • a second display unit, configured to display, through the graphical user interface, a second virtual scene and a resource indicator located in the second virtual scene in response to a use trigger operation on the target virtual resource, wherein the resource indicator is used to visually indicate the placement position of the target virtual resource;
  • a moving unit configured to control the resource indicator to move in the second virtual scene in response to a moving operation on the resource indicator
  • a determining unit configured to determine a reference placement position of the resource indicator in the second virtual scene in response to a position confirmation instruction for the resource indicator
  • a placing unit configured to determine a target placement position of the target virtual resource in the first virtual scene according to the reference placement position, and place the target virtual resource at the target placement position of the first virtual scene.
  • the second virtual scene has a scene layout corresponding to the first virtual scene.
  • the second virtual scene includes a second scene element
  • the second scene element is used to represent at least part of the first scene element in the first virtual scene
  • the position of the second scene element in the second virtual scene is used to characterize the position of the first scene element in the first virtual scene.
  • the device also includes:
  • a display range of the second virtual scene in the graphical user interface is determined according to the position of the resource indicator in the second virtual scene.
  • the second display unit is also used for:
  • An initial display range of the second virtual scene in the graphical user interface is determined according to the position of the virtual object in the first virtual scene when the use trigger operation occurs.
  • the second display unit is also used for:
  • the initial position of the resource indicator in the second virtual scene is determined according to the position and/or orientation of the virtual object in the first virtual scene when the use trigger operation occurs.
  • the second display unit is also used for:
  • the second virtual scene is the first virtual scene after hiding preset virtual objects, and the preset virtual objects include one or more of player virtual characters, non-player virtual characters, and virtual prop objects.
  • the second display unit is also used for:
  • the second virtual scene is displayed through the second display area.
  • the graphical user interface displaying the second virtual scene includes a movement control for controlling the movement of the resource indicator in the second virtual scene, the movement control including a horizontal movement control and a vertical movement control, and the moving unit is further configured to:
  • in response to a touch operation on the horizontal movement control, control the resource indicator to move in the horizontal direction of the second virtual scene; and in response to a touch operation on the vertical movement control, control the resource indicator to move in the vertical direction of the second virtual scene.
  • the moving operation includes a dragging operation
  • the moving unit is also used for:
  • in response to a drag operation on the resource indicator in the second virtual scene, the resource indicator is controlled to move in the second virtual scene.
  • the mobile unit is also used for:
  • a transition segment is displayed, where the transition segment includes the second virtual scene as it changes while the resource indicator moves.
  • the resource indicator includes a reference drop point and a simulated use shape
  • the simulated use shape is used to simulate the rendered shape of the target virtual resource after it is used at the reference drop point, at different positions of the first virtual scene.
  • the determination unit is also used to:
  • the reference placement position of the resource indicator in the second virtual scene is determined according to the position of the reference placement point in the second virtual scene and the reference rendering range of the simulated use shape.
  • the determining unit is also used for:
  • a rendering range of the simulated use shape of the resource indicator is determined according to the quantity of the target virtual resource to be delivered and the spatial layout of the second virtual scene.
  • the target virtual resource includes a target delivery point and a target usage shape
  • the target usage shape includes a rendering shape after using the target virtual resource at the target delivery point
  • the delivery unit is further used for:
  • determining the position in the first virtual scene corresponding to the reference placement point, and using it as the target placement point;
  • determining the rendering range in the first virtual scene corresponding to the simulated use shape, and using it as the target rendering range of the target usage shape;
  • a target placement position of the target virtual resource is determined in the first virtual scene according to the target placement point and the target rendering range.
  • the device is also used for:
  • after the target virtual resource is placed, the first virtual scene is displayed through the graphical user interface.
  • the graphical user interface for displaying the first virtual scene includes an attack control
  • the attack control is used to instruct the virtual object to launch an attack in the first virtual scene
  • the determining unit is further configured to:
  • convert the attack control into a position determination control of the resource indicator, and, in response to a touch operation on the position determination control, generate a position confirmation instruction for the resource indicator.
  • The embodiments of the present application provide a virtual resource placement control method, device, computer device, and storage medium. When the player wants to place the target virtual resource at a target placement position in the first virtual scene, the player can perform the use trigger operation on the target virtual resource and then move the resource indicator in the displayed second virtual scene until it reaches the reference placement position corresponding to the target placement position. The terminal can then determine the target placement position in the first virtual scene according to the reference placement position and directly place the target virtual resource at the target placement position, so that the player can accurately place the target virtual resource at the target position in the game scene, reducing the skill requirements for players to place virtual resources.
  • FIG. 1 is a schematic system diagram of the virtual resource placement control device provided by an embodiment of the present application;
  • FIG. 2 is a schematic flowchart of a method for controlling delivery of virtual resources provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of a graphical user interface displaying a first virtual scene provided by an embodiment of the present application
  • Fig. 4 is a schematic diagram of a graphical user interface displaying the second virtual scene displayed in response to a trigger operation provided by an embodiment of the present application;
  • Fig. 5 is a schematic diagram of the reference placement position provided in the embodiment of the present application in the second virtual scene
  • Fig. 6 is a schematic diagram of placing target virtual resources in the first virtual scene provided by the embodiment of the present application.
  • Fig. 7 is another schematic flowchart of the method for controlling delivery of virtual resources provided by the embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a device for controlling placement of virtual resources provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • Embodiments of the present application provide a virtual resource placement control method, device, computer device, and storage medium.
  • the method for controlling delivery of virtual resources in the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal or a server.
  • The terminal can be a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), a personal digital assistant (PDA), or another terminal device. The terminal may also include a client, which may be a game application client, a browser client carrying a game program, or an instant messaging client.
  • The server can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery network (CDN) services, and big data and artificial intelligence platforms.
  • the terminal device stores a game application program and is used to present a virtual scene in a graphical user interface.
  • a game application is downloaded, installed and run through a terminal device, and a virtual scene is displayed on a graphical user interface.
  • the terminal device may provide the virtual scene to the user in various manners, for example, rendering and displaying the virtual scene on the display screen of the terminal device, or presenting the virtual scene through holographic projection.
  • the terminal device may include a touch screen and a processor.
  • the touch screen is used to present a virtual scene and receive operation instructions generated by the user acting on the graphical user interface.
  • The processor is used to run the game, generate game screens, respond to operation instructions, control the graphical user interface, and display the virtual scene on the touch screen.
  • Cloud gaming refers to a gaming method based on cloud computing.
  • In a cloud gaming scenario, the entity that runs the game application program is separated from the entity that presents the game screen; the storage and execution of the virtual resource placement control method are completed on the cloud game server.
  • the presentation of the game screen is completed on the client side of the cloud game.
  • the cloud game client is mainly used for receiving and sending game data and presenting the game screen.
  • the cloud game client can be a display with data transmission function near the user side.
  • the terminal device that processes game data is a cloud game server in the cloud.
  • the user operates the cloud game client to send operation instructions to the cloud game server, and the cloud game server runs the game according to the operation instructions, encodes and compresses data such as game screens, and returns them to the cloud game client through the network.
  • the client decodes and outputs the game screen.
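  • As a rough illustration of this division of labour, the following Python sketch (all class and function names are hypothetical, not part of the disclosure) shows a client that only forwards operation instructions and decodes returned frames, while the server runs the game logic and encodes the game screen:

```python
from dataclasses import dataclass


@dataclass
class Frame:
    data: bytes  # encoded (e.g. compressed) game screen


class CloudGameServer:
    """Runs the game logic and encodes game screens (illustrative only)."""

    def __init__(self):
        self.state = {"tick": 0}

    def handle_operation(self, op: str) -> Frame:
        # Apply the player's operation instruction to the game state.
        self.state["tick"] += 1
        self.state["last_op"] = op
        # Render and "compress" the game screen; here we simply encode the state.
        return Frame(data=repr(self.state).encode())


class CloudGameClient:
    """Receives player input, forwards it, and presents decoded frames."""

    def __init__(self, server: CloudGameServer):
        self.server = server

    def on_player_input(self, op: str) -> str:
        frame = self.server.handle_operation(op)  # send operation, receive encoded frame
        return frame.data.decode()                # decode and present the game screen


client = CloudGameClient(CloudGameServer())
print(client.on_player_input("use_target_virtual_resource"))
```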
  • FIG. 1 is a system diagram of a device for controlling placement of virtual resources provided by an embodiment of the present application.
  • the system may include at least one terminal 101 and at least one game server 102 .
  • the terminal 101 held by the user can be connected to the game server 102 of different games through different networks 103.
  • the network can be a wireless network or a wired network
  • the wireless network can be a wireless local area network (WLAN), a local area network (LAN), a cellular network, 2G network, 3G network, 4G network, 5G network, etc.
  • The terminal is used to: display, through a graphical user interface, the first virtual scene and the virtual object located in the first virtual scene, where the virtual object is configured to execute game behaviors in response to touch operations on the graphical user interface; in response to the use trigger operation on the target virtual resource, display, through the graphical user interface, a second virtual scene and a resource indicator located in the second virtual scene, where the resource indicator is used to visually indicate the placement position of the target virtual resource;
  • control the resource indicator to move in the second virtual scene in response to a moving operation on the resource indicator; determine the reference placement position of the resource indicator in the second virtual scene in response to a position confirmation instruction for the resource indicator; determine the target placement position of the target virtual resource in the first virtual scene according to the reference placement position, and place the target virtual resource at the target placement position of the first virtual scene.
  • the game server is used to send the GUI to the terminal.
  • the virtual resource delivery control device may be integrated in a terminal device, and the terminal device may include smart phones, notebook computers, tablet computers, and personal computers.
  • An embodiment of the present application provides a virtual resource placement control method, which can be executed by the processor of the terminal. As shown in FIG. 2, the specific flow of the virtual resource placement control method mainly includes steps 201 to 205, described in detail as follows:
  • Step 201 displaying a first virtual scene and virtual objects located in the first virtual scene through a graphical user interface, and the virtual objects are configured to execute game behaviors in response to touch operations on the graphical user interface.
  • The game screen displayed on the display screen of the terminal is a graphical user interface displaying the first virtual scene. The first virtual scene may contain game props and/or multiple virtual objects (buildings, trees, mountains, etc.) that constitute the game world environment.
  • the placement positions of virtual objects such as buildings, mountains, and walls in the first virtual scene constitute the spatial layout of the first virtual scene.
  • The game corresponding to the game application program may be a first-person shooter game, a multiplayer online role-playing game, and so on. For example, in the schematic diagram of the graphical user interface showing the first virtual scene in FIG. 3, the graphical user interface of the first virtual scene may also include a movement control 301 for controlling the movement of the virtual object, a resource control 305 for triggering the use trigger operation of the target virtual resource, an attack control 303 for controlling the virtual object to attack, and other skill controls 304.
  • the virtual object may be a game character operated by the player through the game application program.
  • the virtual object may be a virtual character (such as a simulated character or an anime character), a virtual animal, and the like.
  • the game behavior of the virtual object in the first virtual scene includes but is not limited to: at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills .
  • Step 202 in response to the use trigger operation of the target virtual resource, display the second virtual scene and the resource indicator in the second virtual scene through the GUI, wherein the resource indicator is used to visually indicate the placement position of the target virtual resource.
  • virtual resources can be set in the game.
  • Virtual resources can include props and skills, and virtual resources can be resources that need to be thrown, such as cluster bombs, cluster missiles, smoke bombs, and so on.
  • Players can control virtual objects to drop cluster bombs at a certain point within the field of vision, so that multiple consecutive explosions occur within the range selected by the player, making it possible to quickly defeat multiple opposing players.
  • Virtual resources can be delivered directly by virtual objects or through virtual vehicles.
  • the use triggering operation of the target virtual resource is an operation required for the virtual object to use the target virtual resource in the virtual scene, and the use triggering operations of different virtual resources may be the same or different.
  • the use trigger operation may be operations such as clicking, long pressing, and/or double-clicking.
  • the graphical user interface displaying the first virtual scene may include a resource trigger control, and when the player performs a touch operation on the resource trigger control, the use trigger operation of the target virtual resource may be triggered.
  • different virtual resources may correspond to the same resource trigger control, or may correspond to different resource trigger controls.
  • When the player performs the use trigger operation, the second virtual scene is displayed. The second virtual scene has a scene layout corresponding to the first virtual scene, which can be simulated by using virtual simulation entities.
  • the layout of each virtual simulation entity in the second virtual scene is the same as the layout of the corresponding virtual entity in the first virtual scene.
  • The shape of a virtual simulation entity in the second virtual scene is the same as that of the corresponding virtual entity in the first virtual scene; however, while the surface of a virtual entity in the first virtual scene carries the color, texture, and similar details of the corresponding real-world object, the virtual simulation entity in the second virtual scene does not carry such color, texture, and similar details.
  • The relative positional relationship between the virtual simulation entities in the second virtual scene is the same as that between the corresponding virtual entities in the first virtual scene, and the size of each virtual simulation entity in the second virtual scene can be the same as that of the corresponding virtual entity in the first virtual scene, or scaled down or enlarged by the same proportion relative to the corresponding virtual entity in the first virtual scene.
  • the second virtual scene contains a second scene element
  • the second scene element is used to represent at least part of the first scene element in the first virtual scene
  • The position of the second scene element in the second virtual scene is used to represent the position of the first scene element in the first virtual scene. That is to say, second scene elements correspond one-to-one to first scene elements: the position of a second scene element in the second virtual scene is the same as the position of the corresponding first scene element in the first virtual scene, and attributes such as the shape and size of a second scene element are the same as those of the corresponding first scene element.
  • the first scene element and the second scene element may be virtual buildings, virtual walls, virtual rivers, etc. in the virtual scene.
  • The display range of the second virtual scene in the graphical user interface is bound to the resource indicator; when the resource indicator moves in the second virtual scene, the display range of the second virtual scene in the graphical user interface changes with the movement of the resource indicator. Therefore, the method further includes: determining the display range of the second virtual scene in the graphical user interface according to the position of the resource indicator in the second virtual scene.
  • Because the display range of the second virtual scene in the graphical user interface changes with the movement of the resource indicator in the second virtual scene, before the above step of "displaying the second virtual scene and the resource indicator located in the second virtual scene through the graphical user interface", the initial display range of the second virtual scene needs to be determined, specifically: determining the initial display range of the second virtual scene in the graphical user interface according to the position of the virtual object in the first virtual scene when the use trigger operation occurs.
  • "displaying the second virtual scene through the graphical user interface" in the above step may be: hiding the first virtual scene in the graphical user interface and triggering the display of the second virtual scene in the graphical user interface.
  • The second virtual scene may not be a newly added virtual scene independent of the first virtual scene, but may instead be obtained by simplifying the first virtual scene.
  • For example, the second virtual scene may be the first virtual scene after hiding preset virtual objects, where the preset virtual objects include one or more of player virtual characters, non-player virtual characters, and virtual prop objects.
  • the player virtual character may be a virtual object operated by the current player and/or a player virtual character operated by other players participating in the game
  • the non-player virtual character may be a virtual character that participates in the game but is controlled by the terminal rather than by a player;
  • the virtual prop object may be a virtual object that assists the player's virtual character in the game, for example, an attacking weapon, a riding mount, and so on.
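  • As a rough illustration of this simplification (not the disclosed implementation; the element kinds and function names below are assumptions), the second virtual scene can be derived by dropping player characters, non-player characters, and prop objects while keeping the remaining scene elements and their positions unchanged:

```python
from dataclasses import dataclass


@dataclass
class SceneElement:
    kind: str        # e.g. "building", "wall", "player", "npc", "prop"
    position: tuple  # (x, y, z) in scene coordinates

# Object kinds assumed to be hidden when deriving the second virtual scene.
HIDDEN_KINDS = {"player", "npc", "prop"}


def build_second_scene(first_scene: list[SceneElement]) -> list[SceneElement]:
    """Keep only layout-defining elements; positions stay identical."""
    return [e for e in first_scene if e.kind not in HIDDEN_KINDS]


first_scene = [
    SceneElement("building", (10.0, 0.0, 5.0)),
    SceneElement("wall", (12.0, 0.0, 7.0)),
    SceneElement("player", (3.0, 0.0, 2.0)),
    SceneElement("npc", (8.0, 0.0, 9.0)),
]
second_scene = build_second_scene(first_scene)
print([e.kind for e in second_scene])  # ['building', 'wall']
```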
  • the first virtual scene and the second virtual scene can also be displayed simultaneously in the graphical user interface.
  • the above-mentioned step "displaying the second virtual scene through the graphical user interface” can be: A second display area is determined in the graphical user interface of the scene, and the area range of the second display area is smaller than the area range corresponding to the graphical user interface; the second virtual scene is displayed through the second display area.
  • the position and size of the second display area in the graphical user interface are not limited, and can be flexibly set according to actual conditions.
  • The resource indicator is used to indicate the reference placement position of the target virtual resource in the second virtual scene. The resource indicator may have the same shape and size as the target virtual resource, or may differ from the target virtual resource in shape and size.
  • the GUI displaying the first virtual scene includes virtual objects located in the first virtual scene
  • the graphical user interface displaying the second virtual scene may not include virtual objects located in the second virtual scene, so as to prevent the positions of enemy virtual objects from being seen in the second virtual scene while the target placement position of the target virtual resource is being set through the resource indicator, thereby ensuring the fairness of the game.
  • the second virtual scene may include a virtual simulation entity 408 formed according to a virtual building 308 and a virtual simulation entity 408 generated according to an obstacle 306.
  • the graphical user interface displaying the second virtual scene also includes a horizontal movement control 401 for controlling the resource indicator 409 to move horizontally in the second virtual scene, and a vertical control 401 for controlling the resource indicator 409 to move vertically in the second virtual scene.
  • Step 203 in response to the moving operation on the resource indicator, control the resource indicator to move in the second virtual scene.
  • the player can move the resource indicator by operating the movement control.
  • the graphical user interface displaying the second virtual scene includes a movement control for controlling the movement of the resource indicator in the second virtual scene.
  • The movement control includes a horizontal movement control and a vertical movement control.
  • In response to a touch operation on the horizontal movement control, the resource indicator is controlled to move in the horizontal direction of the second virtual scene;
  • in response to a touch operation on the vertical movement control, the resource indicator is controlled to move in the vertical direction of the second virtual scene.
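  • A minimal sketch of how touch operations on the two controls could translate into indicator movement follows; the step size, axis convention, and function names are illustrative assumptions:

```python
# Hypothetical indicator position in second-scene coordinates: x/z horizontal, y vertical.
indicator_pos = [0.0, 0.0, 0.0]
STEP = 0.5  # assumed per-touch movement step


def on_horizontal_move_control(dx: float, dz: float) -> None:
    """Touch on the horizontal movement control moves the indicator in the horizontal plane."""
    indicator_pos[0] += dx * STEP
    indicator_pos[2] += dz * STEP


def on_vertical_move_control(dy: float) -> None:
    """Touch on the vertical movement control moves the indicator along the vertical axis."""
    indicator_pos[1] += dy * STEP


on_horizontal_move_control(dx=1.0, dz=0.0)
on_vertical_move_control(dy=-1.0)
print(indicator_pos)  # [0.5, -0.5, 0.0]
```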
  • The graphical user interface displaying the first virtual scene may include an object movement control for controlling the movement of the virtual object in the first virtual scene. After the graphical user interface displaying the second virtual scene is shown in response to the use trigger operation on the target virtual resource, the object movement control in the graphical user interface displaying the first virtual scene can be changed into a movement control for the resource indicator, so that after the second virtual scene is generated, only the resource indicator, and not the virtual object, can be moved.
  • the player can also move the resource indicator directly on the terminal screen by dragging the resource indicator with a finger or a mouse pointer, etc.
  • the moving operation includes a drag operation
  • the above step 203 "Controlling the resource indicator to move in the second virtual scene in response to a moving operation on the resource indicator" may be:
  • in response to a drag operation on the resource indicator in the second virtual scene, the resource indicator is controlled to move in the second virtual scene.
  • To show the change process of the second virtual scene in this case, after "controlling the resource indicator to move in the second virtual scene in response to a moving operation on the resource indicator" in the above step 203, the method may further include: displaying a transition segment, where the transition segment includes the second virtual scene as it changes while the resource indicator moves.
  • When the player triggers a moving operation on the resource indicator, the resource indicator moves in the second virtual scene. Before the player performs the moving operation on the resource indicator, the initial position of the resource indicator in the initial display range of the second virtual scene needs to be determined, specifically: determining the initial position of the resource indicator in the second virtual scene according to the position and/or orientation of the virtual object in the first virtual scene when the use trigger operation occurs.
  • When the game is running, the terminal can load the first virtual scene and the second virtual scene at the same time, and before the player performs the use trigger operation, the graphical user interface displays the first virtual scene.
  • the terminal may hide the first virtual scene in the GUI, and display the second virtual scene in the GUI.
  • When the terminal determines the initial position of the resource indicator in the initial display range of the second virtual scene, it may first obtain the spatial coordinates (x, y, z) of the virtual object in the first virtual scene, and then determine, based on these coordinates and the offset between the first virtual scene and the second virtual scene, the display range of the second virtual scene in the graphical user interface and the initial position of the resource indicator.
  • the offset between the first virtual scene and the second virtual scene can be set to 0, or can be flexibly set according to actual conditions.
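  • For illustration only, the initial placement could be derived as in the sketch below, where the forward distance and the offset between the two scenes are assumed parameters rather than values taken from the disclosure:

```python
import math


def initial_indicator_position(obj_pos, obj_yaw_deg, offset=(0.0, 0.0, 0.0), forward=5.0):
    """Place the indicator 'forward' units in front of the virtual object,
    shifted by the (possibly zero) offset between the first and second scene."""
    x, y, z = obj_pos
    yaw = math.radians(obj_yaw_deg)
    ix = x + forward * math.cos(yaw) + offset[0]
    iy = y + offset[1]
    iz = z + forward * math.sin(yaw) + offset[2]
    return (ix, iy, iz)


# Virtual object at (10, 0, 4) facing 90 degrees; offset between scenes set to 0.
print(initial_indicator_position((10.0, 0.0, 4.0), 90.0))  # approximately (10.0, 0.0, 9.0)
```

  • The initial display range of the second virtual scene can then be centered on this initial indicator position.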
  • Step 204 in response to the instruction for confirming the position of the resource indicator, determine the reference placement position of the resource indicator in the second virtual scene.
  • In order to enable the player to better place the target virtual resource in a location that is effective for the actual game situation, the resource indicator can be made to include a reference drop point and a simulated use shape, where the simulated use shape is used to simulate the rendered shape of the target virtual resource after it is used at the reference drop point, at different positions of the first virtual scene. In this way, the player can see, through the resource indicator, the rendering effect that the target virtual resource would have after use in the second virtual scene, and can further adjust the reference placement position of the resource indicator in the second virtual scene according to the impact of that rendering effect on the actual game situation.
  • In some embodiments, the above step of "determining the reference placement position of the resource indicator in the second virtual scene in response to the position confirmation instruction for the resource indicator" may be: determining the reference placement position of the resource indicator in the second virtual scene according to the position of the reference drop point in the second virtual scene and the reference rendering range of the simulated use shape.
  • As the resource indicator moves in the second virtual scene, the rendering range of the simulated use shape of the resource indicator will change.
  • For example, the smoke rendering range after a smoke bomb is used will change with the walls in the first virtual scene and with the height of the smoke bomb's drop point in the first virtual scene. Therefore, to better enable the target virtual resource to have a more beneficial effect on the actual game situation in the first virtual scene, the reference placement position of the resource indicator in the second virtual scene needs to be determined according to the reference drop point and the simulated use shape of the resource indicator.
  • When different quantities of the target virtual resource are placed at a target placement position, the target rendering range after the target virtual resources are used will also differ. Therefore, the rendering range of the simulated use shape of the resource indicator can also be determined according to the set quantity of target virtual resources.
  • In this case, before the above step of "determining the rendering range of the simulated use shape of the resource indicator according to the spatial layout of the second virtual scene", the method may further include: in response to a quantity setting operation on the target virtual resource, determining the delivery quantity of the target virtual resource.
  • Correspondingly, the above step of "determining the rendering range of the simulated use shape of the resource indicator according to the spatial layout of the second virtual scene" may include: determining the rendering range of the simulated use shape of the resource indicator according to the delivery quantity of the target virtual resource and the spatial layout of the second virtual scene.
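  • One way to picture this dependency is the sketch below: a base radius that grows with the delivery quantity is clamped by the nearest wall in a (hypothetical) spatial layout; the scaling rule is an assumption for illustration only:

```python
def simulated_render_radius(quantity: int, wall_distances, base_radius=3.0):
    """Radius of the simulated use shape around the reference drop point.

    quantity       -- delivery quantity of the target virtual resource
    wall_distances -- distances from the drop point to surrounding walls,
                      a stand-in for the spatial layout of the second scene
    """
    radius = base_radius * quantity ** 0.5         # assumed growth with quantity
    if wall_distances:
        radius = min(radius, min(wall_distances))  # clamp at the nearest obstruction
    return radius


print(simulated_render_radius(quantity=4, wall_distances=[5.5, 8.0]))
# 5.5 -> the nearest wall limits the rendered smoke range
```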
  • In order to enable the player to control the operated virtual object to attack other hostile virtual objects, an attack control can be set in the graphical user interface displaying the first virtual scene, and the attack control is used to instruct the virtual object to launch an attack in the first virtual scene.
  • In the graphical user interface displaying the second virtual scene, the attack control has no effect.
  • the attack control can be changed into a resource indicator position determination control.
  • Therefore, before the above step of "determining the reference placement position of the resource indicator in the second virtual scene in response to the position confirmation instruction for the resource indicator", the method may further include: converting the attack control into a position determination control of the resource indicator; and, in response to a touch operation on the position determination control, generating a position confirmation instruction for the resource indicator.
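  • A small sketch of this control repurposing (the state names are hypothetical): while the second virtual scene is shown, a touch on the same on-screen control emits a position confirmation instruction instead of an attack instruction:

```python
class ActionButton:
    """One on-screen control whose meaning depends on which scene is displayed."""

    def __init__(self):
        self.mode = "attack"  # meaning in the first virtual scene

    def enter_second_scene(self):
        self.mode = "confirm_position"  # repurposed as the position determination control

    def on_touch(self) -> str:
        if self.mode == "attack":
            return "attack_instruction"
        return "position_confirmation_instruction"


button = ActionButton()
button.enter_second_scene()
print(button.on_touch())  # position_confirmation_instruction
```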
  • For example, the reference placement position of the resource indicator in the second virtual scene is the location of the resource indicator 501 shown in FIG. 5.
  • Step 205 Determine the target placement position of the target virtual resource in the first virtual scene according to the reference placement position, and place the target virtual resource at the target placement position of the first virtual scene.
  • the target virtual resource includes a target delivery point and a target usage shape
  • the target usage shape includes a rendering shape after using the target virtual resource at the target delivery point.
  • determining the position in the first virtual scene corresponding to the reference placement point, and using it as the target placement point;
  • determining the rendering range in the first virtual scene corresponding to the simulated use shape, and using it as the target rendering range of the target usage shape;
  • determining the target placement position of the target virtual resource in the first virtual scene according to the target placement point and the target rendering range.
  • the spatial position correspondence is the correspondence between each spatial point in the first virtual scene and each spatial point in the second virtual scene.
  • Determining the rendering range corresponding to the simulated use shape in the first virtual scene may be done by determining, in the first virtual scene, the points corresponding to the key points that make up the simulated use shape, or to all points that make up the simulated use shape, and then determining the target rendering range of the target virtual resource in the first virtual scene according to the determined corresponding points.
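  • A minimal sketch of such a spatial position correspondence follows, modelled here as a uniform scale plus offset between the two scenes; both the model and the function names are assumptions, since the disclosure only requires that a point-to-point correspondence exists:

```python
def to_first_scene(point, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Map a point from second-scene coordinates to first-scene coordinates."""
    return tuple(p * scale + o for p, o in zip(point, offset))


def target_placement(reference_point, shape_key_points, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Derive the target placement point and the key points of the target rendering range."""
    target_point = to_first_scene(reference_point, scale, offset)
    target_range = [to_first_scene(p, scale, offset) for p in shape_key_points]
    return target_point, target_range


ref_point = (4.0, 0.0, 7.0)                 # reference placement point in the second scene
shape = [(3.0, 0.0, 6.0), (5.0, 0.0, 8.0)]  # key points of the simulated use shape
print(target_placement(ref_point, shape))   # identity correspondence: same coordinates
```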
  • FIG. 6 is a schematic diagram of placing the target virtual resource in the first virtual scene. According to the reference placement position and reference rendering range of the resource indicator 501 shown in FIG. 5, the target placement position 601 of the target virtual resource is determined in the first virtual scene shown in FIG. 6.
  • After determining the target placement point of the target virtual resource in the first virtual scene and the target rendering range generated by using the target virtual resource at the target placement point, the target virtual resource can be directly rendered at the target placement position, together with the target rendering range produced after the resource is used.
  • In some embodiments, before the reference placement position of the resource indicator is determined, the use of the target virtual resource in the first virtual scene can be canceled.
  • The specific method may be: in response to the use trigger operation on the target virtual resource, displaying a prop cancel area; and, in response to a trigger operation on the prop cancel area, converting the graphical user interface displaying the second virtual scene back into the graphical user interface displaying the first virtual scene.
  • the display position and display shape of the prop canceling area in the GUI displaying the second virtual scene may not be limited, and may be flexibly set according to actual conditions.
  • In some embodiments, multiple resource indicators may be generated in the second virtual scene at one time, and each resource indicator may be moved separately. When the reference placement position of each resource indicator has been determined, a position confirmation instruction is generated, so that the target placement positions of multiple target virtual resources are determined at one time and the target virtual resources are used at the multiple target placement positions at the same time.
  • In summary, when the player wants to place the target virtual resource at the target placement position in the first virtual scene, the player can trigger the use of the target virtual resource and then move the resource indicator in the displayed second virtual scene until it reaches the reference placement position corresponding to the target placement position. The terminal then determines the target placement position in the first virtual scene according to the reference placement position and directly places the target virtual resource at the target placement position, so that the player can accurately place the target virtual resource at the target position in the game scene, reducing the skill requirements for players to place virtual resources.
  • FIG. 7 is another schematic flow chart of the method for controlling delivery of virtual resources provided by the embodiment of the present application.
  • the specific process of this method can be as follows:
  • Step 701 displaying a first virtual scene and virtual objects located in the first virtual scene through a graphical user interface.
  • The game screen displayed on the display screen of the terminal is a graphical user interface displaying the first virtual scene. The first virtual scene may contain game props and/or multiple virtual objects (buildings, trees, mountains, etc.) included in the game world environment.
  • the placement positions of virtual objects such as buildings, mountains, and walls in the first virtual scene constitute the spatial layout of the first virtual scene.
  • Step 702 In response to the use trigger operation on the target virtual resource, display the second virtual scene and the resource indicator located in the second virtual scene through a graphical user interface.
  • The second virtual scene may be obtained by simplifying the first virtual scene before it is displayed.
  • Step 703 in response to a trigger operation on the movement control in the graphical user interface displaying the second virtual scene, control the movement of the resource indicator in the second virtual scene.
  • the mobile control includes a horizontal movement control and a vertical movement control.
  • In response to a touch operation on the horizontal movement control, the resource indicator is moved in the horizontal direction of the second virtual scene, and in response to a touch operation on the vertical movement control, the resource indicator is moved in the vertical direction of the second virtual scene.
  • Step 704. During the moving process of the resource indicator, according to the spatial layout of the second virtual scene, determine the rendering range of the simulated use shape of the resource indicator.
  • the spatial layout of the second virtual scene is obtained in real time, and the rendering range of the simulated use shape of the resource indicator is determined according to the spatial layout of the second virtual scene.
  • Step 705 in response to the instruction for confirming the position of the resource indicator, obtain the reference rendering range of the shape used in the simulation, and the position of the reference drop point in the second virtual scene.
  • In response to the position confirmation instruction for the resource indicator, the reference rendering range of the simulated use shape and the height and ground-projection coordinates of the reference drop point in the second virtual scene are obtained, and the position of the reference drop point in the second virtual scene is determined according to the height and the ground-projection coordinates.
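  • In other words, the reference drop point can be reconstructed from its ground projection and height, as in the trivial sketch below (a y-up coordinate convention is assumed):

```python
def reference_drop_point(ground_xz, height):
    """Combine ground-projection coordinates (x, z) with a height into a 3D point (y-up)."""
    x, z = ground_xz
    return (x, height, z)


print(reference_drop_point((12.0, 30.0), height=2.5))  # (12.0, 2.5, 30.0)
```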
  • Step 706 Determine the reference placement position of the resource indicator in the second virtual scene according to the position of the reference placement point in the second virtual scene and the reference rendering range of the simulated shape.
  • Step 707 Determine the target placement position of the target virtual resource in the first virtual scene according to the reference placement position, and place the target virtual resource at the target placement position of the first virtual scene.
  • Specifically, the graphical user interface displaying the second virtual scene is switched to the graphical user interface displaying the first virtual scene, and according to the spatial position correspondence between the first virtual scene and the second virtual scene and the reference placement position, the target placement position of the target virtual resource is determined in the first virtual scene, and the target virtual resource is placed at the target placement position of the first virtual scene.
  • In summary, when the player wants to place the target virtual resource at the target placement position in the first virtual scene, the player can trigger the use of the target virtual resource, move the resource indicator in the displayed second virtual scene to the reference placement position corresponding to the target placement position, and have the terminal determine the target placement position in the first virtual scene according to the reference placement position and directly place the target virtual resource there, so that the player can accurately place the target virtual resource at the target position in the game scene, reducing the skill requirements for players to place virtual resources.
  • FIG. 8 is a schematic structural diagram of a device for controlling placement of virtual resources provided by an embodiment of the present application.
  • the device for controlling placement of virtual resources may include a first display unit 801 , a second display unit 802 , a moving unit 803 , a determination unit 804 and a placement unit 805 .
  • the first display unit 801 is configured to display the first virtual scene and the virtual objects located in the first virtual scene through a graphical user interface, and the virtual objects are configured to perform game behaviors in response to touch operations on the graphical user interface;
  • The second display unit 802 is configured to display, through the graphical user interface, a second virtual scene and a resource indicator located in the second virtual scene in response to a use trigger operation on the target virtual resource, where the resource indicator is used to visually indicate the placement position of the target virtual resource;
  • a moving unit 803, configured to control the movement of the resource indicator in the second virtual scene in response to a moving operation on the resource indicator
  • a determining unit 804 configured to determine a reference placement position of the resource indicator in the second virtual scene in response to a position confirmation instruction for the resource indicator
  • the placement unit 805 is configured to determine a target placement position of the target virtual resource in the first virtual scene according to the reference placement position, and place the target virtual resource at the target placement position of the first virtual scene.
  • the second virtual scene has a scene layout corresponding to the first virtual scene.
  • the second virtual scene includes a second scene element
  • the second scene element is used to represent at least part of the first scene element in the first virtual scene
  • the position of the second scene element in the second virtual scene is used to represent The position of the first scene element in the first virtual scene.
  • the device also includes:
  • the display range of the second virtual scene in the graphical user interface is determined according to the position of the resource indicator in the second virtual scene.
  • the second display unit 802 is also used for:
  • the initial display range of the second virtual scene in the GUI is determined according to the position of the virtual object in the first virtual scene when the trigger operation occurs.
  • the second display unit 802 is also used for:
  • the initial position of the resource indicator in the second virtual scene is determined according to the position and/or orientation of the virtual object in the first virtual scene when the use trigger operation occurs.
  • the second display unit 802 is also used for:
  • the first virtual scene is hidden on the graphical user interface, and the second virtual scene is triggered to be displayed on the graphical user interface.
  • the second virtual scene is the first virtual scene with preset virtual objects hidden, and the preset virtual objects include one or more of player virtual characters, non-player virtual characters and/or virtual prop objects.
  • the second display unit 802 is also used for:
  • the second virtual scene is displayed through the second display area.
  • the graphical user interface displaying the second virtual scene includes a movement control for controlling the movement of the resource indicator in the second virtual scene
  • the movement control includes a horizontal movement control and a vertical movement control
  • the movement unit 803 is also used for:
  • in response to a touch operation on the horizontal movement control, control the resource indicator to move in the horizontal direction of the second virtual scene;
  • in response to a touch operation on the vertical movement control, control the resource indicator to move in the vertical direction of the second virtual scene.
  • the moving operation includes a dragging operation
  • the moving unit 803 is also used for:
  • in response to a drag operation on the resource indicator in the second virtual scene, control the resource indicator to move in the second virtual scene.
  • the mobile unit 803 is also used for:
  • a transition segment is displayed, which includes the second virtual scene as it changes while the resource indicator is moved.
  • the resource indicator includes a reference drop point and a simulated use shape.
  • the simulated use shape is used to simulate the rendered shape of the target virtual resource after it is used at the reference drop point, at different positions of the first virtual scene; the determination unit 804 is further configured to: determine the reference placement position of the resource indicator in the second virtual scene according to the position of the reference drop point in the second virtual scene and the reference rendering range of the simulated use shape.
  • the determining unit 804 is also used for:
  • the rendering range of the simulated use shape of the resource indicator is determined according to the delivery quantity of the target virtual resource and the spatial layout of the second virtual scene.
  • the target virtual resource includes a target delivery point and a target usage shape
  • the target usage shape includes a rendering shape after using the target virtual resource at the target delivery point
  • the delivery unit 805 is also used for:
  • determine the position in the first virtual scene corresponding to the reference placement point, and use it as the target placement point;
  • determine the rendering range in the first virtual scene corresponding to the simulated use shape, and use it as the target rendering range of the target usage shape;
  • determine the target placement position of the target virtual resource in the first virtual scene according to the target placement point and the target rendering range.
  • the device is also used for:
  • after the target virtual resource is placed, the first virtual scene is displayed through the graphical user interface.
  • the graphical user interface displaying the first virtual scene includes an attack control
  • the attack control is used to instruct the virtual object to launch an attack in the first virtual scene
  • the determining unit 804 is further configured to:
  • convert the attack control into a position determination control of the resource indicator, and, in response to a touch operation on the position determination control, generate a position confirmation instruction for the resource indicator.
  • In summary, when the player wants to place the target virtual resource at the target placement position in the first virtual scene, the player can trigger the use of the target virtual resource, move the resource indicator in the displayed second virtual scene to the reference placement position corresponding to the target placement position, and have the terminal determine the target placement position in the first virtual scene according to the reference placement position and directly place the target virtual resource there, so that the player can accurately place the target virtual resource at the target position in the game scene, reducing the skill requirements for players to place virtual resources.
  • FIG. 9 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • the computer device 900 includes a processor 901 with one or more processing cores, a memory 902 with one or more computer-readable storage media, and computer programs stored in the memory 902 and operable on the processor.
  • the processor 901 is electrically connected with the memory 902 .
  • The structure of the computer device shown in the figure does not constitute a limitation on the computer device, which may include more or fewer components than shown, or combine some components, or have a different arrangement of components.
  • The processor 901 is the control center of the computer device 900. It connects the various parts of the entire computer device 900 through various interfaces and lines, runs or loads the software programs and/or modules stored in the memory 902, and calls the data stored in the memory 902 to execute the various functions of the computer device 900 and process data, thereby monitoring the computer device 900 as a whole.
  • The processor 901 in the computer device 900 loads the instructions corresponding to the processes of one or more application programs into the memory 902, and the processor 901 runs the application programs stored in the memory 902, so as to realize the following functions:
  • a first virtual scene and a virtual object located in the first virtual scene are displayed through a graphical user interface, and the virtual object is configured to execute a game behavior in response to a touch operation on the graphical user interface;
  • in response to a use trigger operation on a target virtual resource, a second virtual scene and a resource indicator located in the second virtual scene are displayed through the graphical user interface, wherein the resource indicator is used to visually indicate the placement position of the target virtual resource;
  • in response to a movement operation on the resource indicator, the resource indicator is controlled to move in the second virtual scene; in response to a position confirmation instruction for the resource indicator, the reference placement position of the resource indicator in the second virtual scene is determined; the target placement position of the target virtual resource is determined in the first virtual scene according to the reference placement position, and the target virtual resource is placed at the target placement position of the first virtual scene.
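As a rough illustration of the flow just listed, the following sketch strings the steps together in one small controller. It is only a sketch: the engine hooks `map_to_first_scene` and `spawn_resource`, and the class name `PlacementController`, are assumptions made for the example rather than anything prescribed by the application.

```python
class PlacementController:
    """Minimal sketch of the placement control flow described above."""

    def __init__(self, map_to_first_scene, spawn_resource):
        self.map_to_first_scene = map_to_first_scene  # second scene -> first scene
        self.spawn_resource = spawn_resource          # places the resource in scene 1
        self.indicator_pos = None
        self.preview_active = False

    def on_use_trigger(self, avatar_pos):
        # Show the second (preview) scene; start the indicator at the avatar's
        # position so the player refines the placement from a familiar spot.
        self.preview_active = True
        self.indicator_pos = avatar_pos

    def on_move(self, dx, dy, dz):
        # Movement operation: shift the indicator inside the preview scene.
        if not self.preview_active:
            return
        x, y, z = self.indicator_pos
        self.indicator_pos = (x + dx, y + dy, z + dz)

    def on_confirm(self):
        # Position confirmation: freeze the reference placement position,
        # map it into the first scene and place the resource there.
        if not self.preview_active:
            return None
        reference_pos = self.indicator_pos
        target_pos = self.map_to_first_scene(reference_pos)
        self.spawn_resource(target_pos)
        self.preview_active = False
        return target_pos

if __name__ == "__main__":
    ctl = PlacementController(
        map_to_first_scene=lambda p: p,            # identity mapping for the demo
        spawn_resource=lambda p: print("placed at", p),
    )
    ctl.on_use_trigger(avatar_pos=(10.0, 0.0, 2.0))
    ctl.on_move(5.0, 0.0, 1.0)
    ctl.on_confirm()
```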
  • the computer device 900 further includes: a touch screen 903 , a radio frequency circuit 904 , an audio circuit 905 , an input unit 906 and a power supply 907 .
  • the processor 901 is electrically connected to the touch screen 903 , the radio frequency circuit 904 , the audio circuit 905 , the input unit 906 and the power supply 907 respectively.
  • Those skilled in the art can understand that the computer device structure shown in FIG. 9 does not constitute a limitation on the computer device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
  • the touch display screen 903 can be used to display a graphical user interface and receive operation instructions generated by the user acting on the graphical user interface.
  • the touch display screen 903 may include a display panel and a touch panel.
  • the display panel can be used to display information input by or provided to the user and various graphical user interfaces of the computer equipment. These graphical user interfaces can be composed of graphics, text, icons, videos and any combination thereof.
  • Optionally, the display panel may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
  • the touch panel can be used to collect the user's touch operations on or near it (such as operations performed by the user on or near the touch panel with a finger, a stylus, or any other suitable object or accessory), generate corresponding operation instructions, and execute the corresponding program according to those operation instructions.
  • the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 901, and it can also receive and execute commands sent by the processor 901.
  • the touch panel can cover the display panel; when the touch panel detects a touch operation on or near it, it passes the operation to the processor 901 to determine the type of the touch event, and the processor 901 then provides the corresponding visual output on the display panel according to the type of the touch event (a minimal dispatch sketch is given after the notes on panel integration below).
  • the touch panel and the display panel can be integrated into the touch display screen 903 to realize input and output functions.
  • In some embodiments, however, the touch panel and the display panel can be implemented as two independent components to provide the input and output functions. That is, the touch display screen 903 can also serve as a part of the input unit 906 to implement an input function.
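A minimal sketch of the division of labour described above: the touch controller delivers contact coordinates, and the processor decides the event type and the visual response. The event names and the `dispatch` helper are hypothetical and are not part of the described device.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class TouchEvent:
    x: float
    y: float
    kind: str  # "down", "move" or "up" -- illustrative event types

def dispatch(event: TouchEvent, handlers: Dict[str, Callable[[float, float], None]]) -> None:
    """Route a touch event, already converted to contact coordinates by the
    touch controller, to the handler that produces the visual output."""
    handler = handlers.get(event.kind)
    if handler is not None:
        handler(event.x, event.y)

if __name__ == "__main__":
    handlers = {
        "down": lambda x, y: print(f"highlight control at ({x}, {y})"),
        "up":   lambda x, y: print(f"activate control at ({x}, {y})"),
    }
    dispatch(TouchEvent(120.0, 340.0, "down"), handlers)
    dispatch(TouchEvent(120.0, 340.0, "up"), handlers)
```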
  • the radio frequency circuit 904 can be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or other computer devices and to exchange signals with the network device or the other computer devices.
  • the audio circuit 905 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone.
  • the audio circuit 905 can convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 905 receives and converts into audio data; after being processed by the processor 901, the audio data is sent through the radio frequency circuit 904 to, for example, another computer device, or output to the memory 902 for further processing.
  • the audio circuit 905 may also include an earphone jack to allow peripheral headphones to communicate with the computer device.
  • the input unit 906 can be used to receive input numbers, character information or user characteristic information (such as fingerprints, iris, face information, etc.), and generate keyboard, mouse, joystick, optical or trackball signal input related to user settings and function control .
  • the power supply 907 is used to supply power to various components of the computer device 900 .
  • the power supply 907 may be logically connected to the processor 901 through a power management system, so as to implement functions such as management of charging, discharging, and power consumption through the power management system.
  • the power supply 907 may also include one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other such components.
  • the computer device 900 may also include a camera, a sensor, a Wi-Fi module, a Bluetooth module, etc., which will not be repeated here.
  • With the computer device above, when the player wants to place the target virtual resource at a target placement position in the first virtual scene, the player can perform a use trigger operation on the target virtual resource and then move the resource indicator in the second virtual scene, which shares the spatial layout of the first virtual scene, until it reaches the reference placement position corresponding to the target placement position. The terminal can then determine the target placement position in the first virtual scene according to the reference placement position and directly place the target virtual resource at that position, so that the player can accurately place the target virtual resource at the target placement position of the game scene, reducing the skill required of players to place virtual resources.
  • An embodiment of the present application provides a computer-readable storage medium storing a plurality of computer programs that can be loaded by a processor to execute the steps in any of the virtual resource delivery control methods provided by the embodiments of the present application. For example, the computer program can perform the following steps:
  • a first virtual scene and a virtual object located in the first virtual scene are displayed through a graphical user interface, and the virtual object is configured to execute a game behavior in response to a touch operation on the graphical user interface;
  • in response to a use trigger operation on a target virtual resource, a second virtual scene and a resource indicator located in the second virtual scene are displayed through the graphical user interface, wherein the resource indicator is used to visually indicate the placement position of the target virtual resource;
  • in response to a movement operation on the resource indicator, the resource indicator is controlled to move in the second virtual scene; in response to a position confirmation instruction for the resource indicator, the reference placement position of the resource indicator in the second virtual scene is determined; the target placement position of the target virtual resource is determined in the first virtual scene according to the reference placement position, and the target virtual resource is placed at the target placement position of the first virtual scene.
  • the storage medium may include a read-only memory (Read Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disc, or the like.
  • Since the computer program stored in the storage medium can execute the steps in any of the virtual resource delivery control methods provided by the embodiments of the present application, it can achieve the beneficial effects that can be achieved by any of those methods; these are described in detail in the previous embodiments and are not repeated here.

Abstract

本申请公开一种虚拟资源的投放控制方法、装置、计算机设备及存储介质,当玩家想要在第一虚拟场景中的目标投放位置投放目标虚拟资源时,可以设置参照投放位置,终端根据参照投放位置在第一虚拟场景确定目标投放位置,直接在目标投放位置投放目标虚拟资源,使得玩家可以将目标虚拟资源精准地投放至游戏场景的目标投放位置。

Description

虚拟资源的投放控制方法、装置、计算机设备及存储介质
本申请要求于2021年7月30日提交中国专利局、申请号为202110872438.9、发明名称为“虚拟资源的投放控制方法、装置、计算机设备及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及游戏技术领域,具体涉及一种虚拟资源的投放控制方法、装置、计算机设备及存储介质。
背景技术
随着科技的发展,依托于电子设备平台运行的电子游戏成为人们休闲娱乐的重要活动,例如,第一人称射击游戏和第三人称射击游戏等。为了增加游戏的趣味性,虚拟道具和/或技能等虚拟资源的使用是电子游戏的重要玩法,例如在电子游戏的虚拟场景中投放烟雾弹、手榴弹等等。然而,在电子游戏中,对玩家投放虚拟资源的技巧有较高的要求,玩家将虚拟资源精准地投放至游戏场景的某一位置有很大的难度。
技术问题
本申请实施例提供一种虚拟资源的投放控制方法、装置、计算机设备及存储介质,可以使游戏中的玩家将目标虚拟资源精准地投放至虚拟场景的某一位置。
技术解决方案
第一方面,本申请实施例提供一种虚拟资源的投放控制方法,包括:
通过图形用户界面显示第一虚拟场景和位于所述第一虚拟场景中的虚拟对象,所述虚拟对象配置为响应针对所述图形用户界面的触控操作执行游戏行为;
响应于对目标虚拟资源的使用触发操作,通过所述图形用户界面显示第二虚拟场景和位于所述第二虚拟场景中的资源指示器,其中,所述资源指示器用于可视化指示所述目标虚拟资源的投放位置;
响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动;
响应于对所述资源指示器的位置确认指令,确定所述资源指示器在所述第二虚拟场景中的参照投放位置;
根据所述参照投放位置在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置,并在所述第一虚拟场景的目标投放位置投放所述目标虚拟资源。
第二方面,本申请实施例还提供一种虚拟投掷资源的投放装置,包括:
第一显示单元,用于通过图形用户界面显示第一虚拟场景和位于所述第一虚拟场景中的虚拟对象,所述虚拟对象配置为响应针对所述图形用户界面的触控操作执行游戏行为;
第二显示单元,用于响应于对目标虚拟资源的使用触发操作,通过所述图形用户界面显示第二虚拟场景和位于所述第二虚拟场景中的资源指示器,其中,所述资源指示器用于可视化指示所述目标虚拟资源的投放位置;
移动单元,用于响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动;
确定单元,用于响应于对所述资源指示器的位置确认指令,确定所述资源指示器在所述第二虚拟场景中的参照投放位置;
投放单元,用于根据所述参照投放位置在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置,并在所述第一虚拟场景的目标投放位置投放所述目标虚拟资源。
可选的,所述第二虚拟场景具有与所述第一虚拟场景相对应的场景布局。
可选的,所述第二虚拟场景中包含第二场景元素,所述第二场景元素用于表征至少部分所述第一虚拟场景中的第一场景元素,所述第二场景元素在所述第二虚拟场景中的位置用于表征所述第一场景元素在所述第一虚拟场景中的位置。
可选的,所述装置还包括:
根据所述资源指示器在所述第二虚拟场景中的位置,确定所述第二虚拟场景在所述图形用户界面中的显示范围。
可选的,所述第二显示单元还用于:
根据所述使用触发操作发生时所述虚拟对象在所述第一虚拟场景中的位置,确定所述第二虚拟场景在所述图形用户界面中的初始显示范围。
可选的,所述第二显示单元还用于:
根据所述使用触发操作发生时所述虚拟对象在所述第一虚拟场景中的位置和/或朝向,确定所述资源指示器在所述第二虚拟场景中的初始位置。
可选的,所述第二显示单元还用于:
在所述图形用户界面隐藏所述第一虚拟场景,触发在所述图形用户界面显示第二虚拟场景。
可选的,所述第二虚拟场景为隐藏预设虚拟对象后的第一虚拟场景,所述预设虚拟对象包括玩家虚拟角色、非玩家虚拟角色和/或虚拟道具对象中的一种或多种。
可选的,所述第二显示单元还用于:
在显示所述第一虚拟场景的所述图形用户界面中确定第二显示区域,所述第二显示区域的区域范围小于所述图形用户界面对应的区域范围;
通过所述第二显示区域显示所述第二虚拟场景。
可选的,所述显示所述第二虚拟场景的图形用户界面包括用于控制所述资源指示器在所述第二虚拟场景中移动的移动控件,所述移动控件包括水平移动控件和竖直移动控件,所述移动单元还用于:
响应于对所述水平移动控件的触控操作,控制所述资源指示器在所述第二虚拟场景的水平方向上移动;
响应于对所述竖直移动控件的触控操作,控制所述资源指示器在所述第二虚拟场景的竖直方向上移动。
可选的,所述移动操作包括拖拽操作,所述移动单元还用于:
响应于在所述第二虚拟场景中对所述资源指示器的所述拖拽操作,控制所述资源指示器在所述第二虚拟场景中移动。
可选的,所述移动单元还用于:
显示过渡片段,所述过渡片段包括所述资源指示器移动时变换的第二虚拟场景。
可选的,所述资源指示器包括参照投放点和模拟使用形状,所述模拟使用形状用于模拟所述目标虚拟资源在所述第一虚拟场景的不同位置时,基于所述参照投放点使用后的渲染形状,所述确定单元还用于:
在所述资源指示器的移动过程中,获取所述第二虚拟场景的空间布局;
根据所述第二虚拟场景的空间布局,确定所述资源指示器的模拟使用形状的渲染范围;
响应于对所述资源指示器的位置确认指令,获取所述模拟使用形状的参照渲染范围,以及所述参照投放点在所述第二虚拟场景中的高度和地面投影坐标;
根据所述高度、所述地面投影坐标,确定所述参照投放点在所述第二虚拟场景中的位置;
根据所述参照投放点在所述第二虚拟场景中的位置和所述模拟使用形状的参照渲染范围,确定所述资源指示器在所述第二虚拟场景中的参照投放位置。
可选的,所述确定单元还用于:
响应于所述目标虚拟资源的个数设置操作,确定所述目标虚拟资源的投放数量;
根据所述目标虚拟资源的投放数量和所述第二虚拟场景的空间布局,确定所述资源指示器的模拟使用形状的渲染范围。
可选的,所述目标虚拟资源包括目标投放点和目标使用形状,所述目标使用形状包括在所述目标投放点使用所述目标虚拟资源后的渲染形状,所述投放单元还用于:
获取所述第一虚拟场景和所述第二虚拟场景的空间位置对应关系;
根据所述参照投放位置和所述空间位置对应关系,在所述第一虚拟场景中确定所述参照投放点对应的位置,为所述目标投放点的目标投放点;
根据所述参照投放位置和所述空间位置对应关系,在所述第一虚拟场景中确定所述模拟使用形状对应的渲染范围,为所述目标使用形状的目标渲染范围;
根据所述目标投放点和所述目标渲染范围,在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置。
可选的,所述装置还用于:
响应于对所述目标虚拟资源的使用触发操作,显示道具取消区域;
响应于对所述道具取消区域的触发操作,显示所述第一虚拟场景。
可选的,所述显示第一虚拟场景的图形用户界面包括攻击控件,所述攻击控件用于指示所述虚拟对象在所述第一虚拟场景中发动攻击,所述确定单元还用于:
将所述攻击控件转换为所述资源指示器的位置确定控件;
响应于对所述位置确定控件的触控操作,产生所述资源指示器的位置确认指令。
有益效果
本申请实施例提供一种虚拟资源的投放控制方法、装置、计算机设备及存储介质,当玩家想要在第一虚拟场景中的目标投放位置投放目标虚拟资源时,可以做出目标虚拟资源的使用触发操作,然后在出现的第二虚拟场景中移动资源指示器,直至移动到第二虚拟场景中与目标投放位置对应的参照投放位置,使得终端可以根据参照投放位置在第一虚拟场景中确定目标投放位置,直接在目标投放位置投放目标虚拟资源,使得玩家可以将目标虚拟资源精准地投放至游戏场景的目标投放位置,降低对玩家投放虚拟资源的技巧要求。
附图说明
图1是本申请实施例提供的虚拟投掷资源的投放装置的系统示意图;
图2是本申请实施例提供的虚拟资源的投放控制方法的流程示意图;
图3是本申请实施例提供的显示第一虚拟场景的图形用户界面的示意图;
图4是本申请实施例提供的响应使用触发操作显示的显示所述第二虚拟场景的图形用户界面的示意图;
图5是本申请实施例提供的参照投放位置在第二虚拟场景中的示意图;
图6是本申请实施例提供的在第一虚拟场景投放目标虚拟资源的示意图;
图7是本申请实施例提供的虚拟资源的投放控制方法的另一流程示意图;
图8是本申请实施例提供的虚拟资源的投放控制装置的结构示意图;
图9是本申请实施例提供的计算机设备的结构示意图。
本发明的实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。显然,所描述的实施例仅仅是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
本申请实施例提供一种虚拟资源的投放控制方法、装置、计算机设备及存储介质。具体地,本申请实施例的虚拟资源的投放控制方法可以由计算机设备执行,其中,该计算机设备可以为终端或者服务器等设备。该终端可以为智能手机、平板电脑、笔记本电脑、触控屏幕、游戏机、个人计算机(Personal Computer,PC)、个人数字助理(Personal Digital Assistant,PDA)等终端设备,终端还可以包括客户端,该客户端可以是游戏应用客户端、携带有游戏程序的浏览器客户端或即时通信客户端等。服务器可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络服务、以及大数据和人工智能平台等基础云计算服务的云服务器。
例如,当该虚拟资源的投放控制方法运行于终端时,终端设备存储有游戏应用程序并用于呈现图形用户界面中的虚拟场景。例如,通过终端设备下载安装游戏应用程序并运行,在图形用户界面显示虚拟场景。该终端设备将虚拟场景提供给用户的方式可以包括多种,例如,可以渲染显示在终端设备的显示屏上,或者,通过全息投影呈现虚拟场景。例如,终端设备可以包括触控显示屏和处理器,触控显示屏用于呈现虚拟场景以及接收用户作用于图形用户界面产生的操作指令,处理器用于运行该游戏、生成游戏画面、响应操作指令,以及控制图形用户界面和虚拟场景在触控显示屏上显示。
例如,当该虚拟资源的投放控制方法运行于服务器时,可以为云游戏。云游戏是指以云计算为基础的游戏方式。在云游戏的运行模式下,游戏应用程序的运行主体和游戏画面呈现主体是分离的,虚拟资源的投放控制方法的储存与运行是在云游戏服务器上完成的。而游戏画面呈现是在云游戏的客户端完成的,云游戏客户端主要用于游戏数据的接收、发送以及游戏画面呈现,例如,云游戏客户端可以是靠近用户侧的具有数据传输功能的显示设备,如,移动终端、电视机、计算机、掌上电脑、个人数字助理等,但是进行游戏数据处理的终端设备为云端的云游戏服务器。在进行游戏时,用户操作云游戏客户端向云游戏服务器发送操作指令,云游戏服务器根据操作指令运行游戏,将游戏画面等数据进行编码压缩,通过网络返回云游戏客户端,最后,通过云游戏客户端进行解码并输出游戏画面。
请参阅图1,图1为本申请实施例提供的虚拟资源的投放控制装置的系统示意图。该系统可以包括至少一个终端101和至少一个游戏服务器102。用户持有的终端101可以通过不同的网络103连接到不同游戏的游戏服务器102,例如,网络可以为无线网络或者有线网络,无线网络可以为无线局域网(WLAN)、局域网(LAN)、蜂窝网络、2G网络、3G网络、4G网络、5G网络等,终端用于显示通过图形用户界面显示第一虚拟场景和位于第一虚拟场景中的虚拟对象,虚拟对象配置为响应针对图形用户界面的触控操作执行游戏行为;响应于对目标虚拟资源的使用触发操作,通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器,其中,资源指示器用于可视化指示目标虚拟资源的投放位置;响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动;响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置;根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置,并在第一虚拟场景的目标投放位置投放目标虚拟资源。
游戏服务器用于向终端发送图形用户界面。
以下分别进行详细说明。需说明的是,以下实施例的描述顺序不作为对实施例优选顺序的限定。
本实施例将从虚拟资源的投放控制装置的角度进行描述,该虚拟资源的投放控制装置具体可以集成在终端设备中,该终端设备可以包括智能手机、笔记本电脑、平板电脑以及个人计算机等设备。
本申请实施例提供的一种虚拟资源的投放控制方法,该方法可以由终端的处理器执行,如图2所示,该虚拟资源的投放控制方法的具体流程主要包括步骤201至步骤205,详细说明如下:
步骤201、通过图形用户界面显示第一虚拟场景和位于第一虚拟场景中的虚拟对象,虚拟对象配置为响应针对图形用户界面的触控操作执行游戏行为。
在本申请实施例中,显示第一虚拟场景的图形用户界面为终端执行游戏应用程序之后,在终端的显示屏幕显示的游戏画面,显示第一虚拟场景的图形用户界面的第一虚拟场景中可以具有游戏道具,和/或构成游戏世界环境所包含的多个虚拟物体等(建筑、树木、山川等等)。第一虚拟场景中的建筑、山川、墙壁等虚拟物体的摆放位置,构成第一虚拟场景的空间布局。此外,游戏应用程序对应的游戏可以是第一人称射击游戏、多人在线角色扮演游戏等等。例如,如图3所示的显示第一虚拟场景的图形用户界面的示意图中,可以包括由虚拟建筑308、4个虚拟集装箱组成的障碍物306和5个集装箱组成的障碍物307,显示第一虚拟场景的图形用户界面中还可以包括用于控制虚拟对象移动的移动控件301、用于触发目标虚拟资源的使用触发操作的资源控件305、控制虚拟对象进行攻击的攻击控件303和其他技能控件304。
在本申请实施例中,虚拟对象可以是玩家通过游戏应用程序操作的游戏角色。例如,虚拟对象可以是虚拟人物(比如仿真人物或动漫人物)、虚拟动物等。虚拟对象在第一虚拟场景中的游戏行为包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、飞行、跳跃、驾驶、拾取、射击、攻击、投掷、释放技能中的至少一种。
步骤202、响应于对目标虚拟资源的使用触发操作,通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器,其中,资源指示器用于可视化指示目标虚拟资源的投放位置。
在本申请实施例中,为了方便玩家控制虚拟对象对远处的敌方进行远程攻击,可以在游戏中设置虚拟资源,虚拟资源可以包括道具和技能,虚拟资源可以是需要进行投掷的资源,例如集束炸弹、集束导弹、烟雾弹等,玩家可以控制虚拟对象向视野范围内某一地点投掷集束炸弹,从而在玩家所选定的范围内发生连续多次爆炸,能够快速击败较多的玩家。虚拟资源可以由虚拟对象直接投放,也可以通过虚拟载具进行投放。
在本申请实施例中,目标虚拟资源的使用触发操作为虚拟对象在虚拟场景中使用目标虚拟资源时所需要做出的操作,不同虚拟资源的使用触发操作可以相同,也可以不同。使用触发操作可以是点击、长按、和/或双击等操作。
在本申请实施例中,显示第一虚拟场景的图形用户界面中可以包括资源触发控件,当玩家对资源触发控件执行触控操作,可以触发目标虚拟资源的使用触发操作。此外,不同的虚拟资源可以对应同一个资源触发控件,也可以对应不同的资源触发控件。
在本申请实施例中,当玩家做出了使用触发操作,显示第二虚拟场景,第二虚拟场景具有与第一虚拟场景相对应的场景布局,可以是利用虚拟仿真实体模仿第一虚拟场景中的建筑物、墙壁、山川等等的全部虚拟实体,各个虚拟仿真实体在第二虚拟场景中的布局与对应的虚拟实体在第一虚拟场景中的布局相同。此外,第二虚拟场景中的虚拟仿真实体的形状,与第一虚拟场景中对应的虚拟实体的形状相同,然而,第一虚拟场景中的虚拟实体的表面,具有与现实生活中相应物体一样的颜色、纹理等等,第二虚拟场景中的虚拟仿真实体不具有所仿真虚拟实体的颜色、纹理等,根据虚拟仿真实体形成第二虚拟场景。各个虚拟仿真实体在第二虚拟场景中的相对位置关系,与第一虚拟场景中各个虚拟实体之间的相对位置关系相同,在第二虚拟场景中的各个虚拟仿真实体的大小,可以与第一虚拟场景中对应的虚拟实体相同,也可以根据第一虚拟场景中对应的虚拟实体等比例缩小或放大。
在本申请实施例中,第二虚拟场景中包含第二场景元素,第二场景元素用于表征至少部分第一虚拟场景中的第一场景元素,第二场景元素在第二虚拟场景中的位置用于表征第一场景元素在第一虚拟场景中的位置。也就是说,第二场景元素一一对应第一场景元素,第二场景元素在第二虚拟场景中的位置与其对应的第一场景元素在第一虚拟场景中的位置相同,第二场景元素与对应的第一场景元素的形状大小等属性相同。其中,第一场景元素和第二场景元素可以是虚拟场景中的虚拟建筑物、虚拟墙壁、虚拟河流等。
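The correspondence just described — same entities, same relative positions (optionally uniformly scaled), with appearance stripped — can be sketched in a few lines. The sketch below is only illustrative; the entity fields and the `build_preview_scene` helper are assumptions for the example, not an engine API named in the application.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SceneEntity:
    name: str
    position: Tuple[float, float, float]
    size: Tuple[float, float, float]
    texture: str = ""  # appearance data is dropped in the preview scene

def build_preview_scene(first_scene: List[SceneEntity], scale: float = 1.0) -> List[SceneEntity]:
    """Clone the layout of the first virtual scene into a second scene:
    same entities and relative positions (optionally uniformly scaled),
    but without colour/texture information."""
    return [
        SceneEntity(
            name=e.name,
            position=tuple(c * scale for c in e.position),
            size=tuple(c * scale for c in e.size),
            texture="",  # untextured stand-in entity
        )
        for e in first_scene
    ]

if __name__ == "__main__":
    first = [SceneEntity("building", (10, 0, 0), (4, 4, 8), "brick"),
             SceneEntity("container", (20, 5, 0), (2, 2, 2), "metal")]
    for entity in build_preview_scene(first, scale=1.0):
        print(entity)
```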
在本申请实施例中,第二虚拟场景在图形用户界面中的显示范围与资源指示器绑定,当资源指示器在第二虚拟场景中移动时,第二虚拟场景在图形用户界面中的显示范围会随着资源指示器的移动而变化,因此,该方法还包括:根据资源指示器在第二虚拟场景中的位置,确定第二虚拟场景在图形用户界面中的显示范围。
在本申请实施例中,由于第二虚拟场景在图形用户界面中的显示范围会随着资源指示器在第二虚拟场景中的移动而发生变化,所以说在上述步骤“通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器”之前,需要先确定第二虚拟场景的初始显示范围,具体可以是:根据使用触发操作发生时虚拟对象在第一虚拟场景中的位置,确定第二虚拟场景在图形用户界面中的初始显示范围。
在本申请的一种实施方式中,上述步骤中“通过图形用户界面显示第二虚拟场景”可以是:在图形用户界面隐藏第一虚拟场景,触发在图形用户界面显示第二虚拟场景。
在本申请实施例中,第二虚拟场景可以不是一个新增的,独立于第一虚拟场景的虚拟场景,而是由第一虚拟场景经过简化后得到,具体的,第二虚拟场景可以是隐藏预设虚拟对象后的第一虚拟场景,预设虚拟对象包括玩家虚拟角色、非玩家虚拟角色和/或虚拟道具对象中的一种或多种。其中,玩家虚拟角色可以是当前玩家操作的虚拟对象和/或参与游戏的其他玩家操作的玩家虚拟角色,非玩家虚拟角色可以是由终端操作而不由参与游戏的玩家所操作的非玩家虚拟角色,虚拟道具对象可以是游戏中对玩家虚拟角色产生辅助作用的虚拟对象,例如,攻击武器、骑行坐骑等等。
在本申请实施例中,也可以在图形用户界面中同时显示第一虚拟场景和第二虚拟场景,此时,上述步骤“通过图形用户界面显示第二虚拟场景”可以是:在显示第一虚拟场景的图形用户界面中确定第二显示区域,第二显示区域的区域范围小于图形用户界面对应的区域范围;通过第二显示区域显示第二虚拟场景。此外,第二显示区域在图形用户界面中的位置、大小等不受限制,可以根据实际情况灵活设置。
在本申请的一种实施方式中,资源指示器用于指示目标虚拟资源在第二虚拟场景中的参照投放位置,资源指示器可以与目标虚拟资源的形状、大小相同,也可以与目标虚拟资源的形状、大小不同。
在本申请实施例中,显示第一虚拟场景的图形用户界面中包括位于第一虚拟场景中的虚拟对象,显示第二虚拟场景的图形用户界面中可以不包括位于第二虚拟场景中的虚拟对象,从而避免在通过资源指示器设置目标虚拟资源的目标投放位置时,能够在第二虚拟场景中看到敌方虚拟对象的位置,保证游戏的公平性。
例如,如图4所示为响应使用触发操作显示的显示第二虚拟场景的图形用户界面的示意图中,第二虚拟场景中可以包括根据虚拟建筑308形成的虚拟仿真实体408、根据障碍物306生成的虚拟仿真实体407、根据障碍物307生成的虚拟仿真实体407和资源指示器409。显示第二虚拟场景的图形用户界面中还包括用于控制资源指示器409在第二虚拟场景中水平移动的水平移动控件401、控制资源指示器409在第二虚拟场景中竖直移动的竖直移动控件403和竖直移动控件404、用于触发目标虚拟资源的使用触发操作的资源控件406、取消本次目标虚拟资源投放的取消控件402和确定产生位置确定指令的位置确定控件405。
步骤203、响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动。
在本申请实施例中,玩家可以通过操作移动控件来移动资源指示器,显示第二虚拟场景的图形用户界面包括用于控制资源指示器在第二虚拟场景中移动的移动控件,移动控件包括水平移动控件和竖直移动控件,在此情况下,上述步骤203中“响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动”可以是:
响应于对水平移动控件的触控操作,控制资源指示器在第二虚拟场景的水平方向上移动;
响应于对竖直移动控件的触控操作,控制资源指示器在第二虚拟场景的竖直方向上移动。
在本申请实施例中,在显示第一虚拟场景的图形用户界面中可以包括用于控制虚拟对象在第一虚拟场景中移动的对象移动控件,当响应于对目标虚拟资源的使用触发操作,显示第二虚拟场景的图形用户界面之后,可以将显示第一虚拟场景的图形用户界面中的对象移动控件变为移动资源指示器的移动控件,从而实现在产生第二虚拟场景后,只能进行资源指示器的移动,而不能进行虚拟对象的移动。
在本申请实施例中,玩家还可以直接在终端屏幕上通过手指或鼠标指示光标等拖拽资源指示器的方法来移动资源指示器,在此情况下,移动操作包括拖拽操作,上述步骤203“响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动”可以是:
响应于在第二虚拟场景中对资源指示器的拖拽操作,控制资源指示器在第二虚拟场景中移动。
在本申请实施例中,为了根据第二虚拟场景的空间布局更好地确定在第二虚拟场景中,资源指示器最终的参照投放位置,可以在移动资源指示器时,显示第二虚拟场景的变化过程,在此情况下,上述步骤203中“响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动”之后可以包括:显示过渡片段,过渡片段包括资源指示器移动时变换的第二虚拟场景。
在本申请实施例中,当玩家对资源指示器触发移动操作时,资源指示器会在第二虚拟场景中移动,在玩家对资源指示器执行移动操作之前,需要确定资源指示器在第二虚拟场景的初始显示范围中的位置,具体可以是:根据使用触发操作发生时虚拟对象在第一虚拟场景中的位置和/或朝向,确定资源指示器在第二虚拟场景中的初始位置。
在本申请实施例中,游戏运行过程时,终端可以同时加载第一虚拟场景和第二虚拟场景,在玩家做出使用触发操作之前,图形用户界面显示第一虚拟场景。当玩家做出对目标虚拟资源的使用触发操作时,终端可以在图形用户界面中隐藏第一虚拟场景,在图形用户界面中显示第二虚拟场景。终端确定资源指示器在第二虚拟场景的初始显示范围中的初始位置时,可以先获取虚拟对象在第一虚拟场景中的空间坐标为(x,y,z),在图形用户界面中显示第二虚拟场景后,确定资源指示器的参照投放点在第二虚拟场景中的空间坐标,可以是虚拟对象在第一虚拟场景中的空间坐标(x,y,z),结合第一虚拟场景和第二虚拟场景的偏移量(xd,yd,zd),即资源指示器的初始位置(x1,y1,z1)=(x,y,z)+(xd,yd,zd),接着玩家在第二虚拟场景中操作资源指示器。其中,第一虚拟场景和第二虚拟场景的偏移量可以设置为0,也可以根据实际情况灵活设置。
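A minimal sketch of the offset computation above, where the initial indicator position is the avatar's position plus the inter-scene offset (xd, yd, zd); as noted in the paragraph, the offset may simply be zero when both scenes share one coordinate frame.

```python
def initial_indicator_position(avatar_pos, scene_offset=(0.0, 0.0, 0.0)):
    """Compute the resource indicator's initial position in the second scene:
    (x1, y1, z1) = (x, y, z) + (xd, yd, zd)."""
    return tuple(a + d for a, d in zip(avatar_pos, scene_offset))

if __name__ == "__main__":
    print(initial_indicator_position((12.0, 7.5, 1.8)))                    # zero offset
    print(initial_indicator_position((12.0, 7.5, 1.8), (500.0, 0.0, 0.0))) # offset preview scene
```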
步骤204、响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置。
在本申请实施例中,为了使得玩家根据实际游戏战况,更好地将目标虚拟资源设置在对实际游戏战况有效的地方,可以使资源指示器包括参照投放点和模拟使用形状,其中,模拟使用形状用于模拟目标虚拟资源在第一虚拟场景的不同位置时,基于参照投放点使用后的渲染形状,从而使得玩家可以在第二虚拟场景中根据资源指示器看到目标虚拟资源使用后的渲染效果,进一步地根据资源指示器的渲染效果对实际游戏战况的影响,调整资源指示器在第二虚拟场景中的参照投放位置。在此情况下,上述步骤204中“响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置”可以是:
在资源指示器的移动过程中,获取第二虚拟场景的空间布局;
根据第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围;
响应于对资源指示器的位置确认指令,获取模拟使用形状的参照渲染范围,以及参照投放点在第二虚拟场景中的高度和地面投影坐标;
根据高度、地面投影坐标,确定参照投放点在第二虚拟场景中的位置;
根据参照投放点在第二虚拟场景中的位置和模拟使用形状的参照渲染范围,确定资源指示器在第二虚拟场景中的参照投放位置。
在本申请实施例中,在资源指示器的移动过程中,随着第二虚拟场景的空间布局的变化,和资源指示器在第二虚拟场景中的高度变化等,资源指示器的模拟使用形状的渲染范围会发生改变。例如,当目标虚拟资源为烟雾弹时,烟雾弹使用后的烟雾渲染范围会随着第一虚拟环境中的墙壁,以及烟雾弹的投放点在第一虚拟场景中的高度等发生改变,因此为了更好地使目标虚拟资源能够在第一虚拟场景中对实际游戏战况产生更有益的效果,需要根据资源指示器的参照投放点和模拟使用形状,来确定资源指示器在第二虚拟场景中的参照投放位置。
在本申请实施例中,在一个目标投放位置投放的目标虚拟资源的数量不同,目标虚拟资源使用后的目标渲染范围也会不同,因此还可以依据设置的目标虚拟资源的数量确定资源指示器在第二虚拟场景中,模拟使用形状的渲染范围。在此情况下,上述步骤“根据第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的形状”之前,还可以包括:响应于目标虚拟资源的个数设置操作,确定目标虚拟资源的投放数量。当确定在该目标投放位置目标虚拟资源的投放数量之后,上述步骤“根据第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围”可以包括:根据目标虚拟资源的投放数量和第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围。
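The dependence of the rendering range on both the delivery quantity and the spatial layout can be illustrated with a small sketch. It assumes the rendered range can be approximated as a disc around the delivery point, clipped by obstacles in the layout, with the radius scaled by the quantity; the square-root growth rule is purely illustrative and not specified by the application.

```python
import math
from typing import List, Tuple

Box2D = Tuple[Tuple[float, float], Tuple[float, float]]  # ((min_x, min_y), (max_x, max_y))

def inside(p: Tuple[float, float], box: Box2D) -> bool:
    (mn, mx) = box
    return mn[0] <= p[0] <= mx[0] and mn[1] <= p[1] <= mx[1]

def rendering_range(center: Tuple[float, float],
                    base_radius: float,
                    count: int,
                    obstacles: List[Box2D],
                    samples: int = 36,
                    step: float = 0.1) -> List[Tuple[float, float]]:
    """March outward from the delivery point in `samples` directions and stop
    either at the (count-scaled) radius or at the first obstacle, giving a
    polygon that approximates the rendered range of the simulated use shape."""
    radius = base_radius * math.sqrt(max(count, 1))  # more resources -> wider range (illustrative)
    boundary = []
    for i in range(samples):
        angle = 2 * math.pi * i / samples
        direction = (math.cos(angle), math.sin(angle))
        reach = 0.0
        while reach + step <= radius:
            candidate = (center[0] + (reach + step) * direction[0],
                         center[1] + (reach + step) * direction[1])
            if any(inside(candidate, box) for box in obstacles):
                break  # e.g. smoke stopped by a wall
            reach += step
        boundary.append((center[0] + reach * direction[0],
                         center[1] + reach * direction[1]))
    return boundary

if __name__ == "__main__":
    wall = ((2.0, -5.0), (2.5, 5.0))  # thin wall east of the drop point
    poly = rendering_range(center=(0.0, 0.0), base_radius=3.0, count=2, obstacles=[wall])
    print(len(poly), "boundary points; eastmost x =", max(p[0] for p in poly))
```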
在本申请实施例中,为了使玩家能够控制操作的虚拟对象与其他敌对虚拟对象进行攻击,在显示第一虚拟场景的图形用户界面中可以设置攻击控件,攻击控件用于指示虚拟对象在第一虚拟场景中发动攻击。当确定资源指示器在第二虚拟场景中的参照投放位置时,由于为了游戏的公平性,第二虚拟场景中可能不具有其他敌对虚拟资源,此时显示第一虚拟场景的图形用户界面中的攻击控件没有作用,为了简化显示第二虚拟场景的图形用户界面的图标设置,可以将攻击控件变为资源指示器的位置确定控件,此时,上述步骤“响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置”之前,还包括:将攻击控件转换为资源指示器的位置确定控件;响应于对位置确定控件的触控操作,产生资源指示器的位置确认指令。
例如,如图5所示的参照投放位置在第二虚拟场景中的示意图中,当资源指示器在第二虚拟场景中停止移动时,资源指示器在第二虚拟场景中的参照投放位置为如图5所示的资源指示器501所在的位置。
步骤205、根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置,并在第一虚拟场景的目标投放位置投放目标虚拟资源。
在本申请实施例中,目标虚拟资源包括目标投放点和目标使用形状,目标使用形状包括在目标投放点使用目标虚拟资源后的渲染形状,当在第二虚拟场景中确定资源指示器的参照投放点的投放位置和模拟使用形状后,上述步骤“将显示第二虚拟场景的图形用户界面切换为显示第一虚拟场景的图形用户界面,根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置”可以是:
获取第一虚拟场景和第二虚拟场景的空间位置对应关系;
根据参照投放位置和空间位置对应关系,在第一虚拟场景中确定参照投放点对应的位置,为目标投放点的目标投放点;
根据参照投放位置和空间位置对应关系,在第一虚拟场景中确定模拟使用形状对应的渲染范围,为目标使用形状的目标渲染范围;
根据目标投放点和目标渲染范围,在第一虚拟场景中确定目标虚拟资源的目标投放位置。
在本申请实施例中,空间位置对应关系是第一虚拟场景的各个空间点与第二虚拟场景中的各个空间点之间的对应关系。在第一虚拟场景中确定模拟使用形状对应的渲染范围,可以是确定组成模拟使用形状的关键点,或组成模拟使用形状的全部点在第一虚拟场景中的对应点,然后根据确定的对应关键点在第一虚拟场景中确定目标虚拟资源的目标渲染范围。例如,如图6所示为第一虚拟场景投放目标虚拟资源的示意图,根据如图5所示的资源指示器501的参照投放位置和参照渲染范围,在图6所示的第一虚拟场景中确定目标虚拟资源的目标投放位置601。
在本申请实施例中,当确定了目标虚拟资源在第一虚拟场景中的目标投放点,和在该目标投放点使用目标虚拟资源产生的目标渲染范围之后,可以直接在目标投放位置渲染目标虚拟资源使用后的目标渲染范围。
在本申请实施例中,当玩家触发目标虚拟资源的使用触发操作之后、确定资源指示器的参照投放位置之前,可以取消在第一虚拟场景中对目标虚拟资源的使用。此时,具体方法可以是:响应于对目标虚拟资源的使用触发操作,显示道具取消区域;响应于对道具取消区域的触发操作,将显示第二虚拟场景的图形用户界面转换为显示第一虚拟场景的图形用户界面。
在本申请实施例中,道具取消区域的在显示第二虚拟场景的图形用户界面中显示位置和显示形状可以不受限制,根据实际情况灵活设置。
在本申请实施例中,当响应于目标虚拟资源的使用触发操作,显示包含第二虚拟场景的显示第二虚拟场景的图形用户界面时,可以一次性在第二虚拟场景中产生多个资源指示器,移动各个资源指示器,当每个资源指示器的参照投放位置确定之后,产生位置确定指令,从而一次性确定多个目标虚拟资源的目标投放位置,同时在多个目标投放位置投放使用目标虚拟资源。
上述所有的技术方案,可以采用任意结合形成本申请的可选实施例,在此不再一一赘述。
本申请实施例提供的虚拟资源的投放控制方法,当玩家想要在第一虚拟场景中的目标投放位置投放目标虚拟资源时,可以做出目标虚拟资源的使用触发操作,然后在与第一虚拟场景具有相同空间布局的第二虚拟场景中移动资源指示器,直至移动到第二虚拟场景中与第一虚拟场景对应的参照投放位置,使得终端可以根据参照投放位置在第一虚拟场景中确定目标投放位置,直接在目标投放位置投放目标虚拟资源,使得玩家可以将目标虚拟资源精准地投放至游戏场景的目标投放位置,降低对玩家投放虚拟资源的技巧要求。
请参阅图7,图7为本申请实施例提供的虚拟资源的投放控制方法的另一流程示意图。该方法的具体流程可以如下:
步骤701、通过图形用户界面显示第一虚拟场景和位于第一虚拟场景中的虚拟对象。
例如,显示第一虚拟场景的图形用户界面为终端执行游戏应用程序之后,在终端的显示屏幕显示的游戏画面,显示第一虚拟场景的图形用户界面的第一虚拟场景中可以具有游戏道具,和/或构成游戏世界环境所包含的多个虚拟物体等(建筑、树木、山川等等)。第一虚拟场景中的建筑、山川、墙壁等虚拟物体的摆放位置,构成第一虚拟场景的空间布局。
步骤702、响应于对目标虚拟资源的使用触发操作,通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器。
例如,当玩家做出了使用触发操作,可以根据第一虚拟场景简化得到第二虚拟场景,从而显示第二虚拟场景。
步骤703、响应于对显示第二虚拟场景的图形用户界面中移动控件的触发操作,控制资源指示器在第二虚拟场景中移动。
例如,移动控件包括水平移动控件和竖直移动控件,响应于对水平移动控件的触控操作,在第二虚拟场景的水平方向上移动资源指示器,响应于对竖直移动控件的触控操作,在第二虚拟场景的竖直方向上移动资源指示器。
步骤704、在资源指示器的移动过程中,根据第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围。
例如,在触控移动控件移动资源指示器的过程中,实时获取第二虚拟场景的空间布局,根据第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围。
步骤705、响应于对资源指示器的位置确认指令,获取模拟使用形状的参照渲染范围,和参照投放点在第二虚拟场景中的位置。
例如,响应于对资源指示器的位置确认指令,获取模拟使用形状的参照渲染范围,以及参照投放点在第二虚拟场景中的高度和地面投影坐标,根据高度、地面投影坐标,确定参照投放点在第二虚拟场景中的位置。
步骤706、根据参照投放点在第二虚拟场景中的位置和模拟使用形状的参照渲染范围,确定资源指示器在第二虚拟场景中的参照投放位置。
步骤707、根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置,在第一虚拟场景的目标投放位置投放目标虚拟资源。
例如,获取第一虚拟场景和第二虚拟场景的空间位置对应关系,将显示第二虚拟场景的图形用户界面切换为显示第一虚拟场景的图形用户界面,根据空间位置对应关系和参照投放位置,在第一虚拟场景中确定目标虚拟资源的目标投放位置,在第一虚拟场景的目标投放位置投放目标虚拟资源。
上述所有的技术方案,可以采用任意结合形成本申请的可选实施例,在此不再一一赘述。
本申请实施例提供的虚拟资源的投放控制方法,当玩家想要在第一虚拟场景中的目标投放位置投放目标虚拟资源时,可以做出目标虚拟资源的使用触发操作,然后在与第一虚拟场景具有相同空间布局的第二虚拟场景中移动资源指示器,直至移动到第二虚拟场景中与第一虚拟场景对应的参照投放位置,使得终端可以根据参照投放位置在第一虚拟场景中确定目标投放位置,直接在目标投放位置投放目标虚拟资源,使得玩家可以将目标虚拟资源精准地投放至游戏场景的目标投放位置,降低对玩家投放虚拟资源的技巧要求。
为便于更好的实施本申请实施例的虚拟资源的投放控制方法,本申请实施例还提供一种虚拟资源的投放控制装置。请参阅图8,图8为本申请实施例提供的虚拟资源的投放控制装置的结构示意图。该虚拟资源的投放控制装置可以包括第一显示单元801、第二显示单元802、移动单元803、确定单元804和投放单元805。
其中,第一显示单元801,用于通过图形用户界面显示第一虚拟场景和位于第一虚拟场景中的虚拟对象,虚拟对象配置为响应针对图形用户界面的触控操作执行游戏行为;
第二显示单元802,用于响应于对目标虚拟资源的使用触发操作,通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器,其中,资源指示器用于可视化指示目标虚拟资源的投放位置;
移动单元803,用于响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动;
确定单元804,用于响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置;
投放单元805,用于根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置,并在第一虚拟场景的目标投放位置投放目标虚拟资源。
可选的,第二虚拟场景具有与第一虚拟场景相对应的场景布局。
可选的,第二虚拟场景中包含第二场景元素,第二场景元素用于表征至少部分第一虚拟场景中的第一场景元素,第二场景元素在第二虚拟场景中的位置用于表征第一场景元素在第一虚拟场景中的位置。
可选的,装置还包括:
根据资源指示器在第二虚拟场景中的位置,确定第二虚拟场景在图形用户界面中的显示范围。
可选的,第二显示单元802还用于:
根据使用触发操作发生时虚拟对象在第一虚拟场景中的位置,确定第二虚拟场景在图形用户界面中的初始显示范围。
可选的,第二显示单元802还用于:
根据使用触发操作发生时虚拟对象在第一虚拟场景中的位置和/或朝向,确定资源指示器在第二虚拟场景中的初始位置。
可选的,第二显示单元802还用于:
在图形用户界面隐藏第一虚拟场景,触发在图形用户界面显示第二虚拟场景。
可选的,第二虚拟场景为隐藏预设虚拟对象后的第一虚拟场景,预设虚拟对象包括玩家虚拟角色、非玩家虚拟角色和/或虚拟道具对象中的一种或多种。
可选的,第二显示单元802还用于:
在显示第一虚拟场景的图形用户界面中确定第二显示区域,第二显示区域的区域范围小于图形用户界面对应的区域范围;
通过第二显示区域显示第二虚拟场景。
可选的,显示第二虚拟场景的图形用户界面包括用于控制资源指示器在第二虚拟场景中移动的移动控件,移动控件包括水平移动控件和竖直移动控件,移动单元803还用于:
响应于对水平移动控件的触控操作,控制资源指示器在第二虚拟场景的水平方向上移动;
响应于对竖直移动控件的触控操作,控制资源指示器在第二虚拟场景的竖直方向上移动。
可选的,移动操作包括拖拽操作,移动单元803还用于:
响应于在第二虚拟场景中对资源指示器的拖拽操作,控制资源指示器在第二虚拟场景中移动。
可选的,移动单元803还用于:
显示过渡片段,过渡片段包括资源指示器移动时变换的第二虚拟场景。
可选的,资源指示器包括参照投放点和模拟使用形状,模拟使用形状用于模拟目标虚拟资源在第一虚拟场景的不同位置时,基于参照投放点使用后的渲染形状,确定单元804还用于:
在资源指示器的移动过程中,获取第二虚拟场景的空间布局;
根据第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围;
响应于对资源指示器的位置确认指令,获取模拟使用形状的参照渲染范围,以及参照投放点在第二虚拟场景中的高度和地面投影坐标;
根据高度、地面投影坐标,确定参照投放点在第二虚拟场景中的位置;
根据参照投放点在第二虚拟场景中的位置和模拟使用形状的参照渲染范围,确定资源指示器在第二虚拟场景中的参照投放位置。
可选的,确定单元804还用于:
响应于目标虚拟资源的个数设置操作,确定目标虚拟资源的投放数量;
根据目标虚拟资源的投放数量和第二虚拟场景的空间布局,确定资源指示器的模拟使用形状的渲染范围。
可选的,目标虚拟资源包括目标投放点和目标使用形状,目标使用形状包括在目标投放点使用目标虚拟资源后的渲染形状,投放单元805还用于:
获取第一虚拟场景和第二虚拟场景的空间位置对应关系;
根据参照投放位置和空间位置对应关系,在第一虚拟场景中确定参照投放点对应的位置,为目标投放点的目标投放点;
根据参照投放位置和空间位置对应关系,在第一虚拟场景中确定模拟使用形状对应的渲染范围,为目标使用形状的目标渲染范围;
根据目标投放点和目标渲染范围,在第一虚拟场景中确定目标虚拟资源的目标投放位置。
可选的,装置还用于:
响应于对目标虚拟资源的使用触发操作,显示道具取消区域;
响应于对道具取消区域的触发操作,显示第一虚拟场景。
可选的,显示第一虚拟场景的图形用户界面包括攻击控件,攻击控件用于指示虚拟对象在第一虚拟场景中发动攻击,确定单元804还用于:
将攻击控件转换为资源指示器的位置确定控件;
响应于对位置确定控件的触控操作,产生资源指示器的位置确认指令。
上述所有的技术方案,可以采用任意结合形成本申请的可选实施例,在此不再一一赘述。
本申请实施例提供的虚拟资源的投放控制装置,当玩家想要在第一虚拟场景中的目标投放位置投放目标虚拟资源时,可以做出目标虚拟资源的使用触发操作,然后在与第一虚拟场景具有相同空间布局的第二虚拟场景中移动资源指示器,直至移动到第二虚拟场景中与第一虚拟场景对应的参照投放位置,使得终端可以根据参照投放位置在第一虚拟场景中确定目标投放位置,直接在目标投放位置投放目标虚拟资源,使得玩家可以将目标虚拟资源精准地投放至游戏场景的目标投放位置,降低对玩家投放虚拟资源的技巧要求。
相应的,本申请实施例还提供一种计算机设备,该计算机设备可以为终端,该终端可以为智能手机、平板电脑、笔记本电脑、触控屏幕、游戏机、个人计算机、个人数字助理等终端设备。如图9所示,图9为本申请实施例提供的计算机设备的结构示意图。该计算机设备900包括有一个或者一个以上处理核心的处理器901、有一个或一个以上计算机可读存储介质的存储器902及存储在存储器902上并可在处理器上运行的计算机程序。其中,处理器901与存储器902电性连接。本领域技术人员可以理解,图中示出的计算机设备结构并不构成对计算机设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
处理器901是计算机设备900的控制中心,利用各种接口和线路连接整个计算机设备900的各个部分,通过运行或加载存储在存储器902内的软件程序和/或模块,以及调用存储在存储器902内的数据,执行计算机设备900的各种功能和处理数据,从而对计算机设备900进行整体监控。
在本申请实施例中,计算机设备900中的处理器901会按照如下的步骤,将一个或一个以上的应用程序的进程对应的指令加载到存储器902中,并由处理器901来运行存储在存储器902中的应用程序,从而实现各种功能:
通过图形用户界面显示第一虚拟场景和位于第一虚拟场景中的虚拟对象,虚拟对象配置为响应针对图形用户界面的触控操作执行游戏行为;响应于对目标虚拟资源的使用触发操作,通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器,其中,资源指示器用于可视化指示目标虚拟资源的投放位置;响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动;响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置;根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置,并在第一虚拟场景的目标投放位置投放目标虚拟资源。
以上各个操作的具体实施可参见前面的实施例,在此不再赘述。
可选的,如图9所示,计算机设备900还包括:触控显示屏903、射频电路904、音频电路905、输入单元906以及电源907。其中,处理器901分别与触控显示屏903、射频电路904、音频电路905、输入单元906以及电源907电性连接。本领域技术人员可以理解,图9中示出的计算机设备结构并不构成对计算机设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
触控显示屏903可用于显示图形用户界面以及接收用户作用于图形用户界面产生的操作指令。触控显示屏903可以包括显示面板和触控面板。其中,显示面板可用于显示由用户输入的信息或提供给用户的信息以及计算机设备的各种图形用户接口,这些图形用户接口可以由图形、文本、图标、视频和其任意组合来构成。可选的,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板。触控面板可用于收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板上或在触控面板附近的操作),并生成相应的操作指令,且操作指令执行对应程序。可选的,触控面板可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器901,并能接收处理器901发来的命令并加以执行。触控面板可覆盖显示面板,当触控面板检测到在其上或附近的触摸操作后,传送给处理器901以确定触摸事件的类型,随后处理器901根据触摸事件的类型在显示面板上提供相应的视觉输出。在本申请实施例中,可以将触控面板与显示面板集成到触控显示屏903而实现输入和输出功能。但是在某些实施例中,触控面板与触控面板可以作为两个独立的部件来实现输入和输出功能。即触控显示屏903也可以作为输入单元906的一部分实现输入功能。
射频电路904可用于收发射频信号,以通过无线通信与网络设备或其他计算机设备建立无线通讯,与网络设备或其他计算机设备之间收发信号。
音频电路905可以用于通过扬声器、传声器提供用户与计算机设备之间的音频接口。音频电路905可将接收到的音频数据转换后的电信号,传输到扬声器,由扬声器转换为声音信号输出;另一方面,传声器将收集的声音信号转换为电信号,由音频电路905接收后转换为音频数据,再将音频数据输出处理器901处理后,经射频电路904以发送给比如另一计算机设备,或者将音频数据输出至存储器902以便进一步处理。音频电路905还可能包括耳塞插孔,以提供外设耳机与计算机设备的通信。
输入单元906可用于接收输入的数字、字符信息或用户特征信息(例如指纹、虹膜、面部信息等),以及产生与用户设置以及功能控制有关的键盘、鼠标、操作杆、光学或者轨迹球信号输入。
电源907用于给计算机设备900的各个部件供电。可选的,电源907可以通过电源管理系统与处理器901逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。电源907还可以包括一个或一个以上的直流或交流电源、再充电系统、电源故障检测电路、电源转换器或者逆变器、电源状态指示器等任意组件。
尽管图9中未示出,计算机设备900还可以包括摄像头、传感器、无线保真模块、蓝牙模块等,在此不再赘述。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
由上可知,本实施例提供的计算机设备,当玩家想要在第一虚拟场景中的目标投放位置投放目标虚拟资源时,可以做出目标虚拟资源的使用触发操作,然后在与第一虚拟场景具有相同空间布局的第二虚拟场景中移动资源指示器,直至移动到第二虚拟场景中与第一虚拟场景对应的参照投放位置,使得终端可以根据参照投放位置在第一虚拟场景中确定目标投放位置,直接在目标投放位置投放目标虚拟资源,使得玩家可以将目标虚拟资源精准地投放至游戏场景的目标投放位置,降低对玩家投放虚拟资源的技巧要求。
本领域普通技术人员可以理解,上述实施例的各种方法中的全部或部分步骤可以通过指令来完成,或通过指令控制相关的硬件来完成,该指令可以存储于一计算机可读存储介质中,并由处理器进行加载和执行。
为此,本申请实施例提供一种计算机可读存储介质,其中存储有多条计算机程序,该计算机程序能够被处理器进行加载,以执行本申请实施例所提供的任一种虚拟资源的投放控制方法中的步骤。例如,该计算机程序可以执行如下步骤:
通过图形用户界面显示第一虚拟场景和位于第一虚拟场景中的虚拟对象,虚拟对象配置为响应针对图形用户界面的触控操作执行游戏行为;响应于对目标虚拟资源的使用触发操作,通过图形用户界面显示第二虚拟场景和位于第二虚拟场景中的资源指示器,其中,资源指示器用于可视化指示目标虚拟资源的投放位置;响应于对资源指示器的移动操作,控制资源指示器在第二虚拟场景中移动;响应于对资源指示器的位置确认指令,确定资源指示器在第二虚拟场景中的参照投放位置;根据参照投放位置在第一虚拟场景中确定目标虚拟资源的目标投放位置,并在第一虚拟场景的目标投放位置投放目标虚拟资源。
以上各个操作的具体实施可参见前面的实施例,在此不再赘述。
其中,该存储介质可以包括:只读存储器(Read Only Memory,ROM)、随机存取记忆体(Random Access Memory,RAM)、磁盘或光盘等。
由于该存储介质中所存储的计算机程序,可以执行本申请实施例所提供的任一种虚拟资源的投放控制方法中的步骤,因此,可以实现本申请实施例所提供的任一种虚拟资源的投放控制方法所能实现的有益效果,详见前面的实施例,在此不再赘述。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
以上对本申请实施例所提供的一种虚拟资源的投放控制方法、装置、计算机设备及存储介质,进行了详细介绍,本文中应用了具体个例对本发明的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本发明的技术方案及其核心思想;本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例的技术方案的范围。

Claims (20)

  1. 一种虚拟资源的投放控制方法,其中,包括:
    通过图形用户界面显示第一虚拟场景和位于所述第一虚拟场景中的虚拟对象,所述虚拟对象配置为响应针对所述图形用户界面的触控操作执行游戏行为;
    响应于对目标虚拟资源的使用触发操作,通过所述图形用户界面显示第二虚拟场景和位于所述第二虚拟场景中的资源指示器,其中,所述资源指示器用于可视化指示所述目标虚拟资源的投放位置;
    响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动;
    响应于对所述资源指示器的位置确认指令,确定所述资源指示器在所述第二虚拟场景中的参照投放位置;
    根据所述参照投放位置在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置,并在所述第一虚拟场景的目标投放位置投放所述目标虚拟资源。
  2. 根据权利要求1所述的方法,其中,所述第二虚拟场景具有与所述第一虚拟场景相对应的场景布局。
  3. 根据权利要求1所述的方法,其中,所述第二虚拟场景中包含第二场景元素,所述第二场景元素用于表征至少部分所述第一虚拟场景中的第一场景元素,所述第二场景元素在所述第二虚拟场景中的位置用于表征所述第一场景元素在所述第一虚拟场景中的位置。
  4. 根据权利要求1所述的方法,其中,还包括:
    根据所述资源指示器在所述第二虚拟场景中的位置,确定所述第二虚拟场景在所述图形用户界面中的显示范围。
  5. 根据权利要求1所述的方法,其中,所述通过所述图形用户界面显示第二虚拟场景和位于所述第二虚拟场景中的资源指示器之前,还包括:
    根据所述使用触发操作发生时所述虚拟对象在所述第一虚拟场景中的位置和/或朝向,确定所述第二虚拟场景在所述图形用户界面中的初始显示范围。
  6. 根据权利要求1所述的方法,其中,所述通过所述图形用户界面显示第二虚拟场景和位于所述第二虚拟场景中的资源指示器之前,还包括:
    根据所述使用触发操作发生时所述虚拟对象在所述第一虚拟场景中的位置和/或朝向,确定所述资源指示器在所述第二虚拟场景中的初始位置。
  7. 根据权利要求1所述的方法,其中,所述通过所述图形用户界面显示第二虚拟场景,包括:
    在所述图形用户界面隐藏所述第一虚拟场景,触发在所述图形用户界面显示第二虚拟场景。
  8. 根据权利要求1所述的方法,其中,所述第二虚拟场景为隐藏预设虚拟对象后的第一虚拟场景,所述预设虚拟对象包括玩家虚拟角色、非玩家虚拟角色和/或虚拟道具对象中的一种或多种。
  9. 根据权利要求1所述的方法,其中,所述通过所述图形用户界面显示第二虚拟场景,包括:
    在显示所述第一虚拟场景的所述图形用户界面中确定第二显示区域,所述第二显示区域的区域范围小于所述图形用户界面对应的区域范围;
    通过所述第二显示区域显示所述第二虚拟场景。
  10. 根据权利要求1所述的方法,其中,所述显示所述第二虚拟场景的图形用户界面包括用于控制所述资源指示器在所述第二虚拟场景中移动的移动控件,所述移动控件包括水平移动控件和竖直移动控件,所述响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动,包括:
    响应于对所述水平移动控件的触控操作,控制所述资源指示器在所述第二虚拟场景的水平方向上移动;
    响应于对所述竖直移动控件的触控操作,控制所述资源指示器在所述第二虚拟场景的竖直方向上移动。
  11. 根据权利要求1所述的方法,其中,所述移动操作包括拖拽操作,所述响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动,包括:
    响应于在所述第二虚拟场景中对所述资源指示器的所述拖拽操作,控制所述资源指示器在所述第二虚拟场景中移动。
  12. 根据权利要求1所述的方法,其中,所述响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动之后,还包括:
    显示过渡片段,所述过渡片段包括所述资源指示器移动时变换的第二虚拟场景。
  13. 根据权利要求1所述的方法,其中,所述资源指示器包括参照投放点和模拟使用形状,所述模拟使用形状用于模拟所述目标虚拟资源在所述第一虚拟场景的不同位置时,基于所述参照投放点使用后的渲染形状,所述响应于对所述资源指示器的位置确认指令,确定所述资源指示器在所述第二虚拟场景中的参照投放位置,包括:
    在所述资源指示器的移动过程中,获取所述第二虚拟场景的空间布局;
    根据所述第二虚拟场景的空间布局,确定所述资源指示器的模拟使用形状的渲染范围;
    响应于对所述资源指示器的位置确认指令,获取所述模拟使用形状的参照渲染范围,以及所述参照投放点在所述第二虚拟场景中的高度和地面投影坐标;
    根据所述高度、所述地面投影坐标,确定所述参照投放点在所述第二虚拟场景中的位置;
    根据所述参照投放点在所述第二虚拟场景中的位置和所述模拟使用形状的参照渲染范围,确定所述资源指示器在所述第二虚拟场景中的参照投放位置。
  14. 根据权利要求13所述的方法,其中,所述根据所述第二虚拟场景的空间布局,确定所述资源指示器的模拟使用形状的形状之前,还包括:
    响应于所述目标虚拟资源的个数设置操作,确定所述目标虚拟资源的投放数量;
    所述根据所述第二虚拟场景的空间布局,确定所述资源指示器的模拟使用形状的渲染范围,包括:
    根据所述目标虚拟资源的投放数量和所述第二虚拟场景的空间布局,确定所述资源指示器的模拟使用形状的渲染范围。
  15. 根据权利要求13所述的方法,其中,所述目标虚拟资源包括目标投放点和目标使用形状,所述目标使用形状包括在所述目标投放点使用所述目标虚拟资源后的渲染形状,所述根据所述参照投放位置在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置,包括:
    获取所述第一虚拟场景和所述第二虚拟场景的空间位置对应关系;
    根据所述参照投放位置和所述空间位置对应关系,在所述第一虚拟场景中确定所述参照投放点对应的位置,为所述目标投放点的目标投放点;
    根据所述参照投放位置和所述空间位置对应关系,在所述第一虚拟场景中确定所述模拟使用形状对应的渲染范围,为所述目标使用形状的目标渲染范围;
    根据所述目标投放点和所述目标渲染范围,在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置。
  16. 根据权利要求1所述的方法,其中,所述方法还包括:
    响应于对所述目标虚拟资源的使用触发操作,显示道具取消区域;
    响应于对所述道具取消区域的触发操作,显示所述第一虚拟场景。
  17. 根据权利要求1所述的方法,其中,所述显示第一虚拟场景的图形用户界面包括攻击控件,所述攻击控件用于指示所述虚拟对象在所述第一虚拟场景中发动攻击,所述响应于对所述资源指示器的位置确认指令,确定所述资源指示器在所述第二虚拟场景中的参照投放位置之前,还包括:
    将所述攻击控件转换为所述资源指示器的位置确定控件;
    响应于对所述位置确定控件的触控操作,产生所述资源指示器的位置确认指令。
  18. 一种虚拟资源的投放控制装置,其中,包括:
    第一显示单元,用于通过图形用户界面显示第一虚拟场景和位于所述第一虚拟场景中的虚拟对象,所述虚拟对象配置为响应针对所述图形用户界面的触控操作执行游戏行为;
    第二显示单元,用于响应于对目标虚拟资源的使用触发操作,通过所述图形用户界面显示第二虚拟场景和位于所述第二虚拟场景中的资源指示器,其中,所述资源指示器用于可视化指示所述目标虚拟资源的投放位置;
    移动单元,用于响应于对所述资源指示器的移动操作,控制所述资源指示器在所述第二虚拟场景中移动;
    确定单元,用于响应于对所述资源指示器的位置确认指令,确定所述资源指示器在所述第二虚拟场景中的参照投放位置;
    投放单元,用于根据所述参照投放位置在所述第一虚拟场景中确定所述目标虚拟资源的目标投放位置,并在所述第一虚拟场景的目标投放位置投放所述目标虚拟资源。
  19. 一种计算机设备,其特征在于,包括:
    存储器,用于存储计算机程序;
    处理器,用于在执行所述计算机程序时实现如权利要求1至17任一项所述虚拟资源的投放控制方法中的步骤。
  20. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现如权利要求1至17任一项所述虚拟资源的投放控制方法中的步骤。
PCT/CN2022/082121 2021-07-30 2022-03-21 虚拟资源的投放控制方法、装置、计算机设备及存储介质 WO2023005234A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023552232A JP2024507595A (ja) 2021-07-30 2022-03-21 仮想リソースの投入制御方法、装置、コンピュータ機器及び記憶媒体

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110872438.9 2021-07-29
CN202110872438.9A CN113546422A (zh) 2021-07-30 2021-07-30 虚拟资源的投放控制方法、装置、计算机设备及存储介质

Publications (1)

Publication Number Publication Date
WO2023005234A1 true WO2023005234A1 (zh) 2023-02-02

Family

ID=78133390

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/082121 WO2023005234A1 (zh) 2021-07-30 2022-03-21 虚拟资源的投放控制方法、装置、计算机设备及存储介质

Country Status (3)

Country Link
JP (1) JP2024507595A (zh)
CN (1) CN113546422A (zh)
WO (1) WO2023005234A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113546422A (zh) * 2021-07-30 2021-10-26 网易(杭州)网络有限公司 虚拟资源的投放控制方法、装置、计算机设备及存储介质
CN114415907B (zh) * 2022-01-21 2023-08-18 腾讯科技(深圳)有限公司 媒体资源显示方法、装置、设备及存储介质
CN116688502A (zh) * 2022-02-25 2023-09-05 腾讯科技(深圳)有限公司 虚拟场景中的位置标记方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180345148A1 (en) * 2017-06-05 2018-12-06 Nintendo Co., Ltd. Storage medium, game apparatus, game system and game control method
CN112870715A (zh) * 2021-01-22 2021-06-01 腾讯科技(深圳)有限公司 虚拟道具的投放方法、装置、终端及存储介质
CN113041622A (zh) * 2021-04-23 2021-06-29 腾讯科技(深圳)有限公司 虚拟环境中虚拟投掷物的投放方法、终端及存储介质
CN113082712A (zh) * 2021-03-30 2021-07-09 网易(杭州)网络有限公司 虚拟角色的控制方法、装置、计算机设备和存储介质
CN113546422A (zh) * 2021-07-30 2021-10-26 网易(杭州)网络有限公司 虚拟资源的投放控制方法、装置、计算机设备及存储介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5106894B2 (ja) * 2007-03-22 2012-12-26 株式会社バンダイナムコゲームス プログラム、情報記憶媒体及びゲーム装置
WO2018103515A1 (zh) * 2016-12-06 2018-06-14 腾讯科技(深圳)有限公司 一种应用中虚拟资源对象的插入方法以及终端
CN108434734B (zh) * 2018-01-30 2020-09-08 网易(杭州)网络有限公司 游戏场景中虚拟资源处理方法、装置、终端和存储介质
CN109876438B (zh) * 2019-02-20 2021-06-18 腾讯科技(深圳)有限公司 用户界面显示方法、装置、设备及存储介质
CN111249731A (zh) * 2020-01-17 2020-06-09 腾讯科技(深圳)有限公司 虚拟道具控制方法、装置、存储介质及电子装置
CN111773718A (zh) * 2020-07-10 2020-10-16 网易(杭州)网络有限公司 游戏行为的处理方法、装置、存储介质和电子装置
CN111773721A (zh) * 2020-08-10 2020-10-16 网易(杭州)网络有限公司 游戏中的画面显示方法及装置、电子设备、存储介质
CN111803937A (zh) * 2020-08-25 2020-10-23 网易(杭州)网络有限公司 游戏中的信息处理方法及装置、电子设备、存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180345148A1 (en) * 2017-06-05 2018-12-06 Nintendo Co., Ltd. Storage medium, game apparatus, game system and game control method
CN112870715A (zh) * 2021-01-22 2021-06-01 腾讯科技(深圳)有限公司 虚拟道具的投放方法、装置、终端及存储介质
CN113082712A (zh) * 2021-03-30 2021-07-09 网易(杭州)网络有限公司 虚拟角色的控制方法、装置、计算机设备和存储介质
CN113041622A (zh) * 2021-04-23 2021-06-29 腾讯科技(深圳)有限公司 虚拟环境中虚拟投掷物的投放方法、终端及存储介质
CN113546422A (zh) * 2021-07-30 2021-10-26 网易(杭州)网络有限公司 虚拟资源的投放控制方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
JP2024507595A (ja) 2024-02-20
CN113546422A (zh) 2021-10-26

Similar Documents

Publication Publication Date Title
WO2023005234A1 (zh) 虚拟资源的投放控制方法、装置、计算机设备及存储介质
CN113101652A (zh) 信息展示方法、装置、计算机设备及存储介质
CN113082712A (zh) 虚拟角色的控制方法、装置、计算机设备和存储介质
CN113398590A (zh) 声音处理方法、装置、计算机设备及存储介质
CN113082709A (zh) 游戏中信息提示方法、装置、存储介质及计算机设备
CN113426124A (zh) 游戏中的显示控制方法、装置、存储介质及计算机设备
WO2023240925A1 (zh) 虚拟道具的拾取方法、装置、计算机设备和存储介质
WO2022142622A1 (zh) 虚拟对象互动模式的选择方法、装置、设备、介质及产品
CN113398566A (zh) 游戏的显示控制方法、装置、存储介质及计算机设备
CN111589102B (zh) 辅助工具检测方法、装置、设备及存储介质
WO2024011894A1 (zh) 虚拟对象的控制方法、装置、存储介质及计算机设备
WO2024031942A1 (zh) 游戏道具的控制方法、装置、计算机设备及存储介质
CN115999153A (zh) 虚拟角色的控制方法、装置、存储介质及终端设备
CN113332721B (zh) 一种游戏控制方法、装置、计算机设备及存储介质
CN115645912A (zh) 游戏元素的显示方法、装置、计算机设备及存储介质
CN114225412A (zh) 信息处理方法、装置、计算机设备及存储介质
CN114159789A (zh) 游戏交互方法、装置、计算机设备及存储介质
CN116139483A (zh) 游戏功能控制方法、装置、存储介质及计算机设备
US20240131434A1 (en) Method and apparatus for controlling put of virtual resource, computer device, and storage medium
CN113332724A (zh) 虚拟角色的控制方法、装置、终端和存储介质
CN113398564B (zh) 虚拟角色控制方法、装置、存储介质及计算机设备
WO2024098984A1 (zh) 一种虚拟道具的控制方法、装置、设备以及存储介质
CN113546424A (zh) 虚拟资源的使用控制方法、装置、计算机设备及存储介质
CN116850594A (zh) 游戏交互方法、装置、计算机设备及计算机可读存储介质
CN117643723A (zh) 游戏交互方法、装置、计算机设备及计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22847854

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023552232

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 18548226

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE