WO2020143148A1 - Display control method and apparatus in a game, storage medium, processor, and terminal - Google Patents

Display control method and apparatus in a game, storage medium, processor, and terminal

Info

Publication number
WO2020143148A1
WO2020143148A1 (PCT/CN2019/086464; CN2019086464W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch operation
display area
game
scene
focus position
Prior art date
Application number
PCT/CN2019/086464
Other languages
English (en)
French (fr)
Inventor
邵堃
晋铮
Original Assignee
网易(杭州)网络有限公司 (NetEase (Hangzhou) Network Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 网易(杭州)网络有限公司 (NetEase (Hangzhou) Network Co., Ltd.)
Priority to US16/632,547 (granted as US11446565B2)
Publication of WO2020143148A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements where the surface is also a display device, e.g. touch screens
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/44 Processing input control signals involving timing of operations, e.g. performing an action within a time slot
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5258 Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92 Video game devices specially adapted to be hand-held while playing

Definitions

  • the present disclosure relates to the field of computers, and in particular, to a display control method, device, storage medium, processor, and terminal in a game.
  • Mainstream multiplayer online battle arena (MOBA) mobile games usually observe the game scene through a fixed lens combined with lens dragging.
  • A fixed lens means that the lens center is anchored to the game character model by default, and the lens height is fixed by default.
  • Game players can move the camera by dragging on a specific area of the screen, and the drag speed of the camera varies across different types of games.
  • In some games, the maximum drag distance can reach 1/4 of the maximum length of the battlefield.
  • In other games, the longest drag distance can cover almost the entire battlefield.
  • When the game player clicks on the small map in the user interface, the camera is immediately moved to the clicked position; and if the game player continues with a drag operation after the click, the camera follows the movement until the game player releases the finger, at which point the camera automatically returns to its original position.
  • At least some embodiments of the present disclosure provide a display control method, device, storage medium, processor, and terminal in a game, so as to at least solve the technical problem that the in-game virtual lens adjustment methods provided in the related art have a single operation mode, lack good adaptability and scalability, and cannot meet the game experience requirements of game players at different levels.
  • In one embodiment, a display control method in a game is provided, in which a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering it on a touch display of the mobile terminal. The game scene includes a first virtual character and a scene display area, the scene display area is at least part of the game scene, and the content displayed by the graphical user interface includes the scene display area.
  • the method includes:
  • determining the focus position in the release direction includes: when the release track of the skill passes through the scene display area, determining the focus position in the release track in the release direction.
  • the above method further includes: prohibiting the response to the first touch operation.
  • detecting that the first touch operation and the second touch operation overlap at least partially in time sequence includes: detecting the second touch operation while the first touch operation is updating the scene display area in the game scene; or detecting the first touch operation while the second touch operation is adjusting the release direction of the skill.
  • a virtual camera corresponding to the first virtual character is set in the game scene, and the scene display area in the game scene is an area photographed by the virtual camera.
  • updating the scene display area according to the first touch operation includes: controlling the movement of the virtual camera according to the movement track of the touch point of the first touch operation; and updating the scene display area according to the movement of the virtual camera.
  • updating the scene display area according to the focus position includes: detecting the operation state of the touch point of the second touch operation; when it is determined from the operation state that the stationary duration of the touch point exceeds a preset threshold, controlling the virtual camera to move from the current position to the focus position at a preset speed; and updating the scene display area according to the movement of the virtual camera.
  • updating the scene display area according to the focus position includes: adjusting the release direction according to the second touch operation; when the skill release track in the release direction passes through a specific area in the scene display area, determining the focus position according to the skill release track; The focus position updates the scene display area.
  • updating the scene display area according to the focus position includes: acquiring a specific event within a preset range of the focus position or a specific position of the second virtual character; adjusting the update of the scene display area in the game scene according to the specific position and the focus position.
  • the update of adjusting the scene display area in the game scene according to the specific position and focus position includes: adjusting the scene display area in the game scene with preset sensitivity according to the specific position and focus position.
  • In another embodiment, a display control device in a game is provided, in which a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering it on a touch display of the mobile terminal, and the game scene of the game contains a first virtual character and a scene display area.
  • the scene display area is at least part of the game scene.
  • the content displayed by the graphical user interface includes the scene display area.
  • the device includes: a first detection component, configured to detect a first touch operation acting on a preset area of the graphical user interface; a first update component, configured to update the scene display area according to the first touch operation; a second detection component, configured to detect a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character; a control component, configured to control the release direction of the skill according to the second touch operation; and a second update component, configured to, when it is detected that the first touch operation and the second touch operation overlap at least partially in time sequence, determine a focus position in the release direction and update the scene display area in the game scene according to the focus position in the release direction.
  • the second update component is configured to determine the focus position in the release trajectory in the release direction when the release trajectory of the skill passes the scene display area.
  • the above device further includes: a processing component configured to prohibit response to the first touch operation.
  • the above device further includes: a third detection component, configured to detect the second touch operation during the process of updating the scene display area in the game scene by the first touch operation, or to detect the first touch operation during the process of adjusting the release direction of the skill by the second touch operation.
  • a virtual camera corresponding to the first virtual character is set in the game scene, and the scene display area in the game scene is an area photographed by the virtual camera.
  • the first update component includes: a first control element, configured to control the movement of the virtual camera according to the movement track of the touch point of the first touch operation; and a first update element, configured to update the scene display area according to the movement of the virtual camera.
  • the second update component includes: a detection element, configured to detect the operation state of the touch point of the second touch operation; a second control element, configured to, when it is determined from the operation state that the stationary duration of the touch point exceeds a preset threshold, control the virtual camera to move from the current position to the focus position at a preset speed; and a second update element, configured to update the scene display area according to the movement of the virtual camera.
  • the second update component includes: a first adjustment element, configured to adjust the release direction according to the second touch operation; a determination element, configured to determine the focus position according to the skill release trajectory when the skill release trajectory in the release direction passes through a specific area in the scene display area; and a third update element, configured to update the scene display area according to the focus position.
  • the second update component includes: an acquisition element, configured to acquire a specific event within a preset range of the focus position or a specific position of a second virtual character; and a fourth update element, configured to adjust the update of the scene display area in the game scene according to the specific position and the focus position.
  • the fourth update element is configured to adjust the scene display area in the game scene with preset sensitivity according to the specific position and the focus position.
  • the storage medium includes a stored program, wherein, when the program runs, a device where the storage medium is located is controlled to perform any one of the above display control methods in a game.
  • a processor is provided for running a program, wherein, when the program runs, any one of the above display control methods in a game is executed.
  • a terminal is provided, including: one or more processors, a memory, a display device, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are used to execute any one of the above display control methods in a game.
  • Through the embodiments of the present disclosure, a first touch operation acting on a preset area of the graphical user interface is detected and the scene display area is updated accordingly; a second touch operation acting on a skill control of the graphical user interface is detected and the release direction of the skill is controlled accordingly; and when it is detected that the first touch operation and the second touch operation at least partially overlap in time sequence, a focus position is determined in the release direction and the scene display area in the game scene is updated according to that focus position.
  • In this way, the purpose of assisting the release of long-distance skills is achieved, an intelligent scene display area adjustment method is realized, and the adjustment of the scene display area becomes more flexible and intelligent. This further solves the technical problem that the in-game virtual lens adjustment methods provided in the related art have a single operation mode, lack good adaptability and scalability, and cannot meet the game experience requirements of game players at different levels.
  • FIG. 1 is a flowchart of a display control method in a game according to one embodiment of the present disclosure
  • FIG. 2 is a flowchart of a display control method in a game according to one of the alternative embodiments of the present disclosure
  • FIG. 3 is a structural block diagram of a display control device in a game according to one embodiment of the present disclosure
  • FIG. 4 is a structural block diagram of a display control device in a game according to one of the alternative embodiments of the present disclosure.
  • In one embodiment of the present disclosure, a display control method in a game is provided. It should be noted that the steps shown in the flowcharts of the drawings may be executed in a computer system, such as one running a set of computer-executable instructions, and although a logical order is shown in each flowchart, in some cases the steps shown or described may be performed in an order different from the one here.
  • the method embodiment can be executed in a mobile terminal, a computer terminal or a similar computing device.
  • the mobile terminal may include one or more processors (the processors may include, but are not limited to, a processing device such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), or a field-programmable gate array (FPGA)) and a memory for storing data.
  • the above mobile terminal may further include a transmission device for communication functions and an input/output device.
  • the mobile terminal may further include more or fewer components than the above structural description, or have a configuration different from the above structural description.
  • the memory may be used to store computer programs, for example, software programs and components of application software, such as the computer programs corresponding to the display control method in a game in the embodiments of the present disclosure. The processor runs the computer programs stored in the memory so as to execute various functional applications and data processing, that is, to realize the display control method in a game described above.
  • the memory may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory may further include memories remotely provided with respect to the processor, and these remote memories may be connected to the mobile terminal through a network. Examples of the above network include but are not limited to the Internet, intranet, local area network, mobile communication network, and combinations thereof.
  • the transmission device is used to receive or send data via a network.
  • the specific example of the network described above may include a wireless network provided by a communication provider of a mobile terminal.
  • the transmission device includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices through a base station so as to communicate with the Internet.
  • the transmission device may be a radio frequency (Radio Frequency, RF for short) component, which is used to communicate with the Internet in a wireless manner.
  • a display control method for a game running on the mobile terminal described above is provided.
  • a graphical user interface is obtained by executing a software application on a processor of the mobile terminal and rendering it on a touch display of the mobile terminal.
  • the game scene includes a first virtual character and a scene display area.
  • the scene display area is at least part of the game scene.
  • the content displayed by the graphical user interface includes the scene display area.
  • FIG. 1 is a flowchart of a display control method in a game according to one embodiment of the present disclosure. As shown in Figure 1, the method includes the following steps:
  • Step S10: detecting a first touch operation acting on a preset area of the graphical user interface;
  • Step S11: updating the scene display area according to the first touch operation;
  • Step S12: detecting a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character;
  • Step S13: controlling the release direction of the skill according to the second touch operation;
  • Step S14: when it is detected that the first touch operation and the second touch operation overlap at least partially in time sequence, determining a focus position in the release direction, and updating the scene display area in the game scene according to the focus position in the release direction.
  • The first touch operation may be performed before, after, or simultaneously with the second touch operation; in each of these cases, the technical solutions provided by the embodiments of the present disclosure can be applied.
  • The above-mentioned preset area may be either a small map (minimap) in the graphical user interface or a field-of-view adjustment area in the graphical user interface. In the following, the preset area is taken to be the small map as an example; the implementation process is also applicable to the field-of-view adjustment area.
  • the game player can adjust the release direction of the skill by pressing and holding (equivalent to the second touch operation) the skill control.
  • the game player can also click or slide (equivalent to the above-mentioned first touch operation) to update the scene display area on the small map.
  • A focus position (for example, the geometric center of the graphical user interface) is determined. While the game player adjusts the release direction through the second touch operation, the focus position changes accordingly, and the scene display area in the game scene is then updated according to the change of the focus position.
  • In the related art, the skill release direction and the skill release trajectory can only be displayed within the scene display area corresponding to the current field of view of the first virtual character; the part of the release trajectory beyond that scene display area is invisible, which may cause the skill to deviate from a specific event or an enemy virtual character and fail to hit the target.
  • In contrast, in the embodiments of the present disclosure, a focus position can be determined in the release direction pointed to by the skill indicator; as the skill indicator moves, the focus position follows it, so that the scene display area in the game scene is updated according to the focus position in the release direction. A specific event or an enemy virtual character can then be accurately located through the movement of the focus position, completing the precise release of an ultra-long-distance skill.
  • step S14 determining the focus position in the release direction may include the following execution steps:
  • Step S140 when the release trajectory of the skill passes through the scene display area, the focus position is determined in the release trajectory in the release direction.
  • the focus position can be determined in the release trajectory in the release direction.
  • the focus position may be any position in the release track within the scene display area.
  • the focus position is the geometric center position in the release track within the scene display area.
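The choice above (the focus position as the geometric center of the part of the release trajectory inside the scene display area) can be sketched as follows. This is an illustrative reconstruction, not code from the disclosure: the function names, the modelling of the trajectory as a ray, and the modelling of the display area as an axis-aligned rectangle are all assumptions.

```python
def clip_ray_to_rect(ox, oy, dx, dy, xmin, ymin, xmax, ymax):
    """Clip the ray (ox, oy) + t*(dx, dy), t >= 0, against an axis-aligned
    rectangle using the slab method; return (t_enter, t_exit) or None."""
    t0, t1 = 0.0, float("inf")
    for o, d, lo, hi in ((ox, dx, xmin, xmax), (oy, dy, ymin, ymax)):
        if abs(d) < 1e-12:
            if not (lo <= o <= hi):
                return None  # ray is parallel to this slab and outside it
        else:
            ta, tb = (lo - o) / d, (hi - o) / d
            t0, t1 = max(t0, min(ta, tb)), min(t1, max(ta, tb))
    return (t0, t1) if t0 <= t1 else None

def focus_position(ox, oy, dx, dy, area):
    """Geometric center of the portion of the release trajectory inside the
    scene display area, or None if the trajectory misses the area."""
    seg = clip_ray_to_rect(ox, oy, dx, dy, *area)
    if seg is None:
        return None
    tm = (seg[0] + seg[1]) / 2.0  # midpoint of the visible segment
    return (ox + dx * tm, oy + dy * tm)
```

For a character at the origin releasing a skill along the positive x axis inside a 20x20 display area, the focus position falls halfway along the visible part of the trajectory.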
  • step S14 when the release track of the skill passes through the scene display area, the above method may further include the following execution steps:
  • Step S15: responding to the first touch operation is prohibited.
  • That is, while the skill is being released, a touch operation performed by the game player in the small map will not be responded to and cannot update the scene display area; in other words, the touch operation in the minimap fails. Only when the game player cancels the skill release operation can the touch operation on the small map take effect again, that is, the scene display area is again updated by the first touch operation. After the scene display area has been updated through the first touch operation, if the game player cancels the touch operation in the minimap, the scene display area can be restored to the display state before the game player performed the first touch operation and the second touch operation.
  • step S14 detecting that the first touch operation and the second touch operation overlap at least partially in time sequence may be implemented in one of the following ways:
  • Manner 1: the second touch operation is detected during the process of updating the scene display area in the game scene by the first touch operation.
  • For example, to release a directional long-range attack skill, the game player usually first performs a click operation on the small map to find the attack target (for example, an enemy virtual character with low health), and then presses the skill control, so that the first touch operation and the second touch operation at least partially overlap in time sequence.
  • Manner 2: the first touch operation is detected during the process of adjusting the release direction of the skill by the second touch operation.
  • For example, the game player may first determine an approximate release direction by pressing the skill control based on experience or game proficiency, and then perform a click operation in the small map to confirm whether the release direction can target the enemy virtual character, so that the first touch operation and the second touch operation at least partially overlap in time sequence.
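In both manners, the condition of step S14 reduces to checking whether the time intervals of the two touch operations intersect. A minimal sketch, under the assumption that each touch operation is tracked as a start time and an end time (with `None` meaning the finger is still down):

```python
def overlaps_in_time(a_start, a_end, b_start, b_end):
    """True when the interval of touch operation A and the interval of touch
    operation B at least partially overlap in time sequence.

    An end time of None means the operation is still in progress, which is
    treated as an open-ended interval."""
    a_end = float("inf") if a_end is None else a_end
    b_end = float("inf") if b_end is None else b_end
    return a_start <= b_end and b_start <= a_end
```

This covers both orders (first operation before or after the second) as well as simultaneous presses, matching the earlier statement that the solution applies in every case.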
  • a virtual camera corresponding to the first virtual character is set in the game scene, and the scene display area in the game scene is an area photographed by the virtual camera.
  • The virtual camera can be fixed on the first virtual character controlled by the game player, moving as the virtual character moves and rotating as the virtual character rotates, which is similar to the subjective (first-person) perspective of the virtual character.
  • the scene display area in the game scene is the area photographed by the virtual camera.
  • The virtual camera can also be set at a position relative to the first virtual character controlled by the game player, for example, at a preset position above the first virtual character, following the virtual character as it moves, which is similar to a bystander (third-person) perspective of the virtual character.
  • The following optional embodiments are mainly described by taking a virtual camera fixed at a position relative to the virtual character controlled by the game player as an example; the implementation process is also applicable to a virtual camera fixed on the virtual character controlled by the game player.
  • updating the scene display area according to the first touch operation may include the following execution steps:
  • Step S111: controlling the movement of the virtual camera according to the movement track of the touch point of the first touch operation;
  • Step S112: updating the scene display area according to the movement of the virtual camera.
  • the movement trajectory is a collection of touch points generated by the first touch operation at consecutive times and at different positions.
  • The displacement change of the touch point can be determined according to the movement trajectory of the touch point, and the virtual camera is then controlled, through the above mapping relationship, to move at a preset speed; the sensitivity is determined according to the above mapping relationship, so that the update speed of the scene display area is adjusted according to the sensitivity.
  • The above displacement change may be calculated as the displacement change between two adjacent frames of images, or as the displacement change between the starting touch point and the ending touch point of the movement track.
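Steps S111 and S112 can be sketched as mapping per-frame touch displacements to camera displacements through a sensitivity factor. This is a hedged illustration: the function names and the linear mapping (displacement times sensitivity) are assumptions standing in for the "mapping relationship" described above.

```python
def camera_delta(prev_touch, cur_touch, sensitivity=2.0):
    """World-space camera displacement for one frame of dragging:
    touch displacement scaled by the sensitivity."""
    dx = cur_touch[0] - prev_touch[0]
    dy = cur_touch[1] - prev_touch[1]
    return (dx * sensitivity, dy * sensitivity)

def update_camera(camera_pos, touch_track, sensitivity=2.0):
    """Accumulate the displacement between each pair of adjacent touch
    points in the movement track (the per-adjacent-frame variant)."""
    x, y = camera_pos
    for prev, cur in zip(touch_track, touch_track[1:]):
        ddx, ddy = camera_delta(prev, cur, sensitivity)
        x, y = x + ddx, y + ddy
    return (x, y)
```

Because the mapping is linear, summing per-frame deltas gives the same result as using the displacement between the start and end touch points, which is why the text offers the two calculations as alternatives.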
  • step S14 updating the scene display area according to the focus position may include the following execution steps:
  • Step S141: detecting the operation state of the touch point of the second touch operation;
  • Step S142: when it is determined from the operation state that the stationary duration of the touch point exceeds a preset threshold, controlling the virtual camera to move from the current position to the focus position at a preset speed;
  • Step S143: updating the scene display area according to the movement of the virtual camera.
  • The mobile terminal can control the virtual camera to move from the current position (for example, the geometric center position of the scene display area) to the focus position at a preset speed, and as the camera approaches the focus position from the current position, the scene display area is automatically updated.
  • step S14 updating the scene display area according to the focus position may include the following execution steps:
  • Step S144: adjusting the release direction according to the second touch operation;
  • Step S145: when the skill release trajectory in the release direction passes through a specific area in the scene display area, determining the focus position according to the skill release trajectory;
  • Step S146: updating the scene display area according to the focus position.
  • the specific area may be any area in the entire visual field adjustment area in the graphical user interface. In an optional embodiment, the specific area is the center of the current scene display area.
  • the game player can adjust the release direction of the skill by tapping or dragging the skill control. If the skill release trajectory is displayed in the current scene display area, and the skill release trajectory passes the current position of the virtual camera (that is, the skill release trajectory passes through the center of the current scene display area), the focus position can be determined according to the skill release trajectory. Update the scene display area.
  • step S14 updating the scene display area according to the focus position may include the following execution steps:
  • Step S147: acquiring a specific event within a preset range of the focus position or a specific position of the second virtual character;
  • Step S148: adjusting the update of the scene display area in the game scene according to the specific position and the focus position.
  • the above-mentioned second virtual character may be an enemy virtual character.
  • The above-mentioned specific events may be events that have an important influence on the progress of the game, for example: team fights, chasing a low-health enemy, rescuing a teammate, or hunting important monsters. After an event is determined to be a specific event, a specific area may be generated around the location where the specific event occurs, which may be the smallest circle covering all virtual characters participating in the specific event.
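The covering circle around a specific event can be approximated cheaply. The sketch below centers the circle on the centroid of the participants and extends the radius to the farthest one; note this is a rough stand-in, not the true smallest enclosing circle (which would need e.g. Welzl's algorithm), and all names are assumptions.

```python
import math

def event_area(participants):
    """Approximate circle covering all virtual characters participating in a
    specific event: centroid center, radius to the farthest participant."""
    n = len(participants)
    cx = sum(p[0] for p in participants) / n
    cy = sum(p[1] for p in participants) / n
    r = max(math.hypot(p[0] - cx, p[1] - cy) for p in participants)
    return (cx, cy), r
```

For two participants the approximation happens to be exact; with more, the radius can be up to roughly twice the optimal one, which is usually acceptable for deciding whether the focus position is near the event.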
  • the scene display area in the game scene can be adjusted with preset sensitivity according to the specific position and focus position.
  • The update speed of the scene display area can be adjusted according to the first sensitivity, based on the mapping relationship between the displacement change of the focus position and the preset first sensitivity.
  • When a second virtual character appears within the preset range of the focus position, the focus position can be adjusted so that the release direction of the skill points to the specific position, thereby generating a displacement change of the focus position; the update speed of the scene display area is then adjusted according to the second sensitivity, that is, the update speed of the scene display area in the release direction in the game scene is accelerated.
  • the above displacement change can calculate the displacement change between two adjacent frames of images, and also can calculate the displacement change between the start position and the end position of the focus position during the movement.
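The mapping from focus-position displacement to scene-display-area update speed described above can be sketched as a simple linear scaling. The linear form and all names here are illustrative assumptions; the patent only specifies that a preset mapping relationship exists.

```python
def update_speed(prev_focus, curr_focus, sensitivity):
    """Map the per-frame displacement of the focus position to an update
    speed for the scene display area, scaled by a preset sensitivity
    (the first or second sensitivity from the text)."""
    dx = curr_focus[0] - prev_focus[0]
    dy = curr_focus[1] - prev_focus[1]
    displacement = (dx * dx + dy * dy) ** 0.5
    return sensitivity * displacement
```

Passing the second (larger) sensitivity in place of the first yields the accelerated update along the release direction.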
  • FIG. 2 is a flowchart of a display control method in a game according to an alternative embodiment of the present disclosure.
  • when a second virtual character exists within the preset range of the focus position, the focus position can be adjusted by pointing the release direction of the skill to the specific position where the second virtual character is located, thereby producing a displacement change of the focus position. Then, according to the mapping relationship between this displacement change and the above-mentioned second sensitivity, the update of the scene display area along the release direction in the game scene is accelerated.
  • this intelligent camera adjustment method provides different forms of camera assistance according to differences in players' operating behavior and in-game situations, so as to meet special field-of-view needs. It automates fine camera adjustments, reducing the player's operational burden, so that players obtain the game information they most need in the easiest way; this improves the efficiency of game-information transmission and gives players a smoother game experience.
  • players with limited operating skill and novice players, who cannot use camera operations proficiently or receive battlefield information well, can adapt to the game faster and master the camera controls. For particular special camera operations, an intelligent solution is provided, which reduces players' learning costs, lowers the game's overall operating threshold, and broadens the game's user base.
  • the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform; they can, of course, also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present disclosure, in essence or in the part that contributes over the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and includes several instructions that enable a terminal device (which may be a mobile phone, computer, server, or network device, etc.) to perform the methods described in the embodiments of the present disclosure.
  • a display control apparatus in a game is also provided.
  • the apparatus is used to implement the above embodiments and preferred implementations; what has already been described is not repeated.
  • the term "component" may be a combination of software and/or hardware that implements predetermined functions.
  • although the apparatuses described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and conceived.
  • FIG. 3 is a structural block diagram of a display control apparatus in a game according to an embodiment of the present disclosure.
  • a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering on a touch display of the mobile terminal;
  • the game scene of the game includes a first virtual character and a scene display area;
  • the scene display area is at least part of the game scene;
  • the content displayed by the graphical user interface includes the scene display area;
  • the apparatus includes: a first detection component 10, configured to detect a first touch operation acting on a preset area of the graphical user interface;
  • a first update component 20, configured to update the scene display area according to the first touch operation;
  • a second detection component 30, configured to detect a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character;
  • a control component 40, configured to control the release direction of the skill according to the second touch operation;
  • a second update component 50, configured to determine a focus position in the release direction when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, and to update the scene display area of the game scene according to the focus position in the release direction.
  • the second update component 50 is configured to determine the focus position on the release trajectory in the release direction when the release trajectory of the skill passes through the scene display area.
  • FIG. 4 is a structural block diagram of a display control apparatus in a game according to an alternative embodiment of the present disclosure. As shown in FIG. 4, in addition to all the components shown in FIG. 3, the apparatus further includes: a processing component 60, configured to prohibit a response to the first touch operation.
  • the above apparatus further includes: a third detection component 70, configured to detect the second touch operation while the first touch operation is updating the scene display area of the game scene, or to detect the first touch operation while the second touch operation is adjusting the release direction of the skill.
  • a virtual camera corresponding to the first virtual character is provided in the game scene, and the scene display area of the game scene is the area captured by the virtual camera.
  • the first update component 20 includes: a first control element (not shown in the figure), configured to control the movement of the virtual camera according to the movement trajectory of the touch point of the first touch operation; and a first update element (not shown in the figure), configured to update the scene display area according to the movement of the virtual camera.
  • the second update component 50 includes: a detection element (not shown in the figure), configured to detect the operating state of the touch point of the second touch operation; a second control element (not shown in the figure), configured to control the virtual camera to move from the current position to the focus position at a preset speed when it is determined from the operating state that the touch point has remained still for longer than a preset threshold; and a second update element (not shown in the figure), configured to update the scene display area according to the movement of the virtual camera.
  • the second update component 50 includes: a first adjustment element (not shown in the figure), configured to adjust the release direction according to the second touch operation; a determination element (not shown in the figure), configured to determine the focus position according to the skill release trajectory when the skill release trajectory in the release direction passes through a specific area of the scene display area; and a third update element (not shown in the figure), configured to update the scene display area according to the focus position.
  • the second update component 50 includes: an acquisition element (not shown in the figure), configured to acquire a specific event within a preset range of the focus position, or a specific position of the second virtual character; and a fourth update element (not shown in the figure), configured to adjust the updating of the scene display area of the game scene according to the specific position and the focus position.
  • the fourth update element (not shown in the figure) is configured to adjust the scene display area of the game scene at a preset sensitivity according to the specific position and the focus position.
  • the above components can be implemented by software or hardware. In the latter case they can be implemented in, but are not limited to, the following ways: the above components are all located in the same processor; or the above components are located in different processors in any combination.
  • An embodiment of the present disclosure also provides a storage medium in which a computer program is stored, where the computer program is configured to execute, when run, the steps in any one of the above method embodiments.
  • the above storage medium may be configured to store a computer program for performing the following steps:
  • the above storage medium may include, but is not limited to, various media that can store a computer program, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
  • An embodiment of the present disclosure also provides a processor configured to run a computer program to perform the steps in any one of the above method embodiments.
  • the foregoing processor may be configured to perform the following steps through a computer program:
  • the disclosed technical content may be implemented in other ways.
  • the apparatus embodiments described above are only schematic.
  • the division into elements may be a division by logical function.
  • in actual implementation there may be other divisions; for example, multiple elements or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, elements, or components, and may be electrical or in other forms.
  • the components described as separate may or may not be physically separate, and parts displayed as elements may or may not be physical elements; that is, they may be located in one place or distributed over multiple elements. Some or all of them may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each functional element in the embodiments of the present disclosure may be integrated into one processing element, or each element may exist alone physically, or two or more elements may be integrated into one element.
  • the above integrated element may be implemented in the form of hardware or in the form of a software functional element.
  • if the integrated element is implemented in the form of a software functional element and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • the technical solution of the present disclosure, in essence, in the part that contributes over the existing technology, or in whole or in part, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions that enable a computer device (which may be a personal computer, server, or network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present disclosure.
  • the aforementioned storage media include various media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), removable hard disk, magnetic disk, or optical disc.


Abstract

A display control method in a game, including: detecting a first touch operation acting on a preset area of a graphical user interface; updating a scene display area according to the first touch operation; detecting a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of a first virtual character; controlling a release direction of the skill according to the second touch operation; and, when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, determining a focus position in the release direction and updating the scene display area of the game scene according to the focus position in the release direction. A display control apparatus in a game, a storage medium, a processor, and a terminal are also provided. The method solves the technical problems that the in-game virtual camera adjustment offered by the related art supports only a single mode of operation, lacks adaptability and extensibility, and cannot satisfy the experience needs of players at different skill levels.

Description

Display control method and apparatus in game, storage medium, processor, and terminal
Cross-Reference
The present disclosure is based on, and claims priority to, Chinese patent application No. 201910024151.3, filed on 2019-01-10, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of computers, and in particular to a display control method and apparatus in a game, a storage medium, a processor, and a terminal.
Background
At present, mainstream multiplayer online battle arena (MOBA) mobile games usually observe the game scene through a basic combination of a fixed camera and camera dragging. In addition, some games build on this with targeted camera-interaction designs for a few specific hero types, or with supplementary designs for specific camera behavior; for example, a player may manually switch the camera between high and low modes.
A fixed camera means that the camera center is fixed on the game character model by default and the camera height defaults to a fixed value. The player can move the camera by pressing and dragging in a specific screen region, and the drag speed differs between games. In some MOBA mobile games the maximum drag distance can reach 1/4 of the battlefield's maximum length, while in others it can cover almost the entire battlefield.
In addition, when the player taps the mini-map in the user interface, the camera immediately moves to the tapped position; moreover, if the player performs a continuous drag after the tap, the camera follows the drag until the player lifts the finger, whereupon the camera automatically returns to its default position.
However, constrained by two-handed interaction, the vast majority of MOBA mobile games currently on the market use the fixed-camera-plus-drag solution to meet players' needs for battlefield vision and camera operation. As the above analysis shows, this combination of a fixed camera and active camera dragging satisfies only the most basic functional needs; it lacks adaptability and extensibility for special in-game situations and operations, and it cannot satisfy the experience needs of players at different skill levels.
No effective solution to the above problems has yet been proposed.
Summary
At least some embodiments of the present disclosure provide a display control method and apparatus in a game, a storage medium, a processor, and a terminal, so as at least to solve the technical problems that the in-game virtual camera adjustment offered by the related art supports only a single mode of operation, lacks adaptability and extensibility, and cannot satisfy the experience needs of players at different skill levels.
According to one embodiment of the present disclosure, a display control method in a game is provided. A graphical user interface is rendered on a touch display of a mobile terminal by executing a software application on a processor of the mobile terminal. A game scene of the game contains a first virtual character and a scene display area, the scene display area is at least part of the game scene, and the content displayed by the graphical user interface contains the scene display area. The method includes:
detecting a first touch operation acting on a preset area of the graphical user interface; updating the scene display area according to the first touch operation; detecting a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character; controlling a release direction of the skill according to the second touch operation; and, when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, determining a focus position in the release direction, and updating the scene display area of the game scene according to the focus position in the release direction.
Optionally, determining the focus position in the release direction includes: when a release trajectory of the skill passes through the scene display area, determining the focus position on the release trajectory in the release direction.
Optionally, when the release trajectory of the skill passes through the scene display area, the method further includes: prohibiting a response to the first touch operation.
Optionally, detecting that the first touch operation and the second touch operation at least partially coincide in time sequence includes: detecting the second touch operation while the first touch operation is updating the scene display area of the game scene; or detecting the first touch operation while the second touch operation is adjusting the release direction of the skill.
Optionally, a virtual camera corresponding to the first virtual character is provided in the game scene, and the scene display area of the game scene is the area captured by the virtual camera.
Optionally, updating the scene display area according to the first touch operation includes: controlling the virtual camera to move according to the movement trajectory of the touch point of the first touch operation; and updating the scene display area according to the movement of the virtual camera.
Optionally, updating the scene display area according to the focus position includes: detecting the operating state of the touch point of the second touch operation; when it is determined from the operating state that the touch point has remained still for longer than a preset threshold, controlling the virtual camera to move from the current position to the focus position at a preset speed; and updating the scene display area according to the movement of the virtual camera.
Optionally, updating the scene display area according to the focus position includes: adjusting the release direction according to the second touch operation; when the skill release trajectory in the release direction passes through a specific area of the scene display area, determining the focus position according to the skill release trajectory; and updating the scene display area according to the focus position.
Optionally, updating the scene display area according to the focus position includes: acquiring a specific event within a preset range of the focus position, or a specific position of a second virtual character; and adjusting the updating of the scene display area of the game scene according to the specific position and the focus position.
Optionally, adjusting the updating of the scene display area of the game scene according to the specific position and the focus position includes: adjusting the scene display area of the game scene at a preset sensitivity according to the specific position and the focus position.
According to another embodiment of the present disclosure, a display control apparatus in a game is also provided. A graphical user interface is rendered on a touch display of a mobile terminal by executing a software application on a processor of the mobile terminal. A game scene of the game contains a first virtual character and a scene display area, the scene display area is at least part of the game scene, and the content displayed by the graphical user interface contains the scene display area. The apparatus includes: a first detection component, configured to detect a first touch operation acting on a preset area of the graphical user interface; a first update component, configured to update the scene display area according to the first touch operation; a second detection component, configured to detect a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character; a control component, configured to control a release direction of the skill according to the second touch operation; and a second update component, configured to determine a focus position in the release direction when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, and to update the scene display area of the game scene according to the focus position in the release direction.
Optionally, the second update component is configured to determine the focus position on the release trajectory in the release direction when the release trajectory of the skill passes through the scene display area.
Optionally, the apparatus further includes: a processing component, configured to prohibit a response to the first touch operation.
Optionally, the apparatus further includes: a third detection component, configured to detect the second touch operation while the first touch operation is updating the scene display area of the game scene, or to detect the first touch operation while the second touch operation is adjusting the release direction of the skill.
Optionally, a virtual camera corresponding to the first virtual character is provided in the game scene, and the scene display area of the game scene is the area captured by the virtual camera.
Optionally, the first update component includes: a first control element, configured to control the virtual camera to move according to the movement trajectory of the touch point of the first touch operation; and a first update element, configured to update the scene display area according to the movement of the virtual camera.
Optionally, the second update component includes: a detection element, configured to detect the operating state of the touch point of the second touch operation; a second control element, configured to control the virtual camera to move from the current position to the focus position at a preset speed when it is determined from the operating state that the touch point has remained still for longer than a preset threshold; and a second update element, configured to update the scene display area according to the movement of the virtual camera.
Optionally, the second update component includes: a first adjustment element, configured to adjust the release direction according to the second touch operation; a determination element, configured to determine the focus position according to the skill release trajectory when the skill release trajectory in the release direction passes through a specific area of the scene display area; and a third update element, configured to update the scene display area according to the focus position.
Optionally, the second update component includes: an acquisition element, configured to acquire a specific event within a preset range of the focus position, or a specific position of a second virtual character; and a fourth update element, configured to adjust the updating of the scene display area of the game scene according to the specific position and the focus position.
Optionally, the fourth update element is configured to adjust the scene display area of the game scene at a preset sensitivity according to the specific position and the focus position.
According to another embodiment of the present disclosure, a storage medium is also provided. The storage medium includes a stored program, and when the program runs, a device where the storage medium is located is controlled to perform any one of the above display control methods in a game.
According to another embodiment of the present disclosure, a processor is also provided. The processor is configured to run a program, and when the program runs, any one of the above display control methods in a game is performed.
According to another embodiment of the present disclosure, a terminal is also provided, including: one or more processors, a memory, a display device, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are used to perform any one of the above display control methods in a game.
In at least some embodiments of the present disclosure, a first touch operation acting on a preset area of the graphical user interface is detected and the scene display area is updated accordingly, and a second touch operation acting on a skill control of the graphical user interface is detected and the release direction of the skill is controlled accordingly. When the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, a focus position is determined in the release direction so that the scene display area of the game scene is updated according to the focus position in the release direction. This achieves the purpose of assisting the release of long-range skills and realizes an intelligent way of adjusting the scene display area, with the technical effect of making the adjustment of the scene display area more flexible and intelligent. It thereby solves the technical problems that the in-game virtual camera adjustment offered by the related art supports only a single mode of operation, lacks adaptability and extensibility, and cannot satisfy the experience needs of players at different skill levels.
Brief Description of the Drawings
The drawings described here are used to provide a further understanding of the present disclosure and form a part of this application. The schematic embodiments of the present disclosure and their description are used to explain the present disclosure and do not constitute an improper limitation of it. In the drawings:
FIG. 1 is a flowchart of a display control method in a game according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a display control method in a game according to an alternative embodiment of the present disclosure;
FIG. 3 is a structural block diagram of a display control apparatus in a game according to an embodiment of the present disclosure;
FIG. 4 is a structural block diagram of a display control apparatus in a game according to an alternative embodiment of the present disclosure.
Detailed Description
To enable those skilled in the art to better understand the solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
It should be noted that the terms "first", "second", etc. in the specification, claims, and drawings of the present disclosure are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present disclosure described here can be implemented in an order other than that illustrated or described here. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or elements is not necessarily limited to those clearly listed, but may include other steps or elements not clearly listed or inherent to the process, method, product, or device.
According to one embodiment of the present disclosure, an embodiment of a display control method in a game is provided. It should be noted that the steps shown in the flowcharts of the drawings may be executed in a computer system such as a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be executed in an order different from the one here.
The method embodiment may be executed in a mobile terminal, a computer terminal, or a similar computing device. Taking execution on a mobile terminal as an example, the mobile terminal may include one or more processors (which may include, but are not limited to, processing devices such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), or a programmable logic device (FPGA)) and a memory for storing data. Optionally, the mobile terminal may further include a transmission device for communication functions and input/output devices. Those of ordinary skill in the art will understand that the above structural description is only illustrative and does not limit the structure of the mobile terminal; for example, the mobile terminal may include more or fewer components than described above, or have a different configuration.
The memory may be used to store computer programs, for example, software programs and components of application software, such as the computer program corresponding to the display control method in a game in the embodiments of the present disclosure. By running the computer program stored in the memory, the processor executes various functional applications and data processing, that is, implements the above display control method in a game. The memory may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory may further include memory remotely located relative to the processor, which may be connected to the mobile terminal via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device is used to receive or send data via a network. Specific examples of the network may include a wireless network provided by the communication provider of the mobile terminal. In one example, the transmission device includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In another example, the transmission device may be a radio frequency (RF) component, which is used to communicate with the Internet wirelessly.
This embodiment provides a display control method in a game running on the above mobile terminal. A graphical user interface is rendered on a touch display of the mobile terminal by executing a software application on a processor of the mobile terminal. A game scene of the game contains a first virtual character and a scene display area, the scene display area is at least part of the game scene, and the content displayed by the graphical user interface contains the scene display area. FIG. 1 is a flowchart of a display control method in a game according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes the following steps:
Step S10, detecting a first touch operation acting on a preset area of the graphical user interface;
Step S11, updating the scene display area according to the first touch operation;
Step S12, detecting a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character;
Step S13, controlling a release direction of the skill according to the second touch operation;
Step S14, when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, determining a focus position in the release direction, and updating the scene display area of the game scene according to the focus position in the release direction.
Through the above steps, a first touch operation acting on a preset area of the graphical user interface can be detected and the scene display area updated accordingly, and a second touch operation acting on a skill control of the graphical user interface can be detected and the release direction of the skill controlled accordingly. When the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, a focus position is determined in the release direction so that the scene display area of the game scene is updated according to the focus position in the release direction. This achieves the purpose of assisting the release of long-range skills, realizes an intelligent way of adjusting the scene display area, makes the adjustment of the scene display area more flexible and intelligent, and thereby solves the technical problems that the in-game virtual camera adjustment offered by the related art supports only a single mode of operation, lacks adaptability and extensibility, and cannot satisfy the experience needs of players at different skill levels.
It should be noted that steps S10-S11 and steps S12-S13 have no strict order in time sequence. That is, the first touch operation may be performed before or after the second touch operation, or the two may be performed simultaneously. As long as the first touch operation and the second touch operation at least partially coincide in time sequence, the technical solutions provided by the embodiments of the present disclosure are applicable.
The preset area may be either a mini-map within the graphical user interface or a view-adjustment area within the graphical user interface. The following takes the mini-map as an example; the implementation applies equally to the view-adjustment area.
For releasing a directional ultra-long-range skill (for example, a skill whose release range covers the entire game scene), the player can press and hold the skill control (corresponding to the second touch operation) to adjust the release direction of the skill. In addition, the player can tap or slide on the mini-map (corresponding to the first touch operation) to update the scene display area. When the first touch operation and the second touch operation at least partially coincide in time sequence, a focus position is determined in the release direction (for example, the geometric center of the graphical user interface); as the player adjusts the release direction through the second touch operation, the focus position changes accordingly, and the scene display area of the game scene is updated through the change of the focus position.
In the related art, when an ultra-long-range skill is released, the skill release direction and trajectory can only be displayed within the scene display area corresponding to the first virtual character's current field of view; the part of the trajectory beyond that area is invisible, which leads to deviation from a specific event or an enemy virtual character and failure to hit the target. By contrast, in the technical solution provided by the embodiments of the present disclosure, a focus position can be determined in the release direction indicated by the skill indicator; as the skill indicator moves, the focus position follows, and the scene display area of the game scene is updated according to the focus position in the release direction. The movement of the focus position thus accurately locates the specific event or enemy virtual character, completing the precise release of the ultra-long-range skill.
Optionally, in step S14, determining the focus position in the release direction may include the following step:
Step S140, when the release trajectory of the skill passes through the scene display area, determining the focus position on the release trajectory in the release direction.
After detecting the player's tap on the mini-map and updating the scene display area, and detecting the player's press on the long-range skill control and controlling the release direction of the skill, if the release trajectory of the skill is found to pass through the scene display area, the focus position can be determined on the release trajectory in the release direction. The focus position may be any position on the portion of the release trajectory within the scene display area. In an alternative embodiment of the present disclosure, the focus position is the geometric center of the portion of the release trajectory within the scene display area.
Optionally, in step S14, when the release trajectory of the skill passes through the scene display area, the method may further include the following step:
Step S15, prohibiting a response to the first touch operation.
When the release trajectory of the skill passes through the scene display area, the scene display area of the game scene needs to be updated according to the focus position in the release direction; therefore, a touch operation performed by the player on the mini-map is not responded to and cannot update the scene display area; that is, touch operations within the mini-map are disabled. Only when the player cancels the skill release operation does the mini-map touch operation take effect again, i.e., the scene display area is again updated through the first touch operation. After the scene display area has been updated through the first touch operation, if the player cancels the touch operation on the mini-map, the scene display area can return to the display state before the player performed the first and second touch operations.
Optionally, in step S14, detecting that the first touch operation and the second touch operation at least partially coincide in time sequence may be implemented in one of the following ways:
Mode 1: the second touch operation is detected while the first touch operation is updating the scene display area of the game scene.
That is, to release a directional long-range attack skill, the player usually performs a tap on the mini-map to determine the release direction. To this end, the player first looks for the attack target on the mini-map (for example, a low-health enemy virtual character) and then presses the skill control, so that the first touch operation and the second touch operation at least partially coincide in time sequence.
Mode 2: the first touch operation is detected while the second touch operation is adjusting the release direction of the skill.
That is, the player may also, relying on experience or game proficiency, first determine a rough release direction by pressing the skill control, and then perform a tap on the mini-map to confirm whether the release direction can aim at the enemy virtual character, so that the first touch operation and the second touch operation at least partially coincide in time sequence.
Optionally, a virtual camera corresponding to the first virtual character is provided in the game scene, and the scene display area of the game scene is the area captured by the virtual camera.
In an alternative embodiment, the virtual camera can be fixed on the first virtual character controlled by the player, moving as the virtual character moves and rotating as the virtual character rotates, similar to the character's first-person perspective. The scene display area of the game scene is then the area captured by the virtual camera. Alternatively, the virtual camera can be set at a position relative to the first virtual character controlled by the player, for example at a preset position above the character's head, and follow the character's movement, similar to a spectator perspective on the character. Again, the scene display area of the game scene is the area captured by the virtual camera. The following alternative embodiments mainly take the virtual camera fixed at a position relative to the player-controlled virtual character as an example; the implementation applies equally to fixing the camera on the virtual character itself.
Optionally, in step S11, updating the scene display area according to the first touch operation may include the following steps:
Step S111, controlling the virtual camera to move according to the movement trajectory of the touch point of the first touch operation;
Step S112, updating the scene display area according to the movement of the virtual camera.
As the player performs a drag or slide operation (corresponding to the first touch operation) on the mini-map, a corresponding movement trajectory is produced. The movement trajectory is the set of touch points produced by the first touch operation at successive moments and different positions. Given the preset mapping relationship among the displacement change of the touch point, the movement speed of the virtual camera, and the sensitivity of scene-display-area updating, the displacement change of the touch point can be determined from its movement trajectory; the virtual camera is then controlled to move at a preset speed according to this mapping relationship, and the sensitivity is determined according to the mapping relationship so that the update speed of the scene display area is adjusted at that sensitivity. The displacement change may be calculated between two adjacent image frames, or between the start and end touch points of the movement trajectory.
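The relationship just described, where touch-point displacement drives camera movement at a preset sensitivity, can be sketched per frame as follows. This is a minimal illustration; the function names and the linear scaling model are assumptions, since the patent only states that a preset mapping relationship exists.

```python
def camera_step(camera_pos, touch_points, sensitivity):
    """Advance the virtual camera by the latest touch-point displacement
    on the mini-map, scaled by a preset sensitivity. touch_points is the
    drag trajectory as a list of (x, y) samples, one per frame."""
    if len(touch_points) < 2:
        return camera_pos  # no displacement yet
    (x0, y0), (x1, y1) = touch_points[-2], touch_points[-1]
    return (camera_pos[0] + sensitivity * (x1 - x0),
            camera_pos[1] + sensitivity * (y1 - y0))
```

The scene display area is then re-rendered from the camera's new position each frame.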
Optionally, in step S14, updating the scene display area according to the focus position may include the following steps:
Step S141, detecting the operating state of the touch point of the second touch operation;
Step S142, when it is determined from the operating state that the touch point has remained still for longer than a preset threshold, controlling the virtual camera to move from the current position to the focus position at a preset speed;
Step S143, updating the scene display area according to the movement of the virtual camera.
While the player controls the release direction of the skill through the second touch operation, specific factors (for example, the player being disturbed by external factors and unable to operate normally, or in-game stuttering) may leave the touch point of the second touch operation on the skill control in a still state for longer than the preset threshold. In that case, the mobile terminal can control the virtual camera to move from the current position (for example, the geometric center of the scene display area) to the focus position at a preset speed, automatically updating the scene display area as the camera approaches the focus position.
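Steps S141-S143 can be sketched as a per-frame update: once the skill-control touch point has dwelled past the threshold, step the camera toward the focus position at the preset speed. This is a sketch under assumed units (seconds for the dwell time, distance units per second for the speed); all names are illustrative.

```python
def auto_pan(camera_pos, focus_pos, touch_still_time, threshold, speed, dt):
    """Move the camera toward the focus position at a preset speed when
    the touch point has been still longer than the preset threshold;
    otherwise leave the camera in place."""
    if touch_still_time <= threshold:
        return camera_pos  # touch point still active: no auto-pan
    dx = focus_pos[0] - camera_pos[0]
    dy = focus_pos[1] - camera_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    step = speed * dt
    if dist <= step or dist == 0:
        return focus_pos  # close enough: snap to the focus position
    return (camera_pos[0] + dx / dist * step,
            camera_pos[1] + dy / dist * step)
```

Calling this once per frame reproduces the gradual approach described in the text, with the scene display area updated from the camera's position after each call.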
Optionally, in step S14, updating the scene display area according to the focus position may include the following steps:
Step S144, adjusting the release direction according to the second touch operation;
Step S145, when the skill release trajectory in the release direction passes through a specific area of the scene display area, determining the focus position according to the skill release trajectory;
Step S146, updating the scene display area according to the focus position.
The specific area may be any area within the entire view-adjustment area of the graphical user interface. In an alternative embodiment, the specific area is the central zone of the current scene display area. The player can adjust the release direction of the skill by pressing or dragging the skill control. If the skill release trajectory is displayed in the current scene display area and passes the current position of the virtual camera (that is, the trajectory passes through the central zone of the current scene display area), the focus position can be determined according to the skill release trajectory, and the scene display area is updated accordingly.
Optionally, in step S14, updating the scene display area according to the focus position may include the following steps:
Step S147, acquiring a specific event within a preset range of the focus position, or a specific position of a second virtual character;
Step S148, adjusting the updating of the scene display area of the game scene according to the specific position and the focus position.
The second virtual character may be an enemy virtual character. The specific event may be an event that has an important influence on the progress of the game, for example: team fights, pursuing a low-health enemy, rescues, or hunting important jungle monsters. After an event is determined to be a specific event, a specific area may be generated around the location where it occurs, which may be the smallest circle covering all virtual characters participating in the event.
In an alternative embodiment, the scene display area of the game scene can be adjusted at a preset sensitivity according to the specific position and the focus position. When no specific event or second virtual character exists within the preset range of the focus position, the update speed of the scene display area can be adjusted at a preset first sensitivity, according to the mapping relationship between the displacement change of the focus position and the first sensitivity. When a specific event or second virtual character exists within the preset range of the focus position, the focus position can be adjusted so that the release direction of the skill points to the specific position, producing a displacement change of the focus position; the update speed of the scene display area is then adjusted at a second sensitivity according to the mapping relationship between the displacement change of the focus position and the second sensitivity. That is, the update of the scene display area along the release direction in the game scene is accelerated. The displacement change may be calculated between two adjacent image frames, or between the start and end positions of the focus position during its movement.
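The sensitivity switch just described can be sketched as: use the first sensitivity when nothing notable lies near the focus position, and the second (faster) sensitivity when a specific event or enemy character falls within the preset range. The Euclidean distance test and all names here are illustrative assumptions.

```python
def pick_sensitivity(focus, targets, preset_range, first_s, second_s):
    """Return the second sensitivity if any target (a specific event's
    position or a second virtual character) lies within preset_range of
    the focus position, otherwise the first sensitivity."""
    for tx, ty in targets:
        dist = ((tx - focus[0]) ** 2 + (ty - focus[1]) ** 2) ** 0.5
        if dist <= preset_range:
            return second_s
    return first_s
```

The chosen sensitivity then scales the scene-display-area update speed, as in the mapping relationship described above.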
FIG. 2 is a flowchart of a display control method in a game according to an alternative embodiment of the present disclosure. As shown in FIG. 2, when a second virtual character exists within the preset range of the focus position, the focus position can be adjusted so that the release direction of the skill points to the specific position where the second virtual character is located, producing a displacement change of the focus position. Then, according to the mapping relationship between the displacement change of the focus position and the above second sensitivity, the update of the scene display area along the release direction in the game scene is accelerated.
In view of the above embodiments, the following technical effects can be achieved:
(1) The current mainstream camera schemes are comprehensively optimized, remedying defects of mainstream MOBA mobile-game camera schemes such as a rigid, fixed camera mode and reliance on frequent player operations, making camera operation more flexible and intelligent. The solution also has good extensibility, leaving considerable design space for new gameplay, new characters, and other changes, so that the game's camera scheme can be more diverse and customized.
(2) The intelligent camera adjustment provides different forms of camera assistance according to differences in players' operating behavior and in-game situations, meeting special field-of-view needs. It automates fine camera adjustments, reduces the player's operational burden, and lets players obtain the game information they most need in the easiest way, improving the efficiency of game-information transmission and giving players a smoother game experience.
(3) Players with limited operating skill and novice players, who cannot use camera operations proficiently or receive battlefield information well, can adapt to the game faster and master the camera controls; for particular special camera operations, an intelligent solution is provided, reducing players' learning costs, lowering the game's overall operating threshold, and broadening the game's user base.
(4) Highly skilled players can complete more fine-grained game operations with the most convenient controls, which leaves more room for such players to improve their skills, enhances their game experience, and helps retain this tier of players.
(5) As an overall optimization scheme, it can adapt to the needs of players at different levels and provides favorable conditions for the promotion and spread of the game as a whole.
From the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present disclosure, in essence or in the part that contributes over the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and includes several instructions that enable a terminal device (which may be a mobile phone, computer, server, or network device, etc.) to perform the methods described in the embodiments of the present disclosure.
This embodiment also provides a display control apparatus in a game. The apparatus is used to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "component" may be a combination of software and/or hardware that implements predetermined functions. Although the apparatuses described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and conceived.
FIG. 3 is a structural block diagram of a display control apparatus in a game according to an embodiment of the present disclosure. As shown in FIG. 3, a graphical user interface is rendered on a touch display of a mobile terminal by executing a software application on a processor of the mobile terminal; a game scene of the game contains a first virtual character and a scene display area, the scene display area is at least part of the game scene, and the content displayed by the graphical user interface contains the scene display area. The apparatus includes: a first detection component 10, configured to detect a first touch operation acting on a preset area of the graphical user interface; a first update component 20, configured to update the scene display area according to the first touch operation; a second detection component 30, configured to detect a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character; a control component 40, configured to control a release direction of the skill according to the second touch operation; and a second update component 50, configured to determine a focus position in the release direction when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, and to update the scene display area of the game scene according to the focus position in the release direction.
Optionally, the second update component 50 is configured to determine the focus position on the release trajectory in the release direction when the release trajectory of the skill passes through the scene display area.
Optionally, FIG. 4 is a structural block diagram of a display control apparatus in a game according to an alternative embodiment of the present disclosure. As shown in FIG. 4, in addition to all the components shown in FIG. 3, the apparatus further includes: a processing component 60, configured to prohibit a response to the first touch operation.
Optionally, as shown in FIG. 4, the apparatus further includes: a third detection component 70, configured to detect the second touch operation while the first touch operation is updating the scene display area of the game scene, or to detect the first touch operation while the second touch operation is adjusting the release direction of the skill.
Optionally, a virtual camera corresponding to the first virtual character is provided in the game scene, and the scene display area of the game scene is the area captured by the virtual camera.
Optionally, the first update component 20 includes: a first control element (not shown in the figure), configured to control the virtual camera to move according to the movement trajectory of the touch point of the first touch operation; and a first update element (not shown in the figure), configured to update the scene display area according to the movement of the virtual camera.
Optionally, the second update component 50 includes: a detection element (not shown in the figure), configured to detect the operating state of the touch point of the second touch operation; a second control element (not shown in the figure), configured to control the virtual camera to move from the current position to the focus position at a preset speed when it is determined from the operating state that the touch point has remained still for longer than a preset threshold; and a second update element (not shown in the figure), configured to update the scene display area according to the movement of the virtual camera.
Optionally, the second update component 50 includes: a first adjustment element (not shown in the figure), configured to adjust the release direction according to the second touch operation; a determination element (not shown in the figure), configured to determine the focus position according to the skill release trajectory when the skill release trajectory in the release direction passes through a specific area of the scene display area; and a third update element (not shown in the figure), configured to update the scene display area according to the focus position.
Optionally, the second update component 50 includes: an acquisition element (not shown in the figure), configured to acquire a specific event within a preset range of the focus position, or a specific position of a second virtual character; and a fourth update element (not shown in the figure), configured to adjust the updating of the scene display area of the game scene according to the specific position and the focus position.
Optionally, the fourth update element (not shown in the figure) is configured to adjust the scene display area of the game scene at a preset sensitivity according to the specific position and the focus position.
It should be noted that each of the above components can be implemented by software or hardware. In the latter case they can be implemented in, but are not limited to, the following ways: the above components are all located in the same processor; or the above components are located in different processors in any combination.
An embodiment of the present disclosure also provides a storage medium in which a computer program is stored, where the computer program is configured to execute, when run, the steps in any one of the above method embodiments.
Optionally, in this embodiment, the storage medium may be configured to store a computer program for performing the following steps:
S1, detecting a first touch operation acting on a preset area of the graphical user interface;
S2, updating the scene display area according to the first touch operation;
S3, detecting a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character;
S4, controlling a release direction of the skill according to the second touch operation;
S5, when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, determining a focus position in the release direction, and updating the scene display area of the game scene according to the focus position in the release direction.
Optionally, in this embodiment, the storage medium may include, but is not limited to, various media that can store a computer program, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
An embodiment of the present disclosure also provides a processor configured to run a computer program to perform the steps in any one of the above method embodiments.
Optionally, in this embodiment, the processor may be configured to perform the following steps through a computer program:
S1, detecting a first touch operation acting on a preset area of the graphical user interface;
S2, updating the scene display area according to the first touch operation;
S3, detecting a second touch operation acting on a skill control of the graphical user interface, where the skill control corresponds to a skill of the first virtual character;
S4, controlling a release direction of the skill according to the second touch operation;
S5, when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, determining a focus position in the release direction, and updating the scene display area of the game scene according to the focus position in the release direction.
The serial numbers of the above embodiments of the present disclosure are for description only and do not represent the superiority or inferiority of the embodiments.
In the above embodiments of the present disclosure, the description of each embodiment has its own emphasis; for parts not detailed in one embodiment, reference may be made to the relevant descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed technical content may be implemented in other ways. The apparatus embodiments described above are only schematic; for example, the division into elements may be a division by logical function, and in actual implementation there may be other divisions: multiple elements or components may be combined or integrated into another system, or some features may be ignored or not implemented. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, elements, or components, and may be electrical or in other forms.
The elements described as separate may or may not be physically separate, and parts displayed as elements may or may not be physical elements; that is, they may be located in one place or distributed over multiple elements. Some or all of the elements may be selected according to actual needs to achieve the objectives of the solutions of this embodiment.
In addition, the functional elements in the embodiments of the present disclosure may be integrated into one processing element, or each element may exist alone physically, or two or more elements may be integrated into one element. The integrated element may be implemented in the form of hardware or in the form of a software functional element.
If the integrated element is implemented in the form of a software functional element and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present disclosure, in essence, in the part that contributes over the existing technology, or in whole or in part, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions that enable a computer device (which may be a personal computer, server, or network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage media include various media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), removable hard disk, magnetic disk, or optical disc.
The above are only preferred implementations of the present disclosure. It should be noted that those of ordinary skill in the art can make several improvements and refinements without departing from the principles of the present disclosure, and these improvements and refinements should also be regarded as falling within the protection scope of the present disclosure.

Claims (14)

  1. A display control method in a game, wherein a graphical user interface is rendered on a touch display of a mobile terminal by executing a software application on a processor of the mobile terminal, a game scene of the game contains a first virtual character and a scene display area, the scene display area is at least part of the game scene, and content displayed by the graphical user interface contains the scene display area, the method comprising:
    detecting a first touch operation acting on a preset area of the graphical user interface;
    updating the scene display area according to the first touch operation;
    detecting a second touch operation acting on a skill control of the graphical user interface, wherein the skill control corresponds to a skill of the first virtual character;
    controlling a release direction of the skill according to the second touch operation; and
    when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, determining a focus position in the release direction, and updating the scene display area of the game scene according to the focus position in the release direction.
  2. The method according to claim 1, wherein determining the focus position in the release direction comprises:
    when a release trajectory of the skill passes through the scene display area, determining the focus position on the release trajectory in the release direction.
  3. The method according to claim 2, wherein when the release trajectory of the skill passes through the scene display area, the method further comprises: prohibiting a response to the first touch operation.
  4. The method according to claim 1, wherein detecting that the first touch operation and the second touch operation at least partially coincide in time sequence comprises:
    detecting the second touch operation while the first touch operation is updating the scene display area of the game scene; or
    detecting the first touch operation while the second touch operation is adjusting the release direction of the skill.
  5. The method according to claim 1, wherein a virtual camera corresponding to the first virtual character is provided in the game scene, and the scene display area of the game scene is an area captured by the virtual camera.
  6. The method according to claim 5, wherein updating the scene display area according to the first touch operation comprises:
    controlling the virtual camera to move according to a movement trajectory of a touch point of the first touch operation; and
    updating the scene display area according to the movement of the virtual camera.
  7. The method according to claim 6, wherein updating the scene display area according to the focus position comprises:
    detecting an operating state of a touch point of the second touch operation;
    when it is determined from the operating state that the touch point has remained still for longer than a preset threshold, controlling the virtual camera to move from a current position to the focus position at a preset speed; and
    updating the scene display area according to the movement of the virtual camera.
  8. The method according to claim 6, wherein updating the scene display area according to the focus position comprises:
    adjusting the release direction according to the second touch operation;
    when a skill release trajectory in the release direction passes through a specific area of the scene display area, determining the focus position according to the skill release trajectory; and
    updating the scene display area according to the focus position.
  9. The method according to claim 1, wherein updating the scene display area according to the focus position comprises:
    acquiring a specific event within a preset range of the focus position, or a specific position of a second virtual character; and
    adjusting the updating of the scene display area of the game scene according to the specific position and the focus position.
  10. The method according to claim 9, wherein adjusting the updating of the scene display area of the game scene according to the specific position and the focus position comprises:
    adjusting the scene display area of the game scene at a preset sensitivity according to the specific position and the focus position.
  11. A display control apparatus in a game, wherein a graphical user interface is rendered on a touch display of a mobile terminal by executing a software application on a processor of the mobile terminal, a game scene of the game contains a first virtual character and a scene display area, the scene display area is at least part of the game scene, and content displayed by the graphical user interface contains the scene display area, the apparatus comprising:
    a first detection component, configured to detect a first touch operation acting on a preset area of the graphical user interface;
    a first update component, configured to update the scene display area according to the first touch operation;
    a second detection component, configured to detect a second touch operation acting on a skill control of the graphical user interface, wherein the skill control corresponds to a skill of the first virtual character;
    a control component, configured to control a release direction of the skill according to the second touch operation; and
    a second update component, configured to determine a focus position in the release direction when the first touch operation and the second touch operation are detected to at least partially coincide in time sequence, and to update the scene display area of the game scene according to the focus position in the release direction.
  12. A storage medium, comprising a stored program, wherein when the program runs, a device where the storage medium is located is controlled to perform the display control method in a game according to any one of claims 1 to 10.
  13. A processor, configured to run a program, wherein when the program runs, the display control method in a game according to any one of claims 1 to 10 is performed.
  14. A terminal, comprising: one or more processors, a memory, a display device, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are used to perform the display control method in a game according to any one of claims 1 to 10.
PCT/CN2019/086464 2019-01-10 2019-05-10 Display control method and apparatus in game, storage medium, processor and terminal WO2020143148A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/632,547 US11446565B2 (en) 2019-01-10 2019-05-10 In-game display control method and apparatus, storage medium processor, and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910024151.3 2019-01-10
CN201910024151.3A CN109568957B (zh) 2019-01-10 2019-01-10 Display control method and apparatus in game, storage medium, processor and terminal

Publications (1)

Publication Number Publication Date
WO2020143148A1 true WO2020143148A1 (zh) 2020-07-16

Family

ID=65916165

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/086464 WO2020143148A1 (zh) 2019-01-10 2019-05-10 游戏中的显示控制方法、装置、存储介质、处理器及终端

Country Status (3)

Country Link
US (1) US11446565B2 (zh)
CN (1) CN109568957B (zh)
WO (1) WO2020143148A1 (zh)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109568957B (zh) 2019-01-10 2020-02-07 网易(杭州)网络有限公司 Display control method and apparatus in game, storage medium, processor and terminal
CN110354506B (zh) * 2019-08-20 2023-11-21 网易(杭州)网络有限公司 Game operation method and apparatus
CN111124226B (zh) * 2019-12-17 2021-07-30 网易(杭州)网络有限公司 Display control method and apparatus for game screen, electronic device, and storage medium
CN111481934B (zh) * 2020-04-09 2023-02-10 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for displaying a virtual environment picture
CN111589142B (zh) * 2020-05-15 2023-03-21 腾讯科技(深圳)有限公司 Virtual object control method, apparatus, device, and medium
US11458394B2 (en) * 2020-09-11 2022-10-04 Riot Games, Inc. Targeting of a long-range object in a multiplayer game
CN112206516B (zh) * 2020-10-21 2024-06-11 网易(杭州)网络有限公司 Information display method and apparatus in game, storage medium, and computer device
CN112402967B (zh) * 2020-12-04 2024-04-12 网易(杭州)网络有限公司 Game control method and apparatus, terminal device, and medium
CN112957738B (zh) * 2021-03-19 2024-06-14 北京完美赤金科技有限公司 Processing method and apparatus for game skill release, and electronic device
CN113350779A (zh) * 2021-06-16 2021-09-07 网易(杭州)网络有限公司 Method and apparatus for controlling actions of a game virtual character, storage medium, and electronic device
CN113440846B (zh) * 2021-07-15 2024-05-10 网易(杭州)网络有限公司 Display control method and apparatus for game, storage medium, and electronic device
CN113694514B (zh) * 2021-09-03 2024-06-25 网易(杭州)网络有限公司 Object control method and apparatus
CN113996060A (zh) * 2021-10-29 2022-02-01 腾讯科技(成都)有限公司 Method and apparatus for adjusting a display picture, storage medium, and electronic device
CN114870393B (zh) * 2022-04-14 2024-07-05 北京字跳网络技术有限公司 Skill release method and apparatus, computer device, and storage medium
CN115738230A (zh) * 2022-10-08 2023-03-07 网易(杭州)网络有限公司 Operation control method and apparatus for game, and electronic device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105094346A (zh) * 2015-09-29 2015-11-25 腾讯科技(深圳)有限公司 Information processing method, terminal, and computer storage medium
CN107661630A (zh) * 2017-08-28 2018-02-06 网易(杭州)网络有限公司 Control method and apparatus for shooting game, storage medium, processor, and terminal
CN107678663A (zh) * 2017-08-25 2018-02-09 网易(杭州)网络有限公司 Method and apparatus for controlling game skill release
CN107823882A (zh) * 2017-11-17 2018-03-23 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
CN107913520A (zh) * 2017-12-14 2018-04-17 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
CN108196765A (zh) * 2017-12-13 2018-06-22 网易(杭州)网络有限公司 Display control method, electronic device, and storage medium
US20180311579A1 (en) * 2012-08-31 2018-11-01 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Video game processing apparatus and video game processing program product
CN109568957A (zh) * 2019-01-10 2019-04-05 网易(杭州)网络有限公司 Display control method and apparatus in game, storage medium, processor, and terminal

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140357356A1 (en) * 2013-05-28 2014-12-04 DeNA Co., Ltd. Character battle system controlled by user's flick motion
US9561432B2 (en) * 2014-03-12 2017-02-07 Wargaming.Net Limited Touch control with dynamic zones
KR101580210B1 (ko) * 2015-01-08 2016-01-04 라인플러스 주식회사 Game method and game system providing smart control suited to touch input
JP6450350B2 (ja) * 2015-07-20 2019-01-09 ネオウィズ コーポレーション Game control method, game control device, and recording medium therefor
CN105260100B (zh) * 2015-09-29 2017-05-17 腾讯科技(深圳)有限公司 Information processing method and terminal
CN105148517B (zh) * 2015-09-29 2017-08-15 腾讯科技(深圳)有限公司 Information processing method, terminal, and computer storage medium
CN105214309B (zh) * 2015-10-10 2017-07-11 腾讯科技(深圳)有限公司 Information processing method, terminal, and computer storage medium
CN108355348B (zh) * 2015-10-10 2021-01-26 腾讯科技(成都)有限公司 Information processing method, terminal, and computer storage medium
CN105335065A (zh) * 2015-10-10 2016-02-17 腾讯科技(深圳)有限公司 Information processing method, terminal, and computer storage medium
JP6143934B1 (ja) * 2016-11-10 2017-06-07 株式会社Cygames Information processing program, information processing method, and information processing device
CN107450812A (zh) * 2017-06-26 2017-12-08 网易(杭州)网络有限公司 Virtual object control method and apparatus, storage medium, and electronic device
CN107376339B (zh) * 2017-07-18 2018-12-28 网易(杭州)网络有限公司 Interaction method and apparatus for locking a target in a game
CN107617213B (zh) * 2017-07-27 2019-02-19 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
CN109843399B (zh) * 2017-08-03 2020-07-21 腾讯科技(深圳)有限公司 Device, method, and graphical user interface for providing game controls
CN107648847B (zh) * 2017-08-22 2020-09-22 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
CN107741819B (zh) * 2017-09-01 2018-11-23 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
CN107648848B (zh) * 2017-09-01 2018-11-16 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
JP6418299B1 (ja) * 2017-09-15 2018-11-07 株式会社セガゲームス Information processing device and program
CN116450020A (zh) * 2017-09-26 2023-07-18 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium
CN107754308A (zh) * 2017-09-28 2018-03-06 网易(杭州)网络有限公司 Information processing method and apparatus, electronic device, and storage medium
CN108310770A (zh) * 2018-01-05 2018-07-24 腾讯科技(深圳)有限公司 Control method and apparatus for a virtual control object, storage medium, and electronic device
CN108465240B (zh) * 2018-03-22 2020-08-11 腾讯科技(深圳)有限公司 Mark point position display method and apparatus, terminal, and computer-readable storage medium
CN108379844B (zh) * 2018-03-30 2020-10-23 腾讯科技(深圳)有限公司 Method, apparatus, electronic device, and storage medium for controlling virtual object movement
CN108509139B (zh) * 2018-03-30 2019-09-10 腾讯科技(深圳)有限公司 Movement control method and apparatus for a virtual object, electronic device, and storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180311579A1 (en) * 2012-08-31 2018-11-01 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Video game processing apparatus and video game processing program product
CN105094346A (zh) * 2015-09-29 2015-11-25 Tencent Technology (Shenzhen) Co., Ltd. Information processing method, terminal, and computer storage medium
CN107678663A (zh) * 2017-08-25 2018-02-09 NetEase (Hangzhou) Network Co., Ltd. Method and device for controlling release of game skills
CN107661630A (zh) * 2017-08-28 2018-02-06 NetEase (Hangzhou) Network Co., Ltd. Control method and apparatus for a shooting game, storage medium, processor, and terminal
CN107823882A (zh) * 2017-11-17 2018-03-23 NetEase (Hangzhou) Network Co., Ltd. Information processing method and apparatus, electronic device, and storage medium
CN108196765A (zh) * 2017-12-13 2018-06-22 NetEase (Hangzhou) Network Co., Ltd. Display control method, electronic device, and storage medium
CN107913520A (zh) * 2017-12-14 2018-04-17 NetEase (Hangzhou) Network Co., Ltd. Information processing method and apparatus, electronic device, and storage medium
CN109568957A (zh) * 2019-01-10 2019-04-05 NetEase (Hangzhou) Network Co., Ltd. In-game display control method and apparatus, storage medium, processor, and terminal

Also Published As

Publication number Publication date
US20210322864A1 (en) 2021-10-21
US11446565B2 (en) 2022-09-20
CN109568957B (zh) 2020-02-07
CN109568957A (zh) 2019-04-05

Similar Documents

Publication Publication Date Title
WO2020143148A1 (zh) In-game display control method and apparatus, storage medium, processor, and terminal
WO2020143144A1 (zh) In-game display control method and apparatus, storage medium, processor, and terminal
WO2020143145A1 (zh) In-game display control method and apparatus, storage medium, processor, and terminal
WO2020143146A1 (zh) In-game display control method and apparatus, storage medium, processor, and terminal
WO2020143147A1 (zh) In-game display control method and apparatus, storage medium, processor, and terminal
JP6722252B2 (ja) Information processing method and apparatus, storage medium, and electronic device
US11298609B2 (en) Virtual object movement control method and apparatus, electronic apparatus, and storage medium
JP7379532B2 (ja) Virtual object control method, apparatus, device, and computer program
CN111124226B (zh) Display control method and apparatus for game screen, electronic device, and storage medium
US11623142B2 (en) Data processing method and mobile terminal
CN108310764B (zh) Auxiliary positioning method, apparatus, and device
JP7390400B2 (ja) Virtual object control method, apparatus, terminal, and computer program
WO2021244209A1 (zh) Virtual object control method and apparatus, terminal, and storage medium
WO2022213521A1 (zh) Method and apparatus for controlling movement of virtual object in game, electronic device, and storage medium
CN110075522A (zh) Control method, apparatus, and terminal for virtual weapon in shooting game
CN113546417A (zh) Information processing method and apparatus, electronic device, and storage medium
CN113476825B (zh) Character control method and apparatus in game, device, and medium
WO2021244237A1 (zh) Virtual object control method and apparatus, computer device, and storage medium
CN115957507A (zh) Virtual character interaction method in game, apparatus, device, and medium
CN113663326B (zh) Aiming method and apparatus for game skill
KR20230158075A (ko) Computer program, game system using same, and control method
JP2017158680A (ja) Game control method and game program
CN117122912A (zh) Control method and apparatus for character in game, device, and medium
CN117122893A (zh) Control method and apparatus for character in game, computer device, and medium
CN117046098A (zh) Control method and apparatus for virtual character in game, device, and medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 19908955
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 19908955
Country of ref document: EP
Kind code of ref document: A1