CN113457144B - Virtual unit selection method and device in game, storage medium and electronic equipment

Virtual unit selection method and device in game, storage medium and electronic equipment

Info

Publication number
CN113457144B
CN113457144B
Authority
CN
China
Prior art keywords
virtual
selection frame
user interface
graphical user
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110856819.8A
Other languages
Chinese (zh)
Other versions
CN113457144A (en)
Inventor
桑田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110856819.8A
Publication of CN113457144A
Application granted
Publication of CN113457144B
Legal status: Active (current)
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/218Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to the technical field of human-computer interaction, and provides a virtual unit selection method and apparatus in a game, a computer-readable storage medium, and an electronic device. The method includes: in response to a control operation performed by a touch medium on a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame; selecting a target virtual unit from a plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected; and controlling the target virtual unit to execute a game action. With this scheme, the initial selection frame can be resized directly through the virtual control to select virtual units in the game, which improves the efficiency of virtual unit selection.

Description

Virtual unit selection method and device in game, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of human-computer interaction, and in particular to a virtual unit selection method in a game, a virtual unit selection apparatus, a computer-readable storage medium, and an electronic device.
Background
Selecting required virtual units from a plurality of virtual units in a game, so as to control the selected virtual units to perform further game operations, is one of the routine operations in RTS (Real-Time Strategy) games.
In the related art, a selection frame is formed by a two-finger touch, and the game units within the frame are selected. However, this approach requires the player to tap and drag with two fingers, so the operation steps are complicated and inefficient; moreover, when the player operates with two fingers on the screen, the player's line of sight is blocked, which degrades both the game experience and the accuracy of the player's game operations.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a virtual unit selection method and apparatus in a game, a computer-readable storage medium, and an electronic device, which overcome, at least to a certain extent, the problems that virtual units in a game are selected inefficiently and that the player's view is blocked during the selection process, which degrades the player's experience.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a virtual unit selection method in a game, wherein a game screen of the game is displayed through a graphical user interface of a display component, the game screen including a part or all of a game scene and a plurality of virtual units to be selected located in the game scene, the method comprising:
responding to control operation of the touch medium for a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame;
selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
and controlling the target virtual unit to execute game actions.
In an exemplary embodiment of the disclosure, based on the foregoing solution, before responding to the control operation of the touch medium for the virtual control in the graphical user interface, the method further includes:
and responding to the triggering operation of the touch medium for a first preset area in the graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
In an exemplary embodiment of the disclosure, based on the foregoing, the virtual control includes a rocker and a chassis, the rocker being located in the chassis;
the step of adjusting, in response to the control operation of the touch medium on the virtual control in the graphical user interface, the display size of the initial selection frame displayed in the graphical user interface to generate a target selection frame includes the following steps:
and in response to the sliding operation of the touch medium on the rocker in the chassis, adjusting the display size of an initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation so as to generate a target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, a game screen obtained by photographing a part or all of a game scene and a plurality of virtual units located in the game scene by a virtual camera is displayed through a graphical user interface; selecting the target virtual unit from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected, including:
responding to the sliding operation of a second preset area in a graphical user interface, and adjusting the pose of a virtual camera in the game so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
And selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the selecting, from the plurality of virtual units to be selected, the target virtual unit according to a positional relationship between the target selection frame and the plurality of virtual units to be selected includes:
responding to a sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
In an exemplary embodiment of the disclosure, based on the foregoing, the virtual control includes a rocker and a chassis, the rocker being located in the chassis;
the method for generating the target selection frame comprises the steps of responding to control operation of a touch medium for a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate the target selection frame, and comprising the following steps:
Detecting a pressing force value of the touch medium for the rocker in response to a pressing operation of the touch medium for the rocker;
and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the pressing force value based on a preset mapping relation so as to generate a target selection frame.
In an exemplary embodiment of the disclosure, based on the foregoing, the virtual control includes a rocker and a chassis, the rocker being located in the chassis;
the method for generating the target selection frame comprises the steps of responding to control operation of a touch medium for a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate the target selection frame, and comprising the following steps:
and when the touch medium is detected to be in a contact state with the rocker, responding to the sliding operation in a second preset area, and adjusting the display size of an initial selection frame displayed in the graphical user interface according to the sliding distance and/or the sliding direction of the sliding operation so as to generate a target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the selecting, according to a positional relationship between the target selection frame and a plurality of virtual units to be selected, a target virtual unit from the plurality of virtual units to be selected includes:
Responding to the sliding operation of a touch medium in the chassis for the rocker, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing, the selecting, from the plurality of virtual units to be selected, the target virtual unit according to the virtual unit to be selected located at least partially within the target selection frame at the display position in the graphical user interface includes:
and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the selecting, according to a positional relationship between the target selection frame and a plurality of virtual units to be selected, a target virtual unit from the plurality of virtual units to be selected includes:
When the touch medium and the virtual control are detected to change from a contact state to a non-contact state, selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected.
According to a second aspect of the present disclosure, there is provided a virtual unit selection device in a game, displaying a game screen of the game through a graphical user interface of a display assembly, the game screen including a part or all of a game scene and a plurality of virtual units to be selected located in the game scene, including:
the target selection frame generation module is configured to respond to control operation of the touch medium on the virtual control in the graphical user interface, and adjust the display size of the initial selection frame displayed in the graphical user interface to generate a target selection frame;
the selection module is configured to select a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
and the control module is configured to control the target virtual unit to execute game actions.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a virtual unit selection method in a game as described in the first aspect in the above-described embodiment.
According to a fourth aspect of embodiments of the present disclosure, there is provided an electronic device, comprising: a processor; and storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the virtual unit selection method in a game as described in the first aspect in the above embodiments.
As can be seen from the above technical solutions, the method for selecting a virtual unit in a game, the device for selecting a virtual unit in a game, and the computer-readable storage medium and the electronic device for implementing the method for selecting a virtual unit in a game according to the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in the technical solutions provided in some embodiments of the present disclosure, first, in response to a control operation of a touch medium for a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface is adjusted to generate a target selection frame, and then, a target virtual unit may be selected from a plurality of virtual units according to a positional relationship between the target selection frame and the plurality of virtual units to be selected, so as to control the target virtual unit to execute a game action. Compared with the related art, on one hand, the method and the device can directly operate the virtual control to adjust the display size of the initial selection frame so as to select the virtual units in the game, simplify the operation steps when the virtual units are selected, and improve the selection efficiency of the virtual units in the game; on the other hand, the virtual unit selection mode through the virtual control can reduce or avoid shielding of the visual field range of the player in the virtual unit selection process, and improve the accuracy of game operation of the player.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1A is a graphical user interface diagram illustrating prior art virtual unit selection in an embodiment of the present disclosure;
FIG. 1B illustrates another graphical user interface diagram of prior art virtual unit selection in an exemplary embodiment of the present disclosure;
FIG. 1C illustrates yet another graphical user interface diagram of prior art virtual unit selection in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a flow diagram of a virtual unit selection method in a game in an exemplary embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of a graphical user interface in an exemplary embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of another graphical user interface in an exemplary embodiment of the present disclosure;
FIG. 5 illustrates a flow diagram of a method of resizing an initial selection box in an exemplary embodiment of the present disclosure;
FIG. 6 is a flow chart illustrating a method for selecting a virtual unit according to a positional relationship between a target selection frame and a plurality of virtual units to be selected in an exemplary embodiment of the disclosure;
FIG. 7 is a flow chart illustrating another method of selecting virtual units according to a target selection box in an exemplary embodiment of the present disclosure;
FIG. 8 is a flow chart illustrating a method of selecting virtual units according to a target selection box in accordance with yet another exemplary embodiment of the present disclosure;
FIG. 9 illustrates yet another graphical user interface diagram in an exemplary embodiment of the present disclosure;
FIG. 10 illustrates yet another graphical user interface diagram in an exemplary embodiment of the present disclosure;
FIG. 11 illustrates a graphical user interface diagram of a virtual unit selected according to the selection box in FIG. 10 in an exemplary embodiment of the present disclosure;
FIG. 12 is a schematic diagram showing the configuration of a virtual unit selection apparatus in a game according to an exemplary embodiment of the present disclosure;
FIG. 13 illustrates a schematic diagram of a computer storage medium in an exemplary embodiment of the present disclosure;
fig. 14 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Selecting required virtual units from a plurality of virtual units in a game, so as to control the selected virtual units to perform further game operations, is one of the routine operations in RTS (Real-Time Strategy) games.
On the PC (Personal Computer) side, selecting virtual units is easy: a game player can select a single virtual unit by clicking with a mouse, or select virtual units in batches by dragging a selection frame with it. However, this approach is not suitable for touch terminals with smaller display screens, such as mobile phones, tablet computers and wearable electronic devices.
In the related art, virtual units may be selected through a selection frame formed by a two-finger touch. As shown in fig. 1A, a selection frame may be formed in the area between the two touch points according to the player's operation gesture, and the virtual units within it are selected.
However, this method requires the player to tap and operate quickly with the fingers, so the operation steps are complicated, the efficiency is low, and the false-touch rate is high. Further, whether the selected virtual units are clustered at the center of the screen or at its sides, as shown in fig. 1B and fig. 1C, the fingers block the central line of sight of the player during the operation, even while the player is engaged in a confrontation, which affects the player's experience and the accuracy of the game operations.
The virtual unit selection method in the game provided by the disclosure overcomes the defects in the related art at least to a certain extent.
Fig. 2 is a flow chart illustrating a virtual unit selection method in a game according to an exemplary embodiment of the present disclosure, where the virtual unit selection method in a game provided in the present embodiment shows a game screen of the game through a graphical user interface of a display component. Referring to fig. 2, the method includes:
Step S210, in response to control operation of the touch medium on a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame;
step S220, selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
step S230, controlling the target virtual unit to execute a game action.
In the technical solution provided in the embodiment shown in fig. 2, first, in response to a control operation of a touch medium for a virtual control in a graphical user interface, a display size of an initial selection frame displayed in the graphical user interface is adjusted to generate a target selection frame, and then, a target virtual unit may be selected from a plurality of virtual units according to a positional relationship between the target selection frame and the plurality of virtual units to be selected, so as to control the target virtual unit to execute a game action. Compared with the related art, on one hand, the method and the device can directly operate the virtual control to adjust the display size of the initial selection frame so as to select the virtual units in the game, simplify the operation steps when the virtual units are selected, and improve the selection efficiency of the virtual units in the game; on the other hand, the method for selecting the virtual units through the virtual control can reduce or avoid shielding of the visual field range of the player in the process of selecting the virtual units, and improve the accuracy of game operation of the player.
The following describes in detail the specific implementation of each step in the embodiment shown in fig. 2:
in step S210, in response to a control operation of the touch medium for a virtual control in the graphical user interface, a display size of an initial selection frame displayed in the graphical user interface is adjusted to generate a target selection frame.
In an exemplary embodiment, the touch medium may include any object capable of performing touch operations on the display screen of the terminal device, such as a user's finger or a stylus. The initial selection frame may be a point at a preset position in the graphical user interface, for example the center point of the graphical user interface, or a selection frame with a certain initial display size at a preset position, for example a 2px×2px selection frame or a selection frame whose display size is smaller than a virtual unit, where px denotes a pixel (hereinafter px is to be understood as a pixel unit unless otherwise stated). The preset position may be customized according to requirements.
Wherein the virtual control may include a rocker and a chassis, and the rocker is located in the chassis. Specifically, the touch medium can press the rocker to slide in the chassis at will, but the rocker cannot be slid out of the chassis, namely, the rocker cannot be separated from the chassis in the process that the touch medium presses the rocker to slide.
In an alternative embodiment, before responding to the control operation of the touch medium for the virtual control in the graphical user interface, the method further comprises: and responding to the triggering operation of the touch medium for a first preset area in the graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
For example, a certain display area in the graphical user interface may be configured in advance as a virtual control trigger area, for example a display area with a preset width on the left side of the graphical user interface; as shown in fig. 3, 31 may be the virtual control trigger area, that is, the first preset area. After the user performs a triggering operation in the virtual control trigger area, a virtual control is displayed at the trigger position, such as the virtual control 32 in fig. 3, where 321 is the rocker and 322 is the chassis, and an initial selection frame is displayed at a preset position of the graphical user interface, such as the initial selection frame 33 in fig. 3. In fig. 3, circles like 34 represent virtual units.
Specifically, when it is detected that the touch medium comes into contact with the first preset area of the graphical user interface, a virtual control is displayed at the contact position, such as 32 in fig. 3, and an initial selection frame is displayed at the center of the graphical user interface; the initial selection frame may have an initial display size of 0×0, that is, a point, such as 33 in fig. 3.
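The following minimal Python sketch illustrates the trigger step just described; it is not the patent's implementation, and all names (Rect, VirtualControl, the 200px trigger strip and the control sizes) are illustrative assumptions.

```python
# Hypothetical sketch: a touch in the first preset area spawns the virtual
# control at the touch point and a zero-size initial selection frame at the
# center of the graphical user interface.
from dataclasses import dataclass

@dataclass
class Rect:
    cx: float   # center x in screen pixels
    cy: float   # center y in screen pixels
    w: float = 0.0
    h: float = 0.0

@dataclass
class VirtualControl:
    chassis: Rect   # outer pad the rocker moves inside
    rocker: Rect    # inner handle, initially centered on the chassis

def on_touch_down(touch_x, touch_y, screen_w, screen_h, trigger_area_w=200):
    """Return (virtual_control, initial_selection_frame) if the touch lands in
    the first preset area (assumed here to be a strip on the left edge),
    otherwise None."""
    if touch_x > trigger_area_w:
        return None  # not in the first preset area
    chassis = Rect(touch_x, touch_y, 160, 160)
    rocker = Rect(touch_x, touch_y, 60, 60)
    # Initial selection frame: a 0x0 "point" at the screen center (33 in fig. 3).
    initial_frame = Rect(screen_w / 2, screen_h / 2, 0, 0)
    return VirtualControl(chassis, rocker), initial_frame
```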
In another alternative embodiment, the virtual control and the initial selection frame may also be displayed directly in the graphical user interface, that is, after the user opens the game, a virtual control and the initial selection frame are directly displayed in the game interface, without triggering the virtual control and the initial selection frame through the first preset area by means of the touch medium.
In order to prevent misoperation caused by accidental touches, an initial selection frame displayed directly in the graphical user interface may have an initial display state. When the initial selection frame is in the initial display state (which may be understood as an inactive state), virtual units cannot be selected based on it; that is, at this time a sliding operation of the user in the graphical user interface can only be used to move the virtual camera so as to update the game scene captured within the virtual camera's field of view, and cannot be used to move or resize the initial selection frame.
When a virtual control and an initial selection frame are displayed in the game interface from the start, the initial selection frame may be switched from the initial display state to a target display state in response to a triggering operation of the touch medium on the displayed virtual control, for example when the touch medium is detected to be in contact with the virtual control. The target display state may be understood as an activated state; when the initial selection frame is in the activated state, virtual units can be selected based on it using the exemplary methods of the present disclosure.
In yet another alternative embodiment, the virtual control may be displayed directly in the graphical user interface, an initial selection box is displayed and activated in response to a triggering operation of the touch medium on the virtual control, and then virtual units are selected using the exemplary method of the present disclosure based on the initial selection box.
In still another optional implementation, only the virtual control is directly displayed in the graphical user interface; an initial selection frame in the initial display state is displayed at a preset position of the graphical user interface in response to a first triggering operation of the touch medium on the virtual control, and the initial selection frame is switched from the initial display state to the target display state in response to a second triggering operation of the touch medium on the virtual control. As described above, when the initial selection frame is in the unactivated initial display state, virtual units cannot be selected based on it. When the initial selection frame is in the activated target display state, virtual units may be selected based on it using the exemplary methods of the present disclosure.
For example, the control operation in step S210 may include an operation in which the touch medium presses the rocker to slide the rocker in the chassis. Based on this, in a specific embodiment of step S210, in response to a sliding operation of the touch medium on the rocker in the chassis, a display size of an initial selection frame displayed in the graphical user interface is adjusted according to a sliding distance of the sliding operation, so as to generate a target selection frame.
In an exemplary embodiment, the initial selection box may include a point at a preset location in the graphical user interface, and the generated target selection box may include any quadrilateral (e.g., rectangular), circular, or any other closed polygon after the initial selection box is resized. The initial selection box may also include a closed graphic of any shape having a certain display size in the graphical user interface, such as a rectangle, a circle, etc., where the shape of the target selection box and the initial selection box are identical.
It can be seen that when the initial selection box is a point or rectangle, the generated target selection box may be a rectangle, and when the initial selection box is a point or circle, the generated target selection box may be a circle.
In the following, taking the case that the target selection frame is rectangular or circular as an example, how to adjust the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance of the touch medium in the chassis for the sliding operation of the rocker is described to generate the target selection frame.
When the target selection frame is rectangular, the increments of the rectangle's display size in the horizontal direction and in the vertical direction are determined according to the sliding distances of the sliding operation mapped onto the horizontal direction and onto the vertical direction, respectively, and the display size of the initial selection frame is then adjusted accordingly to generate the target selection frame.
Specifically, a proportional relationship between the sliding distance mapped onto the horizontal direction and the increment of the rectangle's length, and a proportional relationship between the sliding distance mapped onto the vertical direction and the increment of the rectangle's width, may be preset; for example, both proportions may be preset to 1:2. Taking the case where the initial selection frame is a point as an example, when the user slides the rocker in the chassis and the mapped distance in the horizontal direction is 5px and the mapped distance in the vertical direction is 4px, a target selection frame with a length of 10px and a width of 8px, whose diagonal center is the point corresponding to the initial selection frame, is generated in the graphical user interface. As shown in fig. 4, the initial selection frame 33 in fig. 3 may be adjusted according to the user's sliding operation, thereby generating the target selection frame 41 in fig. 4.
It should be noted that, in the process of generating the target selection frame, the user may slide the rocker back and forth to determine the optimal display size of the target selection frame. During this process, the sliding direction corresponding to the first sliding operation may be taken as the positive direction; in subsequent adjustments, when the direction onto which the user's sliding operation maps in the horizontal or vertical direction is consistent with the first sliding operation, the increment of the corresponding display boundary is positive, and otherwise it is negative. For example, if during an adjustment the user slides the rocker upward to the right by 3px and then downward by 1px in the chassis, the rightward horizontal direction is the positive horizontal direction and the upward vertical direction is the positive vertical direction; the display sizes of the initial selection frame in the horizontal and vertical directions are therefore first increased correspondingly, after which the display size in the horizontal direction remains unchanged and the display size in the vertical direction is correspondingly reduced, so that the target selection frame is finally generated.
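As a hedged illustration of the ratio-based resizing described above (assuming a 1:2 distance-to-increment ratio and that the sign of each later slide is fixed by the first slide's direction), a sketch might look like the following; the function name and data layout are assumptions, not the patent's code.

```python
# Each mapped slide distance is scaled by a preset ratio (1:2 here); the
# direction of the first slide on each axis defines the positive sign.
def resize_by_slides(slides, ratio=2.0):
    """slides: list of (dx, dy) rocker displacements mapped to horizontal /
    vertical screen distances, in px. Returns (length, width) of the frame."""
    length = width = 0.0
    sign_x = sign_y = None
    for dx, dy in slides:
        if sign_x is None and dx != 0:
            sign_x = 1 if dx > 0 else -1   # first horizontal slide = positive
        if sign_y is None and dy != 0:
            sign_y = 1 if dy > 0 else -1   # first vertical slide = positive
        if dx != 0:
            length += ratio * abs(dx) * (1 if (dx > 0) == (sign_x > 0) else -1)
        if dy != 0:
            width += ratio * abs(dy) * (1 if (dy > 0) == (sign_y > 0) else -1)
    return max(length, 0.0), max(width, 0.0)

# The worked example above: a slide mapping to 5px horizontally and 4px
# vertically yields a 10px x 8px target selection frame.
assert resize_by_slides([(5, 4)]) == (10.0, 8.0)
```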
When the target selection frame is circular, the increment of the circle's radius may be determined according to the sliding distance of the sliding operation mapped onto the horizontal direction or onto the vertical direction, or directly according to the sliding distance of the sliding operation, so that the display size of the initial selection frame is adjusted to generate the target selection frame. Similarly, a proportional relationship between the sliding distance of the sliding operation and the increment of the radius of the circular selection frame may be preset. Taking the case where the radius increment is determined from the sliding distance mapped onto the horizontal direction as an example, the horizontal direction onto which the first sliding operation maps may be defined as the positive direction; for example, if the first slide is upward and to the right, the rightward horizontal direction is the positive direction. When a subsequent slide maps onto the positive horizontal direction, the radius increment is positive; otherwise it is negative.
Illustratively, when the rocker is not slid, the center of the rocker coincides with the center of the chassis. Taking the case where the rocker and the chassis are both circular, the initial selection frame is a point in the graphical user interface, and the generated target selection frame is rectangular, the sliding distance of the rocker in the horizontal direction is proportional to the length of the generated target selection frame, and the sliding distance of the rocker in the vertical direction is proportional to the width of the target selection frame. In other words, the area of the target selection frame grows with the distance between the rocker center and the chassis center: the farther the rocker center is from the chassis center, the larger the generated target selection frame. Specifically, taking the case where the horizontal direction represents the length of the selection frame and the vertical direction represents its width, the percentage of the chassis's transverse axis (i.e. the horizontal direction) by which the rocker has slid out is the same as the percentage of the graphical user interface's length occupied by the length of the target selection frame, and the percentage of the chassis's vertical axis (i.e. the vertical direction) by which the rocker has slid out is the same as the percentage of the graphical user interface's width occupied by the width of the target selection frame.
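The percentage mapping can likewise be sketched as follows, under the stated assumptions (circular rocker and chassis, initial frame a point); the helper name and parameters are illustrative only.

```python
# The fraction of the chassis radius the rocker has moved along each axis
# equals the fraction of the GUI length / width used for the target frame.
def frame_from_rocker_offset(rocker_dx, rocker_dy, chassis_radius, gui_w, gui_h):
    """rocker_dx / rocker_dy: rocker center offset from the chassis center, px.
    Returns (frame_length, frame_width) in px."""
    frac_x = min(abs(rocker_dx) / chassis_radius, 1.0)
    frac_y = min(abs(rocker_dy) / chassis_radius, 1.0)
    return frac_x * gui_w, frac_y * gui_h

# Pushing the rocker halfway out horizontally and a quarter out vertically
# on a 1920x1080 interface gives a 960x270 target selection frame.
print(frame_from_rocker_offset(40, 20, 80, 1920, 1080))  # (960.0, 270.0)
```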
The above manner is to slide in the chassis by controlling the rocker to adjust the display size of the initial selection frame, thereby generating the target selection frame. In another exemplary embodiment, the control operation in step S210 may include a pressing operation of different force on a rocker in the chassis, and specifically, the display size of the initial selection frame may be adjusted by detecting a pressing force value of the touch medium on the rocker based on a 3D touch (three-dimensional touch) technology.
For example, referring to fig. 5, a method of adjusting a display size of an initial selection frame based on a 3D touch technique may include steps S510 to S520.
In step S510, in response to a pressing operation of the touch medium against the rocker, a pressing force value of the touch medium against the rocker is detected.
For example, when the duration of the contact between the touch medium and the rocker exceeds a preset threshold, for example 0.2 seconds, the current pressing force value of the touch medium may be detected based on the 3D Touch technology. Specifically, compared with multi-touch operations in a planar two-dimensional space, 3D Touch adds the perception of finger force and contact area, so the pressing force value of the touch medium on the rocker can be detected directly through the 3D Touch technology.
Next, in step S520, the display size of the initial selection frame displayed in the graphical user interface is adjusted according to the pressing force value based on the preset mapping relationship, so as to generate a target selection frame.
In an exemplary embodiment, the preset mapping relationship may include a correspondence relationship between the pressing force value and an increment of the display size of the initial selection frame, and the correspondence relationship may be a proportional relationship, that is, the larger the pressing force value is, the larger the increment of the display size of the initial selection frame is.
For example, a correspondence relationship between the pressing force value and the increment of the display size of the initial selection frame may be preset to be y=3x, where x represents the pressing force value and y represents the increment of the display size of the initial selection frame, so that the increment of the display size of the initial selection frame may be determined based on the detected pressing force value, and the display size of the initial selection frame may be adjusted. When the generated target selection frame is a rectangle, the preset mapping relationship may include a mapping relationship between the pressing force value and an increment of the length of the rectangle and a mapping relationship between the pressing force value and an increment of the width of the rectangle, which may be equal or unequal, and the exemplary embodiment is not limited in particular. When the generated target selection frame is a circle, the preset mapping relationship may include a correspondence relationship between the pressing force value and an increment of the radius of the circle.
In an exemplary embodiment, the preset mapping relationship may also include a correspondence between different levels of the pressing force value and the increment of the display size of the initial selection frame. For example, if the pressing force value is greater than 0 and less than 3, it is at the first level and the corresponding increment of the display size of the initial selection frame is 20px; if the pressing force value is not less than 3 and not greater than 5, it is at the second level and the corresponding increment is 40px; and so on. After the current pressing force value is detected, the pressing force level corresponding to it is determined, and the display size of the initial selection frame is then adjusted according to the increment of the display size corresponding to that level, so as to generate the target selection frame.
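A small sketch of the two example force-to-increment mappings (the linear relation y = 3x and the level table above) might read as follows; the third level's 60px value and the helper names are assumptions for illustration.

```python
# Two ways to map a pressing force value to a display-size increment.
def increment_linear(force):
    return 3.0 * force            # y = 3x, in px

def increment_by_level(force):
    if 0 < force < 3:
        return 20                 # first level (from the example above)
    if 3 <= force <= 5:
        return 40                 # second level
    return 60                     # further levels assumed to continue the pattern

def grow_frame(w, h, inc):
    """Grow a rectangular frame's length and width by the same increment."""
    return w + inc, h + inc

print(grow_frame(0, 0, increment_linear(4.0)))     # (12.0, 12.0)
print(grow_frame(0, 0, increment_by_level(4.0)))   # (40, 40)
```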
In still another exemplary embodiment, the control operation in step S210 may further include an operation of sliding in a second preset area of the graphical user interface. Based on this, in the specific embodiment of step S210, when it is detected that the touch medium is in a contact state with the rocker, in response to a sliding operation in the second preset area, the display size of the initial selection frame displayed in the graphical user interface is adjusted according to the sliding distance and/or the sliding direction of the sliding operation, so as to generate the target selection frame.
For example, taking a touch medium as an example, a user's finger may press the joystick with a left hand, then the right hand finger slides in the second preset area, and the display size of the initial selection frame is adjusted to generate the target selection frame. For example, a certain area in the graphical user interface may be configured in advance as a selection frame resizing area, i.e., a second preset area. When the touch medium slides in the area, the display size of the generated selection frame can be correspondingly adjusted based on the sliding operation of the touch medium.
The specific embodiment of adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding distance and the sliding direction of the touch medium in the second preset area and according to the sliding distance of the touch medium in the second preset area may refer to the specific embodiment of adjusting the display size of the initial selection frame in the graphical user interface according to the sliding operation of the touch medium in the chassis for the rocker, which is not described herein.
For a specific embodiment of adjusting the display size of the initial selection frame displayed in the graphical user interface according to the sliding direction of the touch medium in the second preset area, the display size of the initial selection frame may be adjusted according to the sliding direction based on the boundary increasing size of the preset initial selection frame.
Taking a rectangular selection frame as an example, suppose the preset increase of the frame boundary in the first direction is a and the preset increase in the second direction is b; that is, during the generation of the target selection frame, each slide of the touch medium increases the boundary size in the first direction by a and the boundary size in the second direction by b. The directions onto which the first slide maps in the first direction and in the second direction are taken as the positive directions; for slides mapped onto a positive direction the boundary increase is positive, and for slides mapped onto the opposite direction it is negative. Suppose that, in the process of generating the target selection frame, the touch medium slides upward to the right 2 times and then downward to the left 1 time, and take the first direction as the horizontal direction and the second direction as the vertical direction; then the rightward horizontal direction and the upward vertical direction are the positive directions, so the display sizes of the initial selection frame in the first and second directions are first increased by 2a and 2b respectively, and then reduced by a and b respectively, finally generating a rectangular selection frame with display size a in the first direction and b in the second direction. The target selection frame may be understood as the selection frame used when the virtual units are finally selected.
Of course, the display size of the initial selection frame may be adjusted according to the sliding direction based on the preset increasing speed of the display boundary, which is similar to the embodiment of adjusting the display size of the initial selection frame according to the sliding direction based on the preset increasing size of the display boundary, and will not be described herein.
With continued reference to fig. 2, in step S220, a target virtual unit is selected from the plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected.
Next, a specific embodiment of step S220 will be further described with reference to fig. 6 to 11, respectively.
Fig. 6 is a flowchart illustrating a method for selecting a virtual unit according to a positional relationship between a target selection frame and a plurality of virtual units to be selected in an exemplary embodiment of the present disclosure. Referring to fig. 6, the method may include steps S610 to S620. Wherein:
in step S610, in response to the sliding operation of the second preset area in the graphical user interface, the pose of the virtual camera in the game is adjusted so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame.
In an exemplary embodiment, a game screen obtained by photographing a part or all of a game scene and a plurality of virtual units located in the game scene is displayed through a graphical user interface. When the pose of the virtual camera changes, the game picture displayed in the graphical user interface is updated, and correspondingly, the display position of the virtual unit to be selected in the graphical user interface also changes along with the updating of the game picture.
For example, a certain area in the graphical user interface may be configured as a pose adjustment area of the virtual camera in advance, that is, a second preset area, and when the touch medium slides in the area, as shown in 35 in fig. 3, the pose of the virtual camera may be correspondingly adjusted based on the sliding operation of the touch medium, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame. The pose of the virtual camera comprises the position and the rotation angle of the virtual camera.
It should be noted that the first preset area and the second preset area are two non-overlapping areas in the graphical user interface. Moreover, when the second preset area is used for adjusting the pose of the virtual camera, it cannot be used for controlling the display size or movement of the target selection frame; when it is used for controlling the display size of the target selection frame, it cannot be used for adjusting the pose of the virtual camera or for moving the target selection frame; and when it is used for moving the target selection frame, it cannot be used for adjusting the pose of the virtual camera or for controlling the display size of the target selection frame.
For example, according to a preset proportional relationship between the sliding distance of the sliding operation and the moving distance and/or rotation angle of the virtual camera, the moving distance and/or rotation angle of the virtual camera may be determined based on the sliding distance of the sliding operation, and the moving direction and/or rotation direction of the virtual camera may be determined based on the sliding direction of the sliding operation; for example, the moving direction of the virtual camera in the virtual scene may be the same as or opposite to the sliding direction. The virtual camera in the game may then be moved or rotated based on the sliding direction and sliding distance of the sliding operation in the second preset area, so as to adjust the pose of the virtual camera.
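A minimal sketch of this camera-pose adjustment, assuming a simple linear ratio between slide distance and camera translation (with an optional rotation term), could look like the following; none of the names or ratios come from the patent.

```python
# Map a slide in the second preset area to a virtual-camera pose change.
import math

def adjust_camera_pose(cam_x, cam_y, cam_yaw_deg,
                       slide_dx, slide_dy,
                       move_ratio=0.05, rotate_ratio=0.0):
    """cam_x / cam_y: camera position in scene units; cam_yaw_deg: rotation.
    slide_dx / slide_dy: slide vector in the second preset area, in px.
    Returns the new (x, y, yaw)."""
    new_x = cam_x + move_ratio * slide_dx      # same direction as the slide
    new_y = cam_y + move_ratio * slide_dy      # (it could equally be opposite)
    distance = math.hypot(slide_dx, slide_dy)
    new_yaw = cam_yaw_deg + rotate_ratio * distance   # optional rotation
    return new_x, new_y, new_yaw

# Sliding 100px to the right pans the camera 5 scene units to the right,
# which shifts every candidate unit's display position in the interface.
print(adjust_camera_pose(0.0, 0.0, 0.0, 100, 0))  # (5.0, 0.0, 0.0)
```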
As described above, when the virtual camera moves, the game scene captured by the virtual camera changes, and the game scene displayed in the gui changes accordingly, so that the display position of the virtual unit in the game scene in the gui changes, and thus the display position of at least one virtual unit to be selected in the gui can be at least partially moved into the selection frame by controlling the movement or rotation of the virtual camera. It will be appreciated that at this point, the location of each virtual unit in the game scene does not change, but rather the location of each virtual unit in the game scene changes as mapped to the display position in the graphical user interface. The virtual unit to be selected may be understood as a virtual unit that the game player wants to select.
In step S620, a target virtual unit is selected from the plurality of virtual units to be selected according to the virtual units to be selected located in the target selection frame at least partially at the display position in the graphical user interface.
Illustratively, a specific implementation of step S620 may include: selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
The display position of the virtual unit to be selected may include the display area occupied by the virtual unit to be selected in the graphical user interface. The position overlapping degree can be understood as the overlap rate between the area of the display area occupied by the virtual unit to be selected in the graphical user interface and the area of the display area occupied by the target selection frame in the graphical user interface.
Taking an example in which A represents the area of the display area occupied by a certain virtual unit to be selected in the graphical user interface and B represents the area of the display area occupied by the target selection frame in the graphical user interface, the position overlapping degree between them can be expressed as the following equation (1) or equation (2):

Position overlapping degree = (A∩B) / A    (1)

Position overlapping degree = (A∩B) / B    (2)

where A∩B represents the intersection of A and B, that is, the area of the region where A overlaps B.
For example, when the target virtual unit is selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame, a virtual unit to be selected whose position overlapping degree is greater than or equal to a preset threshold may be determined as the target virtual unit. The preset threshold can be customized according to the actual situation or requirement and can be any value greater than 0 and less than or equal to 1. Taking the above formula (1) as an example, when the preset threshold is equal to 1, only virtual units whose display positions in the graphical user interface are fully contained within the target selection frame are selected.
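As a hedged illustration of equations (1) and (2) and of the threshold test described above, the following Python sketch approximates display areas as axis-aligned rectangles; the Rect type and the 0.85 default threshold are assumptions made only for this example.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def area(self) -> float:
        return max(0.0, self.right - self.left) * max(0.0, self.bottom - self.top)

def intersection_area(a: Rect, b: Rect) -> float:
    # Area of the region where a and b overlap, i.e. A∩B in equations (1) and (2).
    w = min(a.right, b.right) - max(a.left, b.left)
    h = min(a.bottom, b.bottom) - max(a.top, b.top)
    return max(0.0, w) * max(0.0, h)

def overlap_degree(unit: Rect, frame: Rect) -> float:
    # Equation (1): intersection area divided by the unit's own display area,
    # so a value of 1 means the unit is fully contained in the frame.
    a = unit.area()
    return intersection_area(unit, frame) / a if a > 0 else 0.0

def select_targets(units: list[Rect], frame: Rect, threshold: float = 0.85) -> list[int]:
    # Keep the indices of units whose overlap degree reaches the preset threshold.
    return [i for i, u in enumerate(units) if overlap_degree(u, frame) >= threshold]
```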
In an exemplary embodiment, a line passing through the center point of the chassis of the virtual control and perpendicular to the horizontal direction is taken as a boundary. When the rocker is slid to the right within the chassis, all virtual units contained in the target selection frame may be selected; for example, virtual units whose overlap rate with the target selection frame is greater than 85% may be selected. When the rocker is slid to the left within the chassis, all virtual units in contact with the target selection frame may be selected; for example, virtual units whose overlap rate with the target selection frame is greater than 10% may be selected. That is, the preset threshold corresponding to the overlap rate when the rocker is slid to the right may be greater than the preset threshold when the rocker is slid to the left. Of course, the opposite may also be the case, or no distinction may be made between sliding to the right and sliding to the left, that is, only one preset threshold is set; for example, regardless of whether the rocker is slid to the right or to the left within the chassis, virtual units with an overlap rate greater than 90% are selected as target virtual units.
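A minimal sketch of this direction-dependent threshold is given below; the 85% and 10% values are the example figures quoted above, and the coordinate convention (comparing the rocker's x-position with the chassis center) is an assumption.

```python
def overlap_threshold(rocker_x: float, chassis_center_x: float,
                      right_threshold: float = 0.85,
                      left_threshold: float = 0.10) -> float:
    # Sliding the rocker to the right of the vertical line through the chassis
    # center uses the stricter threshold; sliding to the left uses the looser one.
    return right_threshold if rocker_x >= chassis_center_x else left_threshold
```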
Through steps S610 to S620, the display size of the generated target selection frame may be determined based on the sliding distance of the rocker within the chassis, and the pose of the virtual camera may be adjusted based on the sliding of the touch medium in the second preset area, so as to move the display position of the virtual unit to be selected in the graphical user interface into the target selection frame, thereby selecting the virtual unit. The target selection frame itself does not move during the whole process.
In another exemplary embodiment, the display size of the generated target selection frame may be determined based on the sliding distance of the rocker in the chassis, and the generated target selection frame may be moved based on the sliding operation of the touch medium in the second preset area, so as to implement selection of the virtual unit.
Fig. 7 is a flow chart illustrating another method for selecting virtual units according to a target selection frame in an exemplary embodiment of the present disclosure. Referring to fig. 7, the method may include steps S710 to S720. Wherein,
in step S710, in response to the sliding operation of the second preset area in the graphical user interface, the target selection frame is moved in the graphical user interface, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located in the target selection frame.
The second preset area in step S710 may include any area in the graphical user interface where there is no overlapping portion with the first preset area, and the display size of the second preset area may be customized according to the requirement. When the touch medium slides in the second preset area, the target selection frame can be moved in the graphical user interface based on the sliding operation of the touch medium, so that the display position of the virtual unit to be selected is at least partially located in the target selection frame.
For example, the moving distance of the target selection frame corresponding to the sliding distance of the current sliding operation may be determined according to a preset proportional relationship between the sliding distance of the sliding operation and the moving distance of the target selection frame. The moving direction of the target selection frame is determined based on the sliding direction of the sliding operation; that is, the sliding direction of the sliding operation and the moving direction of the target selection frame coincide. The target selection frame is then moved based on the sliding distance and the sliding direction of the sliding operation, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located within the target selection frame.
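A possible sketch of this mapping, under the assumption of a simple per-pixel ratio and a frame represented by its center point, is:

```python
FRAME_MOVE_RATIO = 1.0  # frame pixels moved per pixel of sliding (assumed)

def move_frame(center: tuple[float, float], dx: float, dy: float) -> tuple[float, float]:
    # The frame moves in the same direction as the slide, scaled by the preset ratio.
    cx, cy = center
    return (cx + dx * FRAME_MOVE_RATIO, cy + dy * FRAME_MOVE_RATIO)
```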
Next, in step S720, a target virtual unit is selected from the plurality of virtual units to be selected according to the virtual units to be selected whose display positions in the graphical user interface are at least partially located within the target selection frame.
The specific embodiment of step S720 is identical to the specific embodiment of step S620 described above, and will not be described here.
It should be noted that, in the embodiment shown in fig. 7, the virtual unit is selected by moving the target selection frame in the graphical user interface; that is, only virtual units currently displayed in the graphical user interface can be selected. Therefore, when the embodiment shown in fig. 7 is used to select virtual units, before the initial selection frame is activated, the pose of the virtual camera may be adjusted by sliding in the second preset area so that the virtual units to be selected are displayed in the graphical user interface. The initial selection frame is then activated to select the virtual units. After the initial selection frame is activated, the second preset area can only be used for adjusting the display position of the generated target selection frame in the graphical user interface and cannot be used for adjusting the pose of the virtual camera.
In the embodiment shown in fig. 6, the display position of the virtual unit to be selected in the graphical user interface can be moved into the target selection frame by adjusting the pose of the virtual camera during the actual selection, so that any virtual unit to be selected in the virtual scene can be selected; therefore, the virtual unit to be selected does not need to be displayed in the graphical user interface in advance. Of course, in order to improve selection efficiency, the display position of the virtual unit to be selected may be moved near the initial selection frame in advance by adjusting the pose of the virtual camera, so that when the virtual unit is actually selected, only fine adjustment of the virtual camera's pose, or even no adjustment at all, is needed for the generated target selection frame to select the virtual unit to be selected.
In still another exemplary embodiment, when the display size of the initial selection frame is adjusted in step S210 based on the pressing force value of the touch medium on the rocker, or based on the sliding operation of the touch medium in the second preset area, the target selection frame may be moved in the graphical user interface in step S220 through the sliding operation on the rocker, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located within the target selection frame, thereby selecting the virtual unit.
Fig. 8 is a flowchart illustrating a method for selecting a virtual unit according to a positional relationship between a target selection frame and a plurality of virtual units to be selected in accordance with still another exemplary embodiment of the present disclosure. Referring to fig. 8, the method may include steps S810 to S820. Wherein:
in step S810, in response to a sliding operation of the touch medium in the chassis with respect to the joystick, the target selection frame is moved in the graphical user interface, so that a display position of a virtual unit to be selected in the graphical user interface is located at least partially within the target selection frame.
Illustratively, the central axes of the chassis in the horizontal direction and in the vertical direction are taken as dividing lines. When the rocker is positioned above the horizontal central axis, the target selection frame moves upward; otherwise, it moves downward. When the rocker is positioned to the right of the vertical central axis, the target selection frame moves rightward; otherwise, it moves leftward. In this way, the moving direction of the target selection frame in the graphical user interface is determined based on the sliding direction of the rocker, and the moving distance of the target selection frame in the graphical user interface is determined based on a preset proportional relationship between the sliding distance of the rocker and the moving distance of the target selection frame.
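The rocker-driven movement in step S810 might be sketched as follows; screen coordinates with y increasing downward, and the ROCKER_MOVE_RATIO constant, are assumptions for illustration.

```python
ROCKER_MOVE_RATIO = 4.0  # frame pixels per pixel of rocker displacement (assumed)

def move_frame_by_rocker(frame_center: tuple[float, float],
                         rocker_pos: tuple[float, float],
                         chassis_center: tuple[float, float]) -> tuple[float, float]:
    fx, fy = frame_center
    dx = rocker_pos[0] - chassis_center[0]  # > 0: rocker right of the vertical axis
    dy = rocker_pos[1] - chassis_center[1]  # > 0: rocker below the horizontal axis
    # The rocker's offset from the chassis center gives both the direction and
    # (via the preset ratio) the distance of the frame's movement.
    return (fx + dx * ROCKER_MOVE_RATIO, fy + dy * ROCKER_MOVE_RATIO)
```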
In step S820, a target virtual unit is selected from the plurality of virtual units to be selected according to the virtual units to be selected whose display positions in the graphical user interface are at least partially located within the target selection frame.
The specific embodiment of step S820 is the same as the specific embodiment of step S620 described above, and will not be described here again.
When the display size of the initial selection frame is adjusted in step S210 based on the pressing force value of the touch medium on the rocker, steps S810 to S820 can assist the user in completing the selection of virtual units with one hand. Specifically, the user may control the size of the generated target selection frame through the force with which a finger presses the rocker; as shown in fig. 9, the target selection frame generated according to the current pressing force value may be 91. The user then continues to press the virtual rocker and slides it within the chassis to move the target selection frame 91, obtaining the target selection frame 101 in fig. 10, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located within the target selection frame 101. When the finger is released, the target virtual unit is selected from the plurality of virtual units to be selected according to the current display size and display position of the target selection frame 101 and the position overlapping degree with the display positions of the virtual units to be selected in the graphical user interface. As shown in fig. 11, the finally selected target virtual units are 111, 112, and 113. In this way, occlusion of the user's field of view during virtual unit selection is avoided, the accuracy of the user's operation is improved, and the efficiency of virtual unit selection is improved.
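The one-handed flow just described can be sketched as a small event handler; the PRESSURE_TO_SIZE constant and the event method names are assumptions, and the selection on release would follow the overlap-ratio rule sketched after equation (1).

```python
PRESSURE_TO_SIZE = 200.0  # frame side length per unit of pressing force (assumed)

class OneHandSelector:
    """Press to size the target frame, slide the rocker to move it, release to select."""

    def __init__(self, preset_center: tuple[float, float]):
        self.center = preset_center  # preset display position of the frame center
        self.size = 0.0

    def on_press(self, force: float) -> None:
        # The pressing force value determines the display size of the target frame.
        self.size = force * PRESSURE_TO_SIZE

    def on_rocker_slide(self, dx: float, dy: float) -> None:
        # Sliding the pressed rocker within the chassis moves the generated frame.
        cx, cy = self.center
        self.center = (cx + dx, cy + dy)

    def on_release(self) -> tuple[float, float, float, float]:
        # Releasing the finger fixes the frame (left, top, right, bottom); units
        # overlapping it beyond the preset threshold are then selected.
        half = self.size / 2.0
        cx, cy = self.center
        return (cx - half, cy - half, cx + half, cy + half)
```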
In an exemplary embodiment, the size of the generated target selection frame may be controlled by the force with which the touch medium presses the rocker, and the position of the generated target selection frame, or the pose of the virtual camera, may be adjusted by a sliding operation in the second preset area to adjust the display position of the virtual unit in the graphical user interface, so that the virtual unit to be selected is framed within the generated target selection frame and thereby selected. In this case, the virtual control may include only one touch button; for example, only the rocker is kept, without a chassis, and the rocker cannot be slid in the process of selecting the virtual unit. The present exemplary embodiment is not particularly limited in this respect.
Illustratively, selecting the target virtual unit from the plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected, including: when the touch medium and the virtual control are detected to change from a contact state to a non-contact state, selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected.
For example, in an exemplary application scenario, as shown in fig. 3, the user may press within the first preset area 31 with the left hand; a virtual control 32 is displayed at the pressed position, and the pressed position is the display position of the rocker in the virtual control. The user may slide the rocker within the chassis with the left hand to control the size of the generated target selection frame, and slide within the second preset area 35 with the right hand to adjust the pose of the virtual camera, so that the display position of the virtual unit the user wants to select is at least partially located within the target selection frame in the graphical user interface.
In this exemplary application scenario, during the selection of the virtual unit, the display position of the target selection frame in the graphical user interface does not change; that is, the center point of the target selection frame always remains at the preset position of the graphical user interface, and the display positions of the virtual units in the graphical user interface are adjusted through the pose change of the virtual camera, so that the virtual units to be selected are framed within the target selection frame and selected. Throughout the process, the game location of each virtual unit in the game scene does not change; only the display position in the graphical user interface to which that location is mapped changes.
In another exemplary application scenario, as shown in fig. 3, the user may press within the first preset area 31 with the left hand; a virtual control 32 is displayed at the pressed position, and the pressed position is the display position of the rocker 321 in the virtual control. The user may then continue pressing the rocker with the left hand and slide it within the chassis to control the size of the generated target selection frame, while sliding within the second preset area 35 with the right hand to move the currently generated target selection frame in the graphical user interface, so that the display position of the virtual unit the user wants to select is at least partially located within the target selection frame in the graphical user interface. When the display positions of the virtual units to be selected in the graphical user interface are at least partially located within the target selection frame, the user may release the left-hand finger; at this time, the virtual units are selected according to the overlap rate between their display positions in the graphical user interface and the currently generated target selection frame, that is, virtual units whose overlap rate with the currently generated target selection frame is greater than a preset threshold are selected. In this exemplary application scenario, during virtual unit selection, the display positions of the virtual units in the graphical user interface do not change, and the virtual units to be selected are framed within the target selection frame by moving the target selection frame.
In yet another exemplary application scenario, in the graphical user interface shown in fig. 9, the user may control the size of the generated target selection frame by pressing the rocker with the left hand, and then slide the rocker with the left hand to adjust the display position of the generated target selection frame in the graphical user interface, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located within the generated target selection frame. The left hand is then released; at this time, the virtual units are selected according to the overlap rate between their display positions in the graphical user interface and the currently generated target selection frame, that is, virtual units whose overlap rate with the currently generated target selection frame is greater than a preset threshold are selected. In this exemplary application scenario, during virtual unit selection, the display positions of the virtual units in the graphical user interface do not change, and the virtual units to be selected are framed within the target selection frame by moving the target selection frame.
In yet another exemplary application scenario, as shown in fig. 3, the user may press within the first preset area 31 with the left hand; a virtual control 32 is displayed at the pressed position, and the pressed position is the display position of the rocker 321 in the virtual control. Then, the user may slide within the second preset area with the right hand to adjust the size of the initial selection frame 33 and generate the target selection frame, while controlling the rocker with the left hand to slide within the chassis so as to move the target selection frame in the graphical user interface, so that the display position of the virtual unit to be selected in the graphical user interface is at least partially located within the generated target selection frame. When it is determined that the display positions of the virtual units to be selected are at least partially located within the target selection frame in the graphical user interface, the user may release the left-hand finger; at this time, the virtual units are selected according to the position overlapping degree between their display positions in the graphical user interface and the currently generated target selection frame, for example, virtual units whose overlap rate with the currently generated target selection frame is greater than a preset threshold are selected. In this exemplary application scenario, the generated target selection frame is moved through the movement control of the rocker, and the display size of the generated target selection frame is controlled through the sliding in the second preset area, thereby selecting the virtual unit.
It should be noted that, in the present disclosure, when the left finger leaves the virtual control, the virtual rocker returns to the center position of the chassis.
In an exemplary embodiment, after the virtual units are selected, the successfully selected virtual units may be specially marked to remind the user which virtual units are currently selected, and the generated target selection frame may be cancelled. The preset selection mark may be customized according to requirements; for example, a yellow circle may be added around each selected virtual unit, which is not limited in this exemplary embodiment. At the same time, the rocker in the virtual control returns to the center position of the chassis. Alternatively, the virtual control may be hidden in the first preset area, that is, the virtual control is no longer displayed and is redisplayed the next time the user triggers the first preset area.
In an exemplary embodiment, after the selection of the virtual unit is completed, the game of the current client may exit the frame selection state (the frame selection state may be understood as the state corresponding to the process of selecting virtual units), that is, the game of the current client is in the non-frame-selection state. In the non-frame-selection state, the second preset area, or any area in the whole graphical user interface, can be used for controlling pose adjustment of the virtual camera so as to update the game screen displayed in the graphical user interface.
With continued reference to fig. 2, next, in step S230, the target virtual unit is controlled to execute a game action.
In an exemplary application scenario, after selecting a target virtual unit from the plurality of virtual units to be selected, the game player may control the selected target virtual unit to execute a corresponding game action according to the player's own needs, such as controlling the target virtual units to move or attack, or grouping the target virtual units, which is not particularly limited in the present exemplary embodiment.
In the above method, a selection frame can be quickly generated through the virtual control to select virtual units, which improves the efficiency of virtual unit selection. Meanwhile, the size of the generated target selection frame can be controlled through the cooperation of the virtual control and the second preset area, so that different numbers of virtual units can be selected, improving the applicability of virtual unit selection; the target selection frame can also be moved, assisting the user in framing the virtual units to be selected within the generated target selection frame and avoiding the selection of unwanted virtual units. Combining the two ensures the accuracy of virtual unit selection. Furthermore, the selection of virtual units can be completed through operations within a small range, and the user can even be assisted in completing fast and accurate selection of virtual units with one hand, which reduces occlusion of the user's field of view during the selection process.
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments may be implemented as a computer program executed by a CPU. When the computer program is executed by the CPU, it performs the functions defined by the above-described method provided by the present invention. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
Furthermore, it should be noted that the above-described figures are merely illustrative of the processes involved in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Fig. 12 is a schematic diagram showing the configuration of a virtual unit selection apparatus in a game according to an exemplary embodiment of the present disclosure. The apparatus displays a game screen of the game through a graphical user interface of a display assembly, the game screen including part or all of a game scene and a plurality of virtual units to be selected located in the game scene. Referring to fig. 12, the apparatus 1200 may include a target selection frame generation module 1210, a selection module 1220, and a control module 1230. Wherein:
A target selection frame generation module 1210 configured to adjust a display size of an initial selection frame displayed in a graphical user interface in response to a control operation of a touch medium for a virtual control in the graphical user interface to generate a target selection frame;
the selecting module 1220 is configured to select a target virtual unit from the plurality of virtual units to be selected according to the positional relationship between the target selection frame and the plurality of virtual units to be selected.
A control module 1230 configured to control the target virtual unit to perform a game action.
In an exemplary implementation, based on the foregoing embodiment, the apparatus 1200 may further include an activation display module configured to:
and responding to the triggering operation of the touch medium for a first preset area in the graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
In an exemplary implementation, based on the foregoing embodiment, the virtual control includes a rocker and a chassis, the rocker being located in the chassis;
the target selection box generation module 1210 may be specifically configured to: and in response to the sliding operation of the touch medium on the rocker in the chassis, adjusting the display size of an initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation so as to generate a target selection frame.
In an exemplary embodiment, based on the foregoing embodiment, a game screen obtained by the virtual camera capturing part or all of the game scene and the plurality of virtual units located in the game scene is displayed through the graphical user interface, and the selecting module 1220 may be specifically configured to:
responding to the sliding operation of a second preset area in a graphical user interface, and adjusting the pose of a virtual camera in the game so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
In an exemplary implementation, based on the foregoing embodiment, the selecting module 1220 may be specifically configured to:
responding to a sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
And selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
In an exemplary implementation, based on the foregoing embodiment, the target selection box generation module 1210 may be specifically configured to:
detecting a pressing force value of the touch medium for the rocker in response to a pressing operation of the touch medium for the rocker;
and adjusting the display size of the initial selection frame displayed in the graphical user interface according to the pressing force value based on a preset mapping relation so as to generate a target selection frame.
In an exemplary implementation, based on the foregoing embodiment, the virtual control includes a rocker and a chassis, the rocker being located in the chassis; the target selection box generation module 1210 may be specifically configured to:
and when the touch medium is detected to be in a contact state with the rocker, responding to the sliding operation in a second preset area, and adjusting the display size of an initial selection frame displayed in the graphical user interface according to the sliding distance and/or the sliding direction of the sliding operation so as to generate a target selection frame.
In an exemplary implementation, based on the foregoing embodiment, the selecting module 1220 may be specifically configured to:
responding to the sliding operation of a touch medium in the chassis for the rocker, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
In an exemplary implementation, based on the foregoing embodiment, selecting the target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected whose display positions in the graphical user interface are at least partially located within the target selection frame includes:
and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
In an exemplary implementation, based on the foregoing embodiment, the selecting module 1220 may be specifically configured to:
When the touch medium and the virtual control are detected to change from a contact state to a non-contact state, selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected.
The specific details of each module in the above-mentioned virtual unit selection device in the game are described in detail in the corresponding virtual unit selection method in the game, so that the details are not repeated here.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the steps of the methods in the present disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order or that all illustrated steps be performed in order to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a mobile terminal, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer storage medium capable of implementing the above method is also provided. On which a program product is stored which enables the implementation of the method described above in the present specification. In some possible embodiments, the various aspects of the present disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
Referring to fig. 13, a program product 1300 for implementing the above-described method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein generally as a "circuit," "module," or "system."
An electronic device 1400 according to such an embodiment of the present disclosure is described below with reference to fig. 14. The electronic device 1400 shown in fig. 14 is merely an example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 14, the electronic device 1400 is embodied in the form of a general purpose computing device. Components of electronic device 1400 may include, but are not limited to: the at least one processing unit 1410, the at least one memory unit 1420, a bus 1430 connecting the different system components (including the memory unit 1420 and the processing unit 1410), and a display unit 1440.
Wherein the storage unit stores program code that is executable by the processing unit 1410 such that the processing unit 1410 performs steps according to various exemplary embodiments of the present disclosure described in the above section of the present description of exemplary methods. For example, the processing unit 1410 may perform the steps as shown in fig. 2: step S210, in response to control operation of the touch medium on a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame; step S220, selecting virtual units in the game according to the target selection frame.
The memory unit 1420 may include readable media in the form of volatile memory units, such as Random Access Memory (RAM) 14201 and/or cache memory 14202, and may further include Read Only Memory (ROM) 14203.
The memory unit 1420 may also include a program/utility 14204 having a set (at least one) of program modules 14205, such program modules 14205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1430 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1400 may also communicate with one or more external devices 1500 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 1400, and/or any device (e.g., router, modem, etc.) that enables the electronic device 1400 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1450. Also, electronic device 1400 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 1460. As shown, the network adapter 1460 communicates with other modules of the electronic device 1400 via the bus 1430. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 1400, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (11)

1. A method for selecting virtual units in a game, wherein a game picture of the game is displayed through a graphical user interface of a display assembly, the game picture comprises a part or all of a game scene and a plurality of virtual units to be selected positioned in the game scene, and the method comprises:
responding to control operation of the touch medium for a virtual control in a graphical user interface, and adjusting the display size of an initial selection frame displayed in the graphical user interface to generate a target selection frame;
selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
controlling the target virtual unit to execute game actions;
displaying a game picture obtained by shooting part or all of game scenes and a plurality of virtual units positioned in the game scenes through a graphical user interface;
selecting the target virtual unit from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected, including:
responding to the sliding operation of a second preset area in a graphical user interface, and adjusting the pose of a virtual camera in the game so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
Selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, at a display position in the graphical user interface; or alternatively
Selecting the target virtual unit from the plurality of virtual units to be selected according to the position relationship between the target selection frame and the plurality of virtual units to be selected, including:
responding to a sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
2. The method of claim 1, wherein prior to responding to a control operation of the touch medium for a virtual control in the graphical user interface, the method further comprises:
and responding to the triggering operation of the touch medium for a first preset area in the graphical user interface, displaying a virtual control in the graphical user interface, and displaying an initial selection frame at a preset position of the graphical user interface.
3. The method of in-game virtual unit selection according to claim 1 or 2, wherein the virtual control comprises a rocker and a chassis, the rocker being located in the chassis;
the step of responding to the control operation of the touch medium for the virtual control in the graphical user interface, the step of adjusting the display size of the initial selection frame displayed in the graphical user interface to generate a target selection frame comprises the following steps:
and in response to the sliding operation of the touch medium on the rocker in the chassis, adjusting the display size of an initial selection frame displayed in the graphical user interface according to the sliding distance of the sliding operation so as to generate a target selection frame.
4. The method of claim 1, wherein the virtual control comprises a rocker and a chassis, the rocker being located in the chassis;
the method for generating the target selection frame comprises the steps of responding to control operation of a touch medium for a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate the target selection frame, and comprising the following steps:
detecting a pressing force value of the touch medium for the rocker in response to a pressing operation of the touch medium for the rocker;
And adjusting the display size of the initial selection frame displayed in the graphical user interface according to the pressing force value based on a preset mapping relation so as to generate a target selection frame.
5. The method of claim 1, wherein the virtual control comprises a rocker and a chassis, the rocker being located in the chassis;
the method for generating the target selection frame comprises the steps of responding to control operation of a touch medium for a virtual control in a graphical user interface, adjusting the display size of an initial selection frame displayed in the graphical user interface to generate the target selection frame, and comprising the following steps:
and when the touch medium is detected to be in a contact state with the rocker, responding to the sliding operation in a second preset area, and adjusting the display size of an initial selection frame displayed in the graphical user interface according to the sliding distance and/or the sliding direction of the sliding operation so as to generate a target selection frame.
6. The method for selecting a virtual unit in a game according to claim 4 or 5, wherein selecting a target virtual unit from a plurality of virtual units to be selected according to a positional relationship between the target selection frame and the plurality of virtual units to be selected, comprises:
Responding to the sliding operation of a touch medium in the chassis for the rocker, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
7. The method for selecting virtual units in a game according to claim 1, wherein the selecting a target virtual unit from the plurality of virtual units to be selected based on the virtual units to be selected located at least partially within the target selection frame at a display position in the graphical user interface comprises:
and selecting a target virtual unit from the plurality of virtual units to be selected according to the position overlapping degree between the display position of the virtual unit to be selected and the target selection frame.
8. The method for selecting a virtual unit in a game according to claim 1, wherein selecting a target virtual unit from a plurality of virtual units to be selected according to a positional relationship between the target selection frame and the plurality of virtual units to be selected, comprises:
When the touch medium and the virtual control are detected to change from a contact state to a non-contact state, selecting a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected.
9. A virtual unit selection apparatus in a game applied to the virtual unit selection method in a game of claim 1, wherein a game screen of the game is presented through a graphical user interface of a display assembly, the game screen including a part or all of a game scene and a plurality of virtual units to be selected located in the game scene, the apparatus comprising:
the target selection frame generation module is configured to respond to control operation of the touch medium on the virtual control in the graphical user interface, and adjust the display size of the initial selection frame displayed in the graphical user interface to generate a target selection frame;
the selection module is configured to select a target virtual unit from the plurality of virtual units to be selected according to the position relation between the target selection frame and the plurality of virtual units to be selected;
a control module configured to control the target virtual unit to perform a game action;
Displaying a game picture obtained by shooting part or all of game scenes and a plurality of virtual units positioned in the game scenes through a graphical user interface;
the selection module is configured to:
responding to the sliding operation of a second preset area in a graphical user interface, and adjusting the pose of a virtual camera in the game so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, at a display position in the graphical user interface; or alternatively
The selection module is configured to:
responding to a sliding operation of a second preset area in a graphical user interface, and moving the target selection frame in the graphical user interface so that the display position of a virtual unit to be selected in the graphical user interface is at least partially positioned in the target selection frame;
and selecting a target virtual unit from the plurality of virtual units to be selected according to the virtual units to be selected, which are at least partially positioned in the target selection frame, of the display positions in the graphical user interface.
10. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1 to 8.
11. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which when executed by the one or more processors cause the one or more processors to implement the method of any of claims 1 to 8.
CN202110856819.8A 2021-07-28 2021-07-28 Virtual unit selection method and device in game, storage medium and electronic equipment Active CN113457144B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110856819.8A CN113457144B (en) 2021-07-28 2021-07-28 Virtual unit selection method and device in game, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110856819.8A CN113457144B (en) 2021-07-28 2021-07-28 Virtual unit selection method and device in game, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113457144A CN113457144A (en) 2021-10-01
CN113457144B true CN113457144B (en) 2024-02-02

Family

ID=77882860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110856819.8A Active CN113457144B (en) 2021-07-28 2021-07-28 Virtual unit selection method and device in game, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113457144B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114377383A (en) * 2021-12-02 2022-04-22 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109771941A (en) * 2019-03-13 2019-05-21 网易(杭州)网络有限公司 Selection method and device, the equipment and medium of virtual objects in game
JP2019100417A (en) * 2017-11-30 2019-06-24 ジヤトコ株式会社 Vehicle control device and vehicle control method
CN110302530A (en) * 2019-08-08 2019-10-08 网易(杭州)网络有限公司 Virtual unit control method, device, electronic equipment and storage medium
CN110420459A (en) * 2019-07-29 2019-11-08 网易(杭州)网络有限公司 Virtual unit is formed a team control method, device, electronic equipment and storage medium
CN110665225A (en) * 2019-10-08 2020-01-10 网易(杭州)网络有限公司 Control method and device in game
CN111346369A (en) * 2020-03-02 2020-06-30 网易(杭州)网络有限公司 Shooting game interaction method and device, electronic equipment and storage medium
CN112933591A (en) * 2021-03-15 2021-06-11 网易(杭州)网络有限公司 Method and device for controlling game virtual character, storage medium and electronic equipment


Also Published As

Publication number Publication date
CN113457144A (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
US20220326844A1 (en) Displaying a three dimensional user interface
CN107977141B (en) Interaction control method and device, electronic equipment and storage medium
WO2018177170A1 (en) Display control method and apparatus for game picture, storage medium and electronic device
JP6013583B2 (en) Method for emphasizing effective interface elements
US9436369B2 (en) Touch interface for precise rotation of an object
US9437038B1 (en) Simulating three-dimensional views using depth relationships among planes of content
US20140118268A1 (en) Touch screen operation using additional inputs
JP6048898B2 (en) Information display device, information display method, and information display program
US10191612B2 (en) Three-dimensional virtualization
CN112907760B (en) Three-dimensional object labeling method and device, tool, electronic equipment and storage medium
CN113546419B (en) Game map display method, game map display device, terminal and storage medium
CN113559501B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN108355352B (en) Virtual object control method and device, electronic device and storage medium
CN111481923B (en) Rocker display method and device, computer storage medium and electronic equipment
WO2019166005A1 (en) Smart terminal, sensing control method therefor, and apparatus having storage function
CN113457144B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN113457117B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
KR20180058097A (en) Electronic device for displaying image and method for controlling thereof
CN112473138B (en) Game display control method and device, readable storage medium and electronic equipment
CN108499102B (en) Information interface display method and device, storage medium and electronic equipment
CN113360064A (en) Method and device for searching local area of picture, medium and electronic equipment
CN117695648A (en) Virtual character movement and visual angle control method, device, electronic equipment and medium
EP4207090A1 (en) A method of learning a target object using a virtual viewpoint camera and a method of augmenting a virtual model on a real object that implements the target object using the same
CN116020110A (en) Game picture control method and device, computer storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant