CN114504811A - Virtual object selection method and device, storage medium and processor - Google Patents

Virtual object selection method and device, storage medium and processor

Info

Publication number
CN114504811A
CN114504811A (application number CN202210094646.5A)
Authority
CN
China
Prior art keywords
touch
area
touch area
user interface
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210094646.5A
Other languages
Chinese (zh)
Inventor
许展豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210094646.5A priority Critical patent/CN114504811A/en
Publication of CN114504811A publication Critical patent/CN114504811A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486: Drag-and-drop
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a virtual object selection method, a virtual object selection device, a storage medium and a processor. The method comprises the following steps: generating a third touch area in a first touch area in response to a first touch operation acting on a second touch area; detecting a second touch operation acting on the third touch area, and generating a target selection area on the graphical user interface based on the movement of a touch point of the second touch operation; and selecting a virtual object in the target selection area. The method and the device achieve the technical effect of improving the efficiency of selecting virtual objects.

Description

Virtual object selection method and device, storage medium and processor
Technical Field
The present invention relates to the field of computers, and in particular, to a method, an apparatus, a storage medium, and a processor for selecting a virtual object.
Background
At present, in games with high real-time requirements, such as real-time strategy games, one of the most important operations is the frame selection (box selection) of virtual objects.
In the related art, frame selection can be triggered by long-pressing the screen. However, this interaction is not responsive enough, since a response is only produced after the long press completes, and it is also easily confused with other operations, so the efficiency of selecting virtual objects is low.
No effective solution has yet been proposed for the problem of low efficiency in selecting virtual objects in the prior art.
Disclosure of Invention
The invention mainly aims to provide a virtual object selection method, a virtual object selection device, a storage medium, and a processor, so as to at least solve the technical problem of low efficiency in selecting virtual objects.
In order to achieve the above object, according to one aspect of the present invention, there is provided a method of selecting a virtual object. The method can comprise the following steps: responding to a first touch operation acting on the second touch area, and generating a third touch area in the first touch area; detecting a second touch operation acting on the third touch area, and generating a target selection area on the graphical user interface based on the movement of a touch point of the second touch operation; a virtual object in the target selection area is selected.
Optionally, generating a third touch area in the first touch area in response to the first touch operation acting on the second touch area includes: responding to a first sub-touch operation acting on a second touch area, and determining an original position on the graphical user interface based on an original touch point of the first sub-touch operation, wherein the second touch operation comprises the first sub-touch operation; generating a fourth touch area in the graphical user interface based on the original position; a third touch area is generated in the first touch area based on the fourth touch area.
Optionally, generating a third touch area in the first touch area based on the fourth touch area includes: and responding to a second sub-touch operation acting on the second touch area, and converting the fourth touch area from the original position to the target position to obtain a third touch area, wherein the second touch operation comprises the second sub-touch operation.
Optionally, generating a third touch area in the first touch area based on the fourth touch area includes: and determining the fourth touch area as a third touch area.
Optionally, determining an original position on the graphical user interface based on the original touch point of the first sub-touch operation, including: determining a target direction of the original touch point pointing to the graphical user interface, wherein the target direction is perpendicular to the edge of the terminal device; an original position is determined along the target direction on the graphical user interface.
Optionally, the second touch operation is a sliding touch operation, and detecting the second touch operation acting on the third touch area and generating a target selection area on the graphical user interface based on movement of a touch point of the second touch operation includes: in response to a sliding touch operation acting on the third touch area and a fifth touch area, determining a start touch point when the sliding touch operation starts in the third touch area, and determining an end touch point when the sliding touch operation ends in the fifth touch area, wherein the fifth touch area is the area on the graphical user interface other than the third touch area; and generating a target selection area on the graphical user interface based on the start touch point and the end touch point.
Optionally, generating a target selection area on the graphical user interface based on the start touch point and the end touch point includes: determining the length and width of the target selection area based on the starting touch point and the ending touch point; and determining a target selection area based on the length and the width, wherein the target selection area is a rectangular area.
Optionally, if it is determined that the virtual object does not appear in the target selection area, adjusting the target selection area in response to a third touch operation applied to the target selection area until the virtual object appears in the target selection area.
Optionally, when it is detected that the first touch operation is ended, the third touch area is cancelled in the first touch area.
In order to achieve the above object, according to another aspect of the present invention, there is also provided a virtual object selection apparatus. The apparatus may include: the first generating unit is used for responding to a first touch operation acted on the second touch area and generating a third touch area in the first touch area; a second generation unit configured to detect a second touch operation applied to the third touch area, and generate a target selection area on the graphical user interface based on movement of a touch point of the second touch operation; and the selection unit is used for selecting the virtual object in the target selection area.
In order to achieve the above object, according to another aspect of the present invention, there is also provided a non-volatile storage medium. The non-volatile storage medium stores a computer program, wherein when the computer program is executed by a processor, the apparatus in which the non-volatile storage medium is located is controlled to execute the virtual object selection method of the embodiments of the present invention.
To achieve the above object, according to another aspect of the present invention, there is also provided a processor. The processor is configured to run a program, wherein the program is configured to perform the method for selecting a virtual object of an embodiment of the invention when running.
In order to achieve the above object, according to another aspect of the present invention, an electronic device is provided. The electronic device comprises a memory and a processor, and is characterized in that the memory stores a computer program, and the processor is configured to run the computer program to execute the virtual object selection method of the embodiment of the invention.
In at least some embodiments of the present invention, a third touch area is generated in the first touch area in response to a first touch operation acting on the second touch area; a second touch operation acting on the third touch area is detected, and a target selection area is generated on the graphical user interface based on the movement of a touch point of the second touch operation; and a virtual object in the target selection area is selected. In other words, the invention generates a third touch area (hot area) in the first touch area (front screen) through an operation on the second touch area (dedicated touch area), and determines the selection box based on the second touch operation acting on the third touch area, thereby achieving the technical effect of improving the efficiency of selecting virtual objects and solving the technical problem of low efficiency in selecting virtual objects.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a block diagram of a hardware structure of a mobile terminal for a virtual object selection method according to an embodiment of the present invention;
FIG. 2 is a flow diagram of a method for selecting a virtual object according to one embodiment of the invention;
FIG. 3 is a schematic diagram of a selection interface for a virtual object according to one embodiment of the invention;
FIG. 4 is a block diagram of a virtual object selection apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with one embodiment of the present invention, there is provided an embodiment of a method for selecting a virtual object, it should be noted that the steps illustrated in the flowchart of the accompanying drawings may be performed in a computer system such as a set of computer-executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
The method embodiments may be performed in a mobile terminal, a computer terminal, or a similar computing device. Taking running on a mobile terminal as an example, the mobile terminal may be a terminal device such as a smart phone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile internet device (MID), a PAD, a game machine, or the like. FIG. 1 is a block diagram of a hardware structure of a mobile terminal for a virtual object selection method according to an embodiment of the present invention. As shown in FIG. 1, the mobile terminal may include one or more (only one is shown in FIG. 1) processors 102 (the processor 102 may include, but is not limited to, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), a field-programmable gate array (FPGA), a neural network processor (NPU), a tensor processing unit (TPU), an artificial intelligence (AI) processor, or the like) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106, an input/output device 108, and a display device 110 for communication functions. It will be understood by those skilled in the art that the structure shown in FIG. 1 is only illustrative and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration from that shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the virtual object selection method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, so as to implement the above-mentioned virtual object selection method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The inputs in the input output Device 108 may come from a plurality of Human Interface Devices (HIDs). For example: keyboard and mouse, game pad, other special game controller (such as steering wheel, fishing rod, dance mat, remote controller, etc.). Some human interface devices may provide output functions in addition to input functions, such as: force feedback and vibration of the gamepad, audio output of the controller, etc.
The display device 110 may be, for example, a head-up display (HUD), a touch-screen liquid crystal display (LCD), or a touch display (also referred to as a "touch screen" or "touch display screen"). The liquid crystal display enables a user to interact with the user interface of the mobile terminal. In some embodiments, the mobile terminal has a graphical user interface (GUI) with which the user can interact through finger contacts and/or gestures on a touch-sensitive surface. The human-machine interaction functions optionally include the following interactions: creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, e-mailing, call interfacing, playing digital video, playing digital music, and/or web browsing; the executable instructions for performing these human-computer interaction functions are configured/stored in one or more processor-executable computer program products or readable storage media.
The virtual object selection method in one embodiment of the present disclosure may be executed on a local terminal device or a server. When the virtual object selection method is executed on a server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and a client device.
In an optional embodiment, various cloud applications may be run under the cloud interaction system, for example, cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the cloud game operation mode, the entity that runs the game program is separated from the entity that presents the game picture: the storage and execution of the virtual object selection method are completed on the cloud game server, while the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the selection of the virtual object is performed by the cloud game server in the cloud. When a game is played, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, returns the data to the client device through the network, and finally the client device decodes the data and outputs the game picture.
In an optional implementation manner, taking a game as an example, the local terminal device stores the game program and is used for presenting the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, it may be rendered for display on a display screen of the terminal or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In this embodiment, a method for selecting a virtual object running on the above mobile terminal is provided. A graphical user interface is provided through a touch display screen of a terminal device, the content displayed by the graphical user interface includes a first touch area, and a second touch area is arranged at an edge of the terminal device. The touch display screen may be the front screen of the mobile terminal to be operated; the second touch area may be a dedicated contact area, which may be arranged according to actual requirements, or may be the edge of a curved screen.
Optionally, the content presented by the graphical user interface may include all of the game scene, or may be a part of the game scene, and the game scene may include virtual objects such as virtual game characters, the ground, mountains, stones, flowers, grasses, trees, buildings, and the like.
Alternatively, the content presented by the graphical user interface may include all of the virtual object or may be part of the virtual object. For example, in the third person perspective game, the content presented by the graphical user interface may include all of the virtual object, and in the first person perspective game, the content presented by the graphical user interface may include part of the virtual object.
Fig. 2 is a flowchart of a method for selecting a virtual object, which provides a graphical user interface through a terminal device, according to an embodiment of the present invention, and as shown in fig. 2, the method may include the following steps.
In step S202, a third touch area is generated in the first touch area in response to the first touch operation acting on the second touch area.
In the technical solution provided by step S202 of the present invention, a first touch operation is performed on the second touch area, and a third touch area is generated in the first touch area in response to the first touch operation acting on the second touch area. The first touch operation may be a press operation, but is not limited thereto; any operation on the second touch area that generates the third touch area falls within the protection scope of the present invention. The third touch area may be a zone generated in the first touch area from the second touch area, and may also be referred to as a hot area.
Optionally, the second touch area is pressed, and a hot area is formed on the front screen of the mobile terminal in response to the pressing operation acting on the second touch area.
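As a non-limiting illustration, the following TypeScript sketch shows one possible way to implement this step. The class and constant names (EdgeTouchController, HOT_ZONE_SIZE) are hypothetical and not part of the patent, and a second touch area along the top edge of the screen is assumed.

```typescript
// Hypothetical sketch: creating and cancelling the hot area (third touch area)
// in response to a press on the edge area (second touch area).

interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }

const HOT_ZONE_SIZE = { width: 160, height: 160 }; // illustrative size in pixels

class EdgeTouchController {
  hotZone: Rect | null = null; // the "third touch area" on the front screen

  // First touch operation: a press on the second touch area at the screen edge.
  onEdgePress(edgeContact: Point): void {
    // Place the hot area on the front screen directly below the contact point
    // (the edge is assumed to run along the top of the screen).
    this.hotZone = {
      x: edgeContact.x - HOT_ZONE_SIZE.width / 2,
      y: 0,
      width: HOT_ZONE_SIZE.width,
      height: HOT_ZONE_SIZE.height,
    };
  }

  // When the first touch operation ends, the hot area is cancelled.
  onEdgeRelease(): void {
    this.hotZone = null;
  }
}
```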
Step S204 is to detect a second touch operation applied to the third touch area, and generate a target selection area on the graphical user interface based on movement of a touch point of the second touch operation.
In the technical solution provided in step S204 of the present invention, a second touch operation acting on the third touch area is detected, and a target selection area is generated as the touch point of the second touch operation moves on the graphical user interface. The second touch operation may be a press at any position of the third touch area; the target selection area may be the area in which the frame selection is finally determined in the first touch area. It follows that the first touch area includes the third touch area and the target selection area.
Optionally, any position of the third touch area may be pressed and dragged to any position of the first touch area; a rectangular area is determined from the pressed position to the finally released position, yielding the finally determined frame selection area, that is, the target selection area.
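For illustration only, the rectangle construction described above can be sketched as follows; the Point and Rect shapes repeat the hypothetical types of the earlier sketch, and the rectangle is normalized so that a drag in any direction yields a valid selection area.

```typescript
// Hypothetical sketch: building the target selection area from a drag that
// starts inside the hot area and ends where the touch is finally released.
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }

function buildSelectionRect(pressed: Point, released: Point): Rect {
  return {
    x: Math.min(pressed.x, released.x),
    y: Math.min(pressed.y, released.y),
    width: Math.abs(released.x - pressed.x),
    height: Math.abs(released.y - pressed.y),
  };
}

// Example: dragging from (120, 80) inside the hot area to (400, 300).
console.log(buildSelectionRect({ x: 120, y: 80 }, { x: 400, y: 300 }));
// -> { x: 120, y: 80, width: 280, height: 220 }
```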
In step S206, a virtual object in the target selection area is selected.
In the technical solution provided by step S206 of the present invention, the framed virtual object in the target selection area may be displayed in a selection frame of the graphical user interface, and the virtual object in the selection frame may be selected according to actual requirements.
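The selection itself can be a simple containment test, as in the sketch below; VirtualObject and its position field are hypothetical names, not terms from the patent.

```typescript
// Hypothetical sketch: selecting the virtual objects whose positions fall
// inside the target selection area.
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }
interface VirtualObject { id: string; position: Point; }

function contains(rect: Rect, p: Point): boolean {
  return p.x >= rect.x && p.x <= rect.x + rect.width &&
         p.y >= rect.y && p.y <= rect.y + rect.height;
}

function selectObjects(objects: VirtualObject[], selection: Rect): VirtualObject[] {
  return objects.filter(obj => contains(selection, obj.position));
}
```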
Through the above steps S202 to S206 of the present invention, a third touch area is generated in the first touch area in response to the first touch operation acting on the second touch area; a second touch operation acting on the third touch area is detected, and a target selection area is generated on the graphical user interface based on the movement of a touch point of the second touch operation; and a virtual object in the target selection area is selected. In other words, the invention generates a third touch area (hot area) in the first touch area (front screen) through an operation on the second touch area (dedicated touch area), and determines the selection box based on the second touch operation acting on the third touch area, thereby achieving the technical effect of improving the efficiency of selecting virtual objects and solving the technical problem of low efficiency in selecting virtual objects.
The above method of this embodiment is further described below.
As an optional implementation manner, in step S202, in response to the first touch operation acting on the second touch area, generating a third touch area in the first touch area includes: responding to a first sub-touch operation acting on a second touch area, and determining an original position on the graphical user interface based on an original touch point of the first sub-touch operation, wherein the second touch operation comprises the first sub-touch operation; generating a fourth touch area in the graphical user interface based on the original position; a third touch area is generated in the first touch area based on the fourth touch area.
In this embodiment, in response to a first sub-touch operation acting on the second touch area, an original position is determined on the graphical user interface based on an original touch point of the first sub-touch operation, a fourth touch area is generated in the graphical user interface based on the original position, and a third touch area is generated in the first touch area based on the fourth touch area. Here, the second touch operation includes the first sub-touch operation; the first sub-touch operation may be a long-press operation, a click operation, a slide operation, or another touch operation; the original touch point may be the point first contacted by the first sub-touch operation; and the fourth touch area may be the hot area formed at the beginning.
Optionally, a long-press operation is performed at the original touch point in the second touch area. Based on the position of the original touch point of the first sub-touch operation, the position of the fourth touch area (the hot area formed at the beginning), that is, the original position of the third touch area, is determined on the graphical user interface, and the fourth touch area is moved in the first touch area by moving the touch point of the first sub-touch operation, so as to generate the third touch area in the first touch area.
As an optional implementation manner, generating the third touch area in the first touch area based on the fourth touch area includes: and responding to a second sub-touch operation acting on the second touch area, and converting the fourth touch area from the original position to the target position to obtain a third touch area, wherein the second touch operation comprises the second sub-touch operation.
In this embodiment, in response to a second sub-touch operation applied to the second touch area, the fourth touch area is converted from the original position to the target position to obtain a third touch area, where the second touch operation includes the second sub-touch operation, and the second sub-touch operation may be a sliding operation.
Optionally, in response to the second sub-touch operation applied to the second touch area, the contact point of the second touch area (edge screen) may be optionally moved to change the position of the hot area on the front screen, so as to convert the fourth touch area from the original position to the target position, thereby obtaining the third touch area.
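A brief sketch of this behaviour, under the same top-edge assumption and hypothetical names as the earlier sketches:

```typescript
// Hypothetical sketch: the hot area tracks the contact point as it slides along
// the edge area; only the position changes, the size of the area is preserved.
interface Rect { x: number; y: number; width: number; height: number; }

function moveHotZone(hotZone: Rect, edgeContactX: number): Rect {
  return { ...hotZone, x: edgeContactX - hotZone.width / 2 };
}
```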
As an optional implementation manner, generating the third touch area in the first touch area based on the fourth touch area includes: and determining the fourth touch area as a third touch area.
In this embodiment, in response to the second sub-touch operation acting on the second touch area, the fourth touch area is converted from the original position to the target position, and the fourth touch area is determined as the third touch area, so as to generate the third touch area in the first touch area.
Optionally, the position of the hot area (that is, the third touch area) is determined solely by the position of the contact point on the edge screen: when the contact point changes, the position of the hot area changes accordingly. In other words, in response to the second sub-touch operation applied to the second touch area, the fourth touch area is converted from the original position to the target position, its final position is determined, and the fourth touch area is determined as the third touch area, so as to obtain the third touch area.
As an optional implementation, determining an original position on the graphical user interface based on the original touch point of the first sub-touch operation includes: determining a target direction of the original touch point pointing to the graphical user interface, wherein the target direction is perpendicular to the edge of the terminal device; an original position is determined along the target direction on the graphical user interface.
In this embodiment, a target direction in which the original touch point points toward the graphical user interface is determined, and the original position is determined on the graphical user interface along the target direction, where the target direction may be perpendicular to the edge of the terminal device.
Optionally, pressing on the edge curved screen of the mobile terminal forms a hot area on the front screen vertically downwards from the original touch point, and the original position of the target touch area is determined.
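For illustration, the perpendicular projection can be sketched as follows; the offset value and the top-edge assumption are illustrative and not taken from the patent.

```typescript
// Hypothetical sketch: determining the original position by moving from the edge
// contact point along a direction perpendicular to the device edge.
interface Point { x: number; y: number; }

function originalPosition(edgeContact: Point, offset = 80): Point {
  // For a second touch area along the top edge, "perpendicular to the edge"
  // means straight down into the front screen (the +y direction).
  const direction = { x: 0, y: 1 };
  return {
    x: edgeContact.x + direction.x * offset,
    y: edgeContact.y + direction.y * offset,
  };
}
```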
As an optional implementation manner, the second touch operation is a sliding touch operation, and detecting the second touch operation applied to the third touch area and generating the target selection area on the graphical user interface based on the movement of the touch point of the second touch operation includes: in response to a sliding touch operation acting on the third touch area and a fifth touch area, determining a start touch point when the sliding touch operation starts in the third touch area, and determining an end touch point when the sliding touch operation ends in the fifth touch area, wherein the fifth touch area is the area on the graphical user interface other than the third touch area; and generating a target selection area on the graphical user interface based on the start touch point and the end touch point.
In this embodiment, in response to a sliding touch operation acting on the third touch area and a fifth touch area, a start touch point at the start of the sliding touch operation is determined in the third touch area, an end touch point at the end of the sliding touch operation is determined in the fifth touch area, and a target selection area is generated on the graphical user interface based on the start touch point and the end touch point, where the fifth touch area may be the area of the first touch area other than the third touch area.
Optionally, any position point selected in the third touch area is used as the start touch point and dragged to any position point in the fifth touch area of the first touch area, which is used as the end touch point, and a target selection area is generated on the graphical user interface based on the start touch point and the end touch point.
As an optional implementation, generating a target selection area on the graphical user interface based on the start touch point and the end touch point includes: determining the length and width of the target selection area based on the starting touch point and the ending touch point; and determining a target selection area based on the length and the width, wherein the target selection area is a rectangular area.
In this embodiment, any position point selected in the third touch area is used as the start touch point and dragged to any position point in the fifth touch area of the first touch area, which is used as the end touch point; the length and width of the target selection area are determined based on the start touch point and the end touch point, and a rectangular target selection area is determined based on the length and the width.
As an optional implementation manner, when it is determined that the virtual object does not appear in the target selection area, the target selection area is adjusted in response to the third touch operation applied to the target selection area until the virtual object appears in the target selection area.
In this embodiment, if the virtual object does not appear in the target selection area, the target selection area is adjusted until the virtual object appears in the target selection area, wherein the target selection area may be adjusted by performing a third touch operation on the target selection area, and the third touch operation may be a dragging operation.
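One possible way to realize this adjustment, using the hypothetical types from the earlier sketches: the third touch operation is modelled as a drag vector that translates the selection area, and the caller re-runs the containment test after each adjustment until at least one virtual object falls inside the area.

```typescript
// Hypothetical sketch: translating the target selection area by a drag vector.
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }

function translateSelection(selection: Rect, drag: Point): Rect {
  return { ...selection, x: selection.x + drag.x, y: selection.y + drag.y };
}
```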
As an optional implementation manner, when it is detected that the first touch operation is ended, the third touch area is cancelled in the first touch area.
In this embodiment, when it is detected that the first touch operation in the second touch area has ended (for example, the touch is released), the third touch area is cancelled in the first touch area.
In this embodiment, a third touch area (hot area) is generated in the first touch area (front screen) through an operation on the second touch area (dedicated touch area), and the selection box is determined based on the second touch operation acting on the third touch area, so that the technical effect of improving the efficiency of selecting virtual objects is achieved, and the technical problem of low efficiency in selecting virtual objects is solved.
The technical solutions of the embodiments of the present invention are further described below with reference to preferred embodiments.
Currently, in games with high real-time requirements, such as real-time strategy games, one of the most important operations is the frame selection of virtual objects.
In the related art, frame selection can be triggered by long-pressing the screen. However, this interaction is not responsive enough, since a response is only produced after the long press completes, and it is also easily confused with other operations, so the efficiency of selecting virtual objects is low.
Therefore, the invention provides a scheme of rapid frame selection interaction based on the edge of the touch screen, and the technical effect of improving the efficiency of selecting the virtual object is realized.
The above-described method of this embodiment is further described below.
FIG. 3 is a schematic diagram of a selection interface of a virtual object according to an embodiment of the present invention. As shown in FIG. 3, a hot area is formed on the front screen vertically below the contact point on the curved edge screen of the mobile phone. Freely moving the contact point along the edge screen changes the position of the hot area on the front screen; at this time, the position of the hot area is determined solely by the position where the index finger contacts the edge screen, and when the contact point of the index finger changes, the position of the hot area changes accordingly. Another finger presses inside the hot area and drags outward, and the selection box is determined by the start point of the drag and the end point where the finger is finally released. When the finger contacting the edge screen is released, the hot area disappears.
For a mobile phone with a non-curved, ordinary screen, a dedicated contact area can be arranged at the top of the screen to simulate the same effect. For some foldable screens, the connecting portion between the two screens is a touch portion similar to the edge of a curved screen and serves as a dedicated contact area, so the same interaction effect can be realized; such a connecting portion also belongs to the second touch area arranged at the edge of the terminal device.
In this embodiment, a hot area is formed on the front screen vertically below the contact point pressed on the curved edge screen of the mobile phone, and the position of the hot area on the screen can be changed by moving the contact point along the edge screen. The hot area is then pressed and dragged, and the selection box is determined by the initially pressed point and the finally released position. This achieves the technical effect of selecting objects quickly and flexibly, and solves the technical problem that objects cannot be selected quickly and flexibly.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method according to the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The embodiment of the present invention further provides a device for selecting a virtual object, where the device is used to implement the foregoing embodiment and preferred embodiments, and details of which have been already described are not repeated. As used below, the term "unit" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
FIG. 4 is a block diagram of a virtual object selection apparatus according to an embodiment of the present invention. A graphical user interface is provided through a touch display screen of a terminal device; the content displayed by the graphical user interface includes a first touch area, and a second touch area is disposed at an edge of the terminal device. As shown in FIG. 4, the virtual object selection apparatus 400 may include: a first generation unit 401, a second generation unit 402, and a selection unit 403.
The first generating unit 401 is configured to generate a third touch area in the first touch area in response to the first touch operation acting on the second touch area.
A second generating unit 402, configured to detect a second touch operation applied to the third touch area, and generate a target selection area on the graphical user interface based on movement of a touch point of the second touch operation.
A selecting unit 403, configured to select a virtual object in the target selection area.
Optionally, the first generating unit 401 includes a first generating module, configured to determine, in response to a first sub-touch operation acting on a second touch area, an original position on the graphical user interface based on an original touch point of the first sub-touch operation, where the second touch operation includes the first sub-touch operation; generating a fourth touch area in the graphical user interface based on the original position; a third touch area is generated in the first touch area based on the fourth touch area.
Optionally, the first generating module comprises: and the first generation submodule is used for responding to a second sub-touch operation acting on the second touch area, converting the fourth touch area from the original position to the target position and obtaining a third touch area, wherein the second touch operation comprises the second sub-touch operation.
Optionally, the first generating module comprises: and the first determining submodule is used for determining the fourth touch area as the third touch area.
Optionally, the first generating module comprises: the second determining submodule is used for determining the target direction of the original touch point pointing to the graphical user interface, wherein the target direction is perpendicular to the edge of the terminal equipment; an original position is determined along the target direction on the graphical user interface.
Optionally, the second generating unit 402 includes: a first determining module, configured to, in response to a sliding touch operation acting on the third touch area and a fifth touch area, determine a start touch point when the sliding touch operation starts in the third touch area, and determine an end touch point when the sliding touch operation ends in the fifth touch area, wherein the fifth touch area is the area on the graphical user interface other than the third touch area; and generate a target selection area on the graphical user interface based on the start touch point and the end touch point.
Optionally, the second generating unit 402 includes: a second determining module for determining the length and width of the target selection area based on the start touch point and the end touch point; and determining a target selection area based on the length and the width, wherein the target selection area is a rectangular area.
Optionally, the apparatus further comprises: and the first adjusting unit is used for responding to a third touch operation acting on the target selection area and adjusting the target selection area until the virtual object appears in the target selection area when the virtual object does not appear in the target selection area.
In the virtual object selection apparatus of this embodiment, the first generation unit generates a third touch area in the first touch area in response to the first touch operation acting on the second touch area; the second generation unit detects a second touch operation acting on the third touch area and generates a target selection area on the graphical user interface based on the movement of a touch point of the second touch operation; and the selection unit selects the virtual object in the target selection area. In other words, the invention generates a third touch area (hot area) in the first touch area (front screen) through an operation on the second touch area (dedicated touch area), and determines the selection box based on the second touch operation acting on the third touch area, thereby achieving the technical effect of improving the efficiency of selecting virtual objects and solving the technical problem of low efficiency in selecting virtual objects.
It should be noted that, the above units may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the units are all positioned in the same processor; or, the above units may be located in different processors in any combination.
Embodiments of the present invention also provide a non-volatile storage medium having a computer program stored therein, where the computer program is configured to be executed by a processor to perform the method for selecting a virtual object of an embodiment of the present invention.
Alternatively, in the present embodiment, the above-mentioned nonvolatile storage medium may be configured to store a computer program for executing the steps of:
s1, generating a third touch area in the first touch area in response to the first touch operation acting on the second touch area;
s2, detecting a second touch operation applied to the third touch area, and generating a target selection area on the graphical user interface based on movement of a touch point of the second touch operation;
s3, selecting the virtual object in the target selection area.
Optionally, in this embodiment, the nonvolatile storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide a processor for running a program, wherein the program is configured to execute the method for selecting a virtual object of an embodiment of the present invention when running.
Optionally, in this embodiment, the processor may be configured as a computer program operable to:
s1, generating a third touch area in the first touch area in response to the first touch operation acting on the second touch area;
s2, detecting a second touch operation applied to the third touch area, and generating a target selection area on the graphical user interface based on movement of a touch point of the second touch operation;
s3, selecting the virtual object in the target selection area.
Embodiments of the present invention further provide an electronic device, comprising a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, generating a third touch area in the first touch area in response to the first touch operation acting on the second touch area;
s2, detecting a second touch operation applied to the third touch area, and generating a target selection area on the graphical user interface based on movement of a touch point of the second touch operation;
s3, selecting the virtual object in the target selection area.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technical content can be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.

Claims (13)

1. A method for selecting a virtual object is characterized in that a graphical user interface is provided through a touch display screen of a terminal device, the content displayed by the graphical user interface comprises a first touch area, a second touch area is arranged at the edge of the terminal device, and the method comprises the following steps:
responding to a first touch operation acting on the second touch area, and generating a third touch area in the first touch area;
detecting a second touch operation acting on the third touch area, and generating a target selection area on the graphical user interface based on the movement of a touch point of the second touch operation;
and selecting the virtual object in the target selection area.
2. The method of claim 1, wherein generating a third touch area in the first touch area in response to a first touch operation acting on the second touch area comprises:
responding to a first sub-touch operation acting on the second touch area, and determining an original position on the graphical user interface based on an original touch point of the first sub-touch operation, wherein the second touch operation comprises the first sub-touch operation;
generating a fourth touch area in the graphical user interface based on the original position;
generating the third touch area in the first touch area based on the fourth touch area.
3. The method of claim 2, wherein generating the third touch area in the first touch area based on the fourth touch area comprises:
and responding to a second sub-touch operation acting on the second touch area, and converting the fourth touch area from the original position to a target position to obtain the third touch area, wherein the second touch operation comprises the second sub-touch operation.
4. The method of claim 2, wherein generating the third touch area in the first touch area based on the fourth touch area comprises:
determining the fourth touch area as the third touch area.
5. The method of claim 2, wherein determining an original position on the graphical user interface based on an original touch point of the first sub-touch operation comprises:
determining a target direction of the original touch point pointing to the graphical user interface, wherein the target direction is perpendicular to the edge of the terminal device;
determining the origin position along the target direction on the graphical user interface.
6. The method of claim 1, wherein the second touch operation is a sliding touch operation, wherein detecting the second touch operation applied to the third touch area, and wherein generating a target selection area on the graphical user interface based on movement of a touch point of the second touch operation comprises:
in response to the sliding touch operation acting on the third touch area and a fifth touch area, determining a starting touch point when the sliding touch operation starts in the third touch area and determining an ending touch point when the sliding touch operation ends in the fifth touch area, wherein the fifth touch area is an area on the graphical user interface other than the third touch area;
generating the target selection area on the graphical user interface based on the starting touch point and the ending touch point.
7. The method of claim 6, wherein generating the target selection area on the graphical user interface based on the start touch point and the end touch point comprises:
determining a length and a width of the target selection area based on the starting touch point and the ending touch point;
determining the target selection area based on the length and the width, wherein the target selection area is a rectangular area.
8. The method of claim 6, further comprising:
and if the virtual object does not appear in the target selection area, responding to a third touch operation acting on the target selection area, and adjusting the target selection area until the virtual object appears in the target selection area.
9. The method according to any one of claims 1 to 8,
and when the first touch operation is detected to be finished, canceling the third touch area in the first touch area.
10. A virtual object selection device is characterized in that a graphical user interface is provided through a touch display screen of a terminal device, the content displayed by the graphical user interface comprises a first touch area, a second touch area is arranged at the edge of the terminal device, and the device comprises:
a first generating unit, configured to generate a third touch area in the first touch area in response to a first touch operation performed on the second touch area;
a second generation unit configured to detect a second touch operation applied to the third touch area, and generate a target selection area on the graphical user interface based on movement of a touch point of the second touch operation;
and the selection unit is used for selecting the virtual object in the target selection area.
11. A non-volatile storage medium, in which a computer program is stored, wherein the computer program, when executed by a processor, controls an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 9.
12. A processor, characterized in that the processor is configured to run a program, wherein the program is configured to perform the method of any of claims 1 to 9 when running.
13. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is configured to execute the computer program to perform the method of any of claims 1 to 9.
CN202210094646.5A 2022-01-26 2022-01-26 Virtual object selection method and device, storage medium and processor Pending CN114504811A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210094646.5A CN114504811A (en) 2022-01-26 2022-01-26 Virtual object selection method and device, storage medium and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210094646.5A CN114504811A (en) 2022-01-26 2022-01-26 Virtual object selection method and device, storage medium and processor

Publications (1)

Publication Number Publication Date
CN114504811A (en) 2022-05-17

Family

ID=81550407

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210094646.5A Pending CN114504811A (en) 2022-01-26 2022-01-26 Virtual object selection method and device, storage medium and processor

Country Status (1)

Country Link
CN (1) CN114504811A (en)

Similar Documents

Publication Publication Date Title
CN113908550A (en) Virtual character control method, nonvolatile storage medium, and electronic apparatus
JP2009061277A (en) Input terminal emulator for gaming device
CN113262489B (en) Game route generation method and device, nonvolatile storage medium and electronic device
CN113318428A (en) Game display control method, non-volatile storage medium, and electronic device
CN110704058A (en) Page rendering method and device, storage medium, processor and electronic device
CN111467791A (en) Target object control method, device and system
CN112891936A (en) Virtual object rendering method and device, mobile terminal and storage medium
CN107626105B (en) Game picture display method and device, storage medium and electronic equipment
CN114653059A (en) Method and device for controlling virtual character in game and non-volatile storage medium
CN113318429B (en) Control method and device for exiting game, processor and electronic device
CN113952740A (en) Method and device for sharing virtual props in game, storage medium and electronic equipment
CN113262472A (en) Processing method and device of option control, processor and electronic device
CN114504808A (en) Information processing method, information processing apparatus, storage medium, processor, and electronic apparatus
CN114504811A (en) Virtual object selection method and device, storage medium and processor
CN113318430B (en) Method and device for adjusting posture of virtual character, processor and electronic device
CN114832371A (en) Method, device, storage medium and electronic device for controlling movement of virtual character
CN113590013B (en) Virtual resource processing method, nonvolatile storage medium and electronic device
CN113952725A (en) Method and device for controlling skill control in game and electronic device
CN115105831A (en) Virtual object switching method and device, storage medium and electronic device
CN114504809A (en) Control method and device of virtual object, storage medium and processor
CN114504810A (en) Virtual game role selection method, device, storage medium and processor
CN114404932A (en) Skill release control method, skill release control device, storage medium and electronic device
CN113797527A (en) Game processing method, device, equipment, medium and program product
CN113986079A (en) Virtual button setting method and device, storage medium and electronic equipment
CN104036133A (en) Chess game system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination