WO2021237942A1 - Object selection method and device - Google Patents
Object selection method and device
- Publication number
- WO2021237942A1 (PCT/CN2020/107948)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- game
- control
- selection
- user interface
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present disclosure relates to the field of computer technology, and in particular to an object selection method and device.
- the interactive objects of the game are usually categorized and laid out, and the categorized game interactive objects are placed in an area that is easy to operate with one hand, so that users can click them with one hand.
- the purpose of the present disclosure is to provide an object selection method and device to overcome the problem that the selection of all game entry areas cannot be effectively realized in one-handed operation.
- an object selection method, in which a graphical user interface is provided through a first terminal device, the graphical user interface includes a game screen, and the game screen includes at least one game interaction object; the method includes: in response to a first operation acting on the graphical user interface, obtaining a detection area corresponding to the first operation; obtaining at least one game interaction object that overlaps with the detection area; controlling at least one selection control to be provided in a display area corresponding to the first operation, wherein each selection control corresponds to one game interaction object; and, in response to a selection operation acting on at least one selection control, performing an interactive operation of the game interaction object corresponding to the selection control.
- an object selection device, which provides a graphical user interface through a first terminal device, the graphical user interface including a game screen and the game screen including at least one game interaction object; the device includes: an acquisition module, configured to obtain, in response to a first operation acting on the graphical user interface, a detection area corresponding to the first operation, and further configured to obtain at least one game interaction object that overlaps the detection area; a providing module, configured to control at least one selection control to be provided in a display area corresponding to the first operation, wherein each selection control corresponds to one game interaction object; and a processing module, configured to perform, in response to a selection operation acting on at least one selection control, an interactive operation of the game interaction object corresponding to the selection control.
- an object selection device, including: a memory for storing a program; and a processor for executing the program stored in the memory; when the program is executed, the processor is configured to perform the method described in the first aspect and any of its various possible designs.
- a computer-readable storage medium, including instructions which, when run on a computer, cause the computer to execute the method described in the first aspect and any of its various possible designs.
- FIG. 1A is a schematic diagram of a horizontal-screen presentation mode provided by an embodiment of the application.
- FIG. 1B is a schematic diagram of a vertical-screen presentation mode provided by an embodiment of the application.
- Figure 2 is a schematic diagram of a graphical user interface provided by an embodiment of the application.
- FIG. 3 is a flowchart of an object selection method provided by one of the embodiments of this application.
- FIG. 4 is a schematic diagram of a detection area provided by an embodiment of the application.
- FIG. 5 is a schematic diagram of an implementation of a selection control provided by an embodiment of the application.
- FIG. 6 is a schematic diagram of the selection operation provided by an embodiment of the application.
- FIG. 7 is a flowchart of an object selection method provided by another embodiment of this application.
- FIG. 8 is a schematic diagram of a first control provided by an embodiment of the application.
- FIG. 9 is a schematic diagram of a sliding area provided by an embodiment of the application.
- FIG. 10 is a schematic diagram of determining a detection area based on a lateral sliding area according to an embodiment of the application.
- FIG. 11 is a schematic diagram of determining a detection area based on a longitudinal sliding area according to an embodiment of the application.
- FIG. 12 is a schematic diagram of an implementation manner of displaying a first control provided by an embodiment of the application.
- FIG. 13 is a schematic diagram of a detection line provided by an embodiment of the application.
- FIG. 14 is another schematic diagram of a detection area provided by an embodiment of the application.
- FIG. 15 is a schematic diagram of a left-handed hot zone range provided by an embodiment of the application.
- FIG. 16 is a schematic diagram of a hot zone range for right-hand operation provided by an embodiment of the application.
- FIG. 17 is a schematic diagram of a sliding area in a hidden state provided by an embodiment of the application.
- FIG. 18 is a schematic structural diagram of an object selection device provided by an embodiment of the application.
- FIG. 19 is a schematic diagram of the hardware structure of an object selection device provided by an embodiment of the application.
- the object selection method in the embodiment of the present application can be run on a terminal device or a cloud interactive system.
- the cloud interactive system includes a cloud server and user equipment, and is used to run cloud applications; cloud applications run independently.
- cloud gaming refers to a gaming method based on cloud computing.
- in the cloud game operation mode, the running body of the game program and the body presenting the game screen are separated.
- the storage and operation of the object selection method are completed on the cloud game server.
- the role of the cloud game client is to receive and send data and to present the game screen.
- the cloud game client can be a display device close to the user side with a data transmission function, such as a mobile terminal, a TV, a computer, or a palmtop computer; however, the terminal device that processes the game data is the cloud game server in the cloud.
- the player operates the cloud game client to send operation instructions to the cloud game server.
- the cloud game server runs the game according to the operation instructions, encodes and compresses the game screen and other data, and returns them to the cloud game client through the network.
- the client decodes and outputs the game screen.
- the terminal device may be a local terminal device.
- the local terminal device stores the game program and is used to present the game screen.
- the local terminal device is used to interact with the player through a graphical user interface, that is, the game program is conventionally downloaded, installed and run through an electronic device.
- the local terminal device may provide the graphical user interface to the player in a variety of ways, for example, it may be rendered and displayed on the display screen of the terminal, or may be provided to the player through holographic projection.
- the local terminal device may include a display screen and a processor, the display screen is used to present a graphical user interface, the graphical user interface includes a game screen, and the processor is used to run the game, generate a graphical user interface, and control the graphical user interface The display on the display.
- FIG. 1A is a schematic diagram of a horizontal-screen presentation mode provided by an embodiment of this application, and FIG. 1B is a schematic diagram of a vertical-screen presentation mode provided by an embodiment of this application.
- a mobile game in the horizontal-screen presentation mode may, for example, have better visual impact and a larger operating space.
- the operation mode of a mobile game in portrait mode is more in line with the user's operating habits when using a terminal device; generally, most game operations can be performed with only one hand, even in a complex environment.
- because the portrait mode is easy to operate with one hand, most current casual games and idle games use the portrait mode.
- the prior art usually categorizes and lays out multiple game interactive objects, and places the categorized game interactive objects in an easy-to-operate area on the screen.
- the position of the interactive object is adjusted to a position that is easy to operate with one hand to facilitate the user's one-hand operation.
- in Figure 1B, a plurality of categorized and laid-out game interactive objects are displayed.
- when the user clicks one of these grouped objects, the game interactive objects in its category are displayed, and the user then further selects a certain game interactive object, thereby facilitating the user's one-handed operation.
- however, the screen sizes of current terminal devices are getting larger and larger, and merely adjusting the positions of game interaction objects cannot effectively realize the selection of all game interaction objects in one-handed operation; moreover, as the functions opened by games become more complex, different game interactive objects also need to be placed in different areas of the layout to facilitate the user's understanding.
- this application provides an object selection method to achieve simple and efficient selection of various game interactive objects in the case of one-handed operation.
- FIG. 2 is a schematic diagram of a graphical user interface provided by an embodiment of the application.
- this embodiment provides a graphical user interface through the first terminal device.
- the graphical user interface includes a game screen, and the game screen includes at least one game interaction object.
- the first terminal device may be the aforementioned local terminal device, or the aforementioned cloud game client.
- Graphical user interface refers to a computer-operated user interface displayed in a graphical manner, which allows players to use an input device to manipulate icons or menu controls on the screen.
- the input device may be, for example, a mouse or a touch screen, etc.; this embodiment does not limit it.
- the player interacts with the client or server by operating through the graphical user interface during the game.
- the graphical user interface includes a game screen, and the game screen includes at least one game interactive object, where the game interactive object may be an interactive object in the game screen.
- the game interaction object may be, for example, a game entry, and by interacting with the game entry, the game can be triggered to enter a corresponding functional module.
- the game interaction object may also be, for example, a virtual object in the game, such as a game character controlled by a user's operation, or a non-player character, or trees, houses, flowers, etc. in the game.
- the game interaction object can also be, for example, a preset button, preset control, etc. in the game.
- This embodiment does not limit the specific implementation of the game interaction object; any object that can be interacted with in the game can serve as a game interaction object in this embodiment.
- FIG. 3 is a flowchart of the object selection method provided by one of the embodiments of the application;
- FIG. 4 is a schematic diagram of the detection area provided in an embodiment of the application;
- FIG. 5 is a schematic diagram of an implementation of the selection control provided in an embodiment of the application;
- FIG. 6 is a schematic diagram of the selection operation provided by an embodiment of the application.
- the method includes:
- S301 In response to a first operation acting on the graphical user interface, obtain a detection area corresponding to the first operation.
- the first operation can be, for example, a click operation, or the first operation can also be a long-press operation, a sliding operation, etc., where the first operation can act on any position in the graphical user interface; this embodiment does not limit the implementation or acting position of the first operation.
- by responding to the first operation, the detection area corresponding to the first operation is obtained; in a possible implementation manner, for example, the detection area corresponding to the touch point of the first operation can be obtained.
- the touch point of the first operation may be used as the starting point, and an area with a preset length and/or a preset width in the preset direction of the first operation may be determined as the detection area.
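- To make the geometry concrete, the following is a minimal sketch (not taken from the patent itself) of deriving a rectangular detection area from the touch point, assuming axis-aligned screen coordinates; the `Rect` type and the preset constants are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float       # left edge
    y: float       # bottom edge
    width: float
    height: float

# Assumed preset values; the patent leaves these to the implementer.
PRESET_WIDTH = 80.0    # width of the detection area
PRESET_LENGTH = 600.0  # length in the preset direction (here: upward)

def detection_area_from_touch(touch_x: float, touch_y: float) -> Rect:
    """Take the touch point as the starting point and extend an area of
    preset width and preset length in the preset direction (upward)."""
    return Rect(x=touch_x - PRESET_WIDTH / 2,
                y=touch_y,
                width=PRESET_WIDTH,
                height=PRESET_LENGTH)
```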
- the detection area can be displayed in a preset state, where the preset state may be, for example, the light beam state shown in FIG. 4, or any of the states such as highlighting or a floating window; by displaying the detection area in the preset state, the user can quickly and clearly determine the position of the detection area and adjust it in time.
- the area that has a mapping relationship with the touch point of the first operation may also be determined as the detection area; or, the area of the preset range around the touch point of the first operation may also be determined as the detection area.
- This embodiment does not limit the specific relationship between the detection area and the first operation, and it can be expanded according to actual needs.
- the detection area is an area in the graphical user interface, and the game interactive objects also correspond to their respective areas in the graphical user interface, so the detection area may overlap with some game interactive objects.
- the overlap with the detection area may be, for example, that the game interactive object is completely located within the range of the detection area; or, the overlap may also be, for example, that part of the game interaction object is located within the range of the detection area.
- assume that the current detection area is the area shown at 402, and that the area corresponding to each game interaction object is as shown in FIG. 4; the game interactive object can be any shape, such as a square, a circle, or a triangle.
- the game interaction object may also have an irregular shape; this embodiment does not limit the shape of the game interaction object, and the area corresponding to the game interaction object in the graphical user interface can be set according to the actual game design.
- the game interaction objects that currently overlap with the detection area are A, C, D, and F.
- any game interaction object that intersects the detection area, regardless of the size of the intersection, can be determined as a game interaction object that overlaps the detection area in this embodiment.
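- Continuing the sketch above, this overlap check reduces to an axis-aligned rectangle intersection test in which any non-empty intersection counts, regardless of its size; representing each object's area as a `Rect` is an illustrative assumption.

```python
def overlaps(a: Rect, b: Rect) -> bool:
    """True if two axis-aligned rectangles share any area at all."""
    return (a.x < b.x + b.width and b.x < a.x + a.width and
            a.y < b.y + b.height and b.y < a.y + a.height)

def overlapping_objects(detection_area: Rect,
                        objects: dict[str, Rect]) -> list[str]:
    """Return the identifiers of all game interaction objects whose area
    intersects the detection area (e.g. A, C, D, F in FIG. 4)."""
    return [name for name, area in objects.items()
            if overlaps(detection_area, area)]
```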
- the user can move through the first operation to determine the required detection area according to the touch point of the first operation, and in this embodiment, at least one game interaction object that overlaps the detection area can be obtained.
- the at least one game interaction object acquired at this time includes the game interaction object to be selected by the user.
- At least one selection control may be provided in the display area corresponding to the first operation, wherein each selection control corresponds to a game interaction object.
- the game interaction objects corresponding to the selection controls provided in the display area are the game interaction objects, described above, that overlap with the detection area; therefore, the selection of the game interaction object required by the user can be achieved simply and efficiently through the selection controls.
- the display area corresponding to the first operation and the detection area corresponding to the first operation described above may be the same area, with the display area and the detection area merely serving different functions; or, the display area corresponding to the first operation may also be an area different from the detection area, and the implementation of obtaining the display area may be similar to the implementation of obtaining the detection area described above, which is not repeated here.
- the selection control may be provided in the display area after the first operation ends; or, in response to a trigger operation acting on the graphical user interface, the selection control may be provided in the display area, where the trigger operation may be, for example, a click operation or a long-press operation acting on any position in the graphical user interface; this embodiment does not limit this.
- continuing the above example, the game interaction objects that currently overlap with the detection area are A, C, D, and F.
- four selection controls are currently provided in the display area 500 corresponding to the first operation, namely 501, 502, 503, and 504.
- the selection control 501 corresponds to the game interaction object A
- the selection control 502 corresponds to the game interaction object C
- the selection control 503 corresponds to the game interaction object D
- the selection control 504 corresponds to the game interaction object F.
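- A minimal sketch of this step (S303), assuming the control identifiers 501, 502, ... of FIG. 5 are simply assigned in order to the overlapping objects; a real implementation would also lay the controls out within the display area.

```python
def provide_selection_controls(overlapping: list[str],
                               first_id: int = 501) -> dict[int, str]:
    """Provide one selection control per overlapping game interaction
    object; returns a control-id -> object-id mapping mirroring FIG. 5
    (501 -> A, 502 -> C, 503 -> D, 504 -> F)."""
    return {first_id + i: obj for i, obj in enumerate(overlapping)}
```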
- the user can quickly and conveniently select the desired game interaction object in the one-handed mode.
- the object identifiers of the overlapping game interaction objects can also be acquired, where the object identifier may be the name of the object, for example, A, B, C, etc. as shown in Figure 4; or the object identifier can also be the short name, abbreviation, code, or numeric identifier of the object.
- this embodiment does not limit the implementation of the object identifier, as long as the object identifier can uniquely indicate one game interaction object.
- the object identification of the game interaction object corresponding to each selection control may also be displayed around each selection control.
- the object identifier of the corresponding game interaction object may be displayed in each selection control;
- or the object identifier of the corresponding game interactive object can also be displayed above, below, or beside each selection control, or at a position having a mapping relationship with it; this embodiment does not limit the position where the object identifier of the corresponding game interactive object is displayed.
- the user can quickly and efficiently determine which game interaction object corresponds to the selection control, thereby improving the operation efficiency of the game and enhancing the user's game experience.
- in the implementation shown in FIG. 5, the selection control is pentagonal.
- the shape, color, position, size, arrangement, etc. of the selection control can be selected according to actual needs.
- this embodiment does not limit the display mode of the selection control, as long as each selection control can correspond to a game interaction object that overlaps with the detection area.
- S304 In response to a selection operation acting on at least one selection control, perform an interactive operation of a game interaction object corresponding to the selection control.
- the selection operation acting on the selection control can be, for example, a click operation acting on a certain selection control, or a long-press operation acting on a certain selection control, or a sliding selection operation acting on multiple selection controls, etc.; this embodiment does not limit the specific implementation of the selection operation, as long as the selection operation can realize the selection of at least one selection control.
- the interactive operation of the game interaction object corresponding to the selection control can be executed.
- if the current game interactive object is a game entry, the interactive operation can be to enter the game module corresponding to the game entry; or, if the current game interactive object is a virtual game object, the interactive operation can be to select the virtual game object, or to control the virtual game object to perform an operation, etc.; the interactive operation of a game interactive object can be set according to its specific type, which is not limited here.
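- Since the interactive operation depends on the object's type, a dispatch sketch like the following captures the examples given here; the type names and handler functions are illustrative assumptions, not part of the patent.

```python
def enter_game_module(obj_id: str) -> None:
    print(f"entering game module {obj_id}")        # stand-in handler

def select_virtual_object(obj_id: str) -> None:
    print(f"selecting virtual object {obj_id}")    # stand-in handler

def perform_interaction(obj_type: str, obj_id: str) -> None:
    """Dispatch the interactive operation according to the object type."""
    if obj_type == "game_entry":
        enter_game_module(obj_id)       # e.g. enter the module for "F"
    elif obj_type == "virtual_object":
        select_virtual_object(obj_id)   # or drive it to perform an action
    else:
        raise ValueError(f"unknown game interaction object type: {obj_type}")
```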
- for example, if the game interactive object is a game entry, referring to FIG. 6, assume that the current selection operation is a click operation acting on the selection control corresponding to the game interactive object F; it is then determined that the game entry corresponding to F is selected, and the game can then be controlled to enter the game module corresponding to "F".
- each game interactive object is placed at a different position, which can help the user understand the game; on the basis of not needing to change the positions of the game interactive objects, the selection of a game interactive object can be realized by detecting the overlap between the detection area and the game interactive objects, thereby effectively realizing one-handed operation in portrait mode.
- the object selection method provided by the embodiment of the present application includes: in response to a first operation acting on a graphical user interface, obtaining a detection area corresponding to the first operation; obtaining at least one game interaction object that overlaps with the detection area; controlling at least one selection control to be provided in the display area corresponding to the first operation, wherein each selection control corresponds to a game interaction object; and, in response to a selection operation acting on at least one selection control, executing an interactive operation of the game interaction object corresponding to the selection control.
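- Putting the four steps together, a hypothetical end-to-end run using the sketches above (object coordinates invented purely to reproduce the A/C/D/F overlap of FIG. 4 and the selection of F in FIG. 6) might look like this:

```python
objects = {  # invented coordinates; only their overlap pattern matters
    "A": Rect(50, 350, 100, 100), "B": Rect(400, 400, 100, 100),
    "C": Rect(70, 500, 100, 100), "D": Rect(30, 650, 100, 100),
    "E": Rect(500, 800, 100, 100), "F": Rect(90, 800, 100, 100),
}

area = detection_area_from_touch(touch_x=100, touch_y=300)  # S301
hits = overlapping_objects(area, objects)                   # S302 -> ['A', 'C', 'D', 'F']
controls = provide_selection_controls(hits)                 # S303 -> {501: 'A', ..., 504: 'F'}
perform_interaction("game_entry", controls[504])            # S304: user taps the control for F
```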
- on the basis of the above embodiments, the object selection method provided by the present application may also provide a first control in the graphical user interface, and the detection area corresponding to the first operation may then be acquired based on the first control.
- the first control may be pre-displayed in the graphical user interface, or the first control may not be displayed in advance or may not exist in advance and be displayed only after being triggered by a related operation.
- the following explains, with reference to specific embodiments, how to obtain the detection area based on the first control.
- FIG. 8 is a schematic diagram of the first control provided in an embodiment of this application.
- acquiring the detection area corresponding to the touch point of the first operation may include:
- the response area of the first control may be a full-screen response area, for example, all areas of the current graphical user interface are the response area of the first control; or the response area of the first control may also be a partial area of the graphical user interface, such as the area around the first control, or the edge area corresponding to the first control; this embodiment does not limit the specific implementation of the response area, as long as there is a correspondence between the response area and the first control.
- the graphical user interface includes a first control 801 (a sliding button), which is pre-displayed at any position in the graphical user interface and can be moved to any position in the graphical user interface according to the user's first operation.
- the size, initial position, and shape of the sliding button 801 can all be selected according to actual needs.
- assume that the touch point of the current first operation is at the position corresponding to 803 in FIG. 8; at this time, it can be determined that the touch point 803 acts on the response area of the first control.
- the position of the first control may be adjusted according to the movement of the touch point, and the corresponding detection area may be determined according to the position of the first control.
- the position of the first control is adjusted from 802 to 803 according to the movement of the touch point.
- an area with a preset length and/or a preset width in a preset direction corresponding to the first control can be obtained, and this area is determined as the detection area, for example, the area 804 shown in FIG. 8.
- the graphical user interface in this application may also include a sliding area, where the sliding area may be the edge area of the game screen.
- the first control may be displayed in advance in the sliding area of the graphical user interface; the following describes, with reference to specific embodiments, the implementation of acquiring the detection area when the first control is displayed in the sliding area.
- it should be noted that the sliding area in this application is not necessarily the response area; rather, the response area in this application includes the sliding area.
- Figure 9 is a schematic diagram of a sliding area provided by an embodiment of this application
- Figure 10 is a schematic diagram of determining a detection area based on a lateral sliding area provided by an embodiment of this application
- Figure 11 is an embodiment of this application Provide a schematic diagram of determining the detection area based on the longitudinal sliding area.
- the sliding area includes a horizontal sliding area and/or a vertical sliding area, where the horizontal sliding area may be located in the horizontal edge area of the graphical user interface, and the vertical sliding area may be located in the vertical edge area of the graphical user interface.
- the graphical user interface may include three sliding areas, namely the vertical sliding areas 901 and 902 and the horizontal sliding area 903 in FIG. 9.
- the first control can slide on its corresponding sliding area.
- the step of adjusting the position of the first control according to the movement of the touch point may be:
- the sliding area 902 includes a first control 904, and the first control 904 can be controlled to slide up and down on the sliding area 902 according to the movement of the touch point, thereby adjusting the position of the first control 904 in the sliding area 902.
- the corresponding detection area can be determined according to the position of the first control, where the detection area can be a detection line perpendicular to the sliding area, or an area perpendicular to the sliding area with a preset width as the width value.
- the first operation at this time is the first operation that acts on the lateral sliding area 1000, and the touch point 1001 of the first operation is moved from the first position 1002 to the second position 1003.
- the position of the first control in the sliding area can be adjusted according to the movement of the touch point.
- the position of the first control in the sliding area is adjusted from the first position 1002 to the second position 1003.
- the first operation is located in the horizontal sliding area 1000, and the step of obtaining the detection area corresponding to the first operation may be, for example:
- a detection area 1005 can be obtained by obtaining an area perpendicular to the horizontal sliding area 1000 with a preset width as the width value.
- assume that the first operation at this time acts on the longitudinal sliding area 1101, and that the position of the first control in the sliding area is adjusted according to the movement of the touch point and is determined to be the position 1102.
- the first operation is located in the longitudinal sliding area 1101, and the step of obtaining the detection area corresponding to the first operation may be, for example, as follows:
- an area perpendicular to the longitudinal sliding area 1101 with a preset width as the width value is obtained, and the detection area 1103 can be obtained.
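- The two cases can be folded into one helper; a sketch, assuming a portrait screen of known size, in which a slider on the horizontal (bottom) edge yields a vertical band across the screen and a slider on a vertical (side) edge yields a horizontal band. The screen dimensions and band width are assumptions.

```python
SCREEN_W, SCREEN_H = 1080.0, 1920.0  # assumed portrait screen size

def band_from_slider(slider_pos: float, on_horizontal_edge: bool,
                     band_width: float = 80.0) -> Rect:
    """Detection area perpendicular to the sliding area (FIG. 10/11):
    the band is centered on the first control's position in the area."""
    if on_horizontal_edge:   # slider_pos is an x coordinate
        return Rect(slider_pos - band_width / 2, 0.0, band_width, SCREEN_H)
    else:                    # slider_pos is a y coordinate
        return Rect(0.0, slider_pos - band_width / 2, SCREEN_W, band_width)
```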
- the implementation of the sliding area and the first control in this embodiment can be as shown in the figures above; alternatively, the number and positions of the sliding areas can also be selected according to actual needs.
- the implementation of the first control can likewise be selected according to actual needs, which is not limited in this embodiment.
- the object selection method provided by the embodiment of the present application provides a sliding area, so that the position of the first control in the sliding area can be adjusted according to the movement of the touch point, which can make the movement of the first control more controllable.
- the first control may not be displayed in advance or may not exist in advance, and be displayed after being triggered by a related operation; for example, in response to an operation acting on the sliding area, the first control may be displayed in the graphical user interface.
- FIG. 12 is a schematic diagram of an implementation manner of displaying the first control provided by an embodiment of the application.
- the graphical user interface may include three sliding areas, namely the vertical sliding areas 1201 and 1202 and the horizontal sliding area 1203 in FIG. 12; as shown in 12a in FIG. 12, the first control is not displayed in advance or does not exist in advance, and only the sliding areas exist at this time.
- in response to the second operation acting on the sliding area, the first control may be controlled to be displayed on the graphical user interface, where the second operation is an operation continuous with the first operation.
- the second operation can be a sliding operation that is continuous with the first operation; as shown in 12b in FIG. 12, the first control 1204 is displayed.
- the first control may be displayed in the horizontal sliding area 1203, or may also be displayed in other positions, which is not limited in this embodiment.
- the corresponding detection area can be determined according to the position of the first control.
- the implementation method is similar to that described above, and will not be repeated here.
- the first operation can be the first operation acting on the sliding area, and the detection area can be directly obtained according to the position of the first operation in the sliding area, without providing the first control, as illustrated below with reference to the specific embodiments in FIG. 13 and FIG. 14.
- FIG. 13 is a schematic diagram of a detection line provided by an embodiment of this application
- FIG. 14 is another schematic diagram of a detection area provided by an embodiment of this application.
- the graphical user interface may include three sliding areas, namely the longitudinal sliding area 1301, 1302 and the horizontal sliding area 1303 in FIG. 13.
- the first operation is the first operation that acts on the sliding area, and the first operation is at least one of the following operations: a click operation and a sliding operation; the detection area can be a detection line perpendicular to the sliding area, or an area perpendicular to the sliding area with a preset width as the width value.
- the step of obtaining the detection area corresponding to the first operation may be, for example:
- the detection line 1304 perpendicular to the horizontal sliding area 1303 is obtained, which serves as the detection area.
- the step of obtaining the detection area corresponding to the first operation 1402 may be, for example:
- an area perpendicular to the longitudinal sliding area 1401 with a preset width as the width value is obtained, and the detection area 1403 can be obtained.
- the difference between FIG. 14 and FIG. 11 is that the first control does not need to be provided in the implementation manner of FIG. 14.
- the object selection method provided by the embodiment of the present application directly obtains the detection area corresponding to the first operation through the first operation that acts on the sliding area, and does not need to provide the first control, thereby improving the simplicity of operation.
- the step of controlling the provision of at least one selection control in the display area corresponding to the first operation in this application may include:
- At least one selection control is provided in the display area.
- at least one selection control can be provided in the display area according to the positional relationship of the at least one game interactive object in the game screen; or, according to that positional relationship, at least one selection control can be provided around the touch point of the first operation; other possible implementations can also be extended according to actual needs.
- the sliding area in this embodiment may be located in the edge area of the game screen; further, the sliding area in this embodiment may be located within the hot zone range for one-handed operation in the edge area.
- FIG. 15 and FIG. 16 illustrate the hot zone range of one-handed operation.
- FIG. 15 is a schematic diagram of a left-handed hot zone range provided by an embodiment of this application
- FIG. 16 is a schematic view of a right-handed hot zone range provided by an embodiment of this application.
- when the user operates with the left hand, the easy-to-operate range, the general range, and the difficult-to-operate range are as shown in FIG. 15, where the hot zone range can be, for example, the easy-to-operate range, or the combination of the easy-to-operate range and the general range.
- when the user operates with the right hand, the easy-to-operate range, the general range, and the difficult-to-operate range are shown in FIG. 16, where the hot zone range can likewise be, for example, the easy-to-operate range, or the combination of the easy-to-operate range and the general range.
- the sliding area can be set in the edge area.
- FIG. 17 is a schematic diagram of a sliding area in a hidden state provided by an embodiment of the application.
- the hidden state may be, for example, the transparency state of the sliding area 1701 shown in FIG. 17, or the non-display state of the sliding area 1702, or the dotted-line state of the sliding area 1703; this embodiment does not limit the specific implementation of the hidden state, as long as the hidden state can weaken the display of the sliding area.
- by setting the sliding area to the hidden state when no operation on it is detected, the simplicity of the game screen can be improved, thereby enhancing the user's game experience.
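- One plausible way to realize this hidden state is an idle timeout, sketched below; the three-second threshold and the dimming policy are assumptions, and the transparency, non-display, and dotted-line variants of FIG. 17 would all key off `is_hidden()`.

```python
import time

class SlidingArea:
    """Weakens the display of the sliding area when it has not been
    operated for a while (hidden state, FIG. 17)."""
    HIDE_AFTER_S = 3.0  # assumed idle timeout

    def __init__(self) -> None:
        self._last_op = time.monotonic()

    def on_operation(self) -> None:
        """Call whenever an operation on the sliding area is detected."""
        self._last_op = time.monotonic()

    def is_hidden(self) -> bool:
        """Renderer may draw the area transparent, dotted, or not at all."""
        return time.monotonic() - self._last_op > self.HIDE_AFTER_S
```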
- FIG. 18 is a schematic structural diagram of an object selection device provided by an embodiment of the application. As shown in FIG. 18, the device 180 includes: an obtaining module 1801, a providing module 1802, and a processing module 1803.
- the obtaining module 1801 is configured to obtain a detection area corresponding to the first operation in response to a first operation performed on the graphical user interface;
- the acquiring module 1801 is further configured to acquire at least one game interaction object that overlaps with the detection area;
- a providing module 1802 configured to control to provide at least one selection control in a display area corresponding to the first operation, wherein each selection control corresponds to one game interaction object;
- the processing module 1803 is configured to perform an interactive operation of a game interaction object corresponding to the selection control in response to a selection operation acting on at least one of the selection controls.
- the acquisition module 1801 is specifically configured to:
- the providing module 1802 is also used to:
- the graphical user interface provides a first control
- the obtaining module 1801 is specifically used for:
- the acquisition module 1801 is specifically configured to:
- the corresponding detection area is determined according to the position of the first control.
- the graphical user interface includes a sliding area, and the sliding area is located at an edge area of the game screen, and the acquiring module 1801 is specifically configured to:
- the position of the first control in the sliding area is adjusted according to the movement of the touch point.
- the providing module 1802 is specifically used for:
- the graphical user interface includes a sliding area
- the first operation is a first operation acting on the sliding area
- the first operation is at least one of the following operations: a click operation and a sliding operation;
- the detection area is a detection line perpendicular to the sliding area or an area perpendicular to the sliding area with a preset width as a width value.
- the sliding area includes a horizontal sliding area and/or a vertical sliding area, the horizontal sliding area is located in the horizontal edge area of the graphical user interface, and the vertical sliding area is located in the vertical edge area of the graphical user interface; the acquisition module 1801 is used for at least one of the following steps:
- the providing module 1802 is specifically used for:
- At least one selection control is provided in the display area.
- the detection area and the display area are the same area.
- the acquisition module 1801 is also used to:
- the providing module 1802 is also used for:
- on the periphery of each selection control, the object identifier of the game interaction object corresponding to that selection control is displayed.
- the providing module 1802 is also used to:
- when no operation on the sliding area is detected, the sliding area is set to a hidden state.
- the device provided in this embodiment can be used to implement the technical solutions of the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here in this embodiment.
- FIG. 19 is a schematic diagram of the hardware structure of an object selection device provided by an embodiment of this application.
- the object selection device 190 in this embodiment includes: a processor 1901 and a memory 1902; wherein
- the memory 1902 is used to store computer-executable instructions;
- the processor 1901 is configured to execute computer-executable instructions stored in the memory to implement each step performed by the object selection method in the foregoing embodiment. For details, refer to the relevant description in the foregoing method embodiment.
- the memory 1902 may be independent or integrated with the processor 1901.
- the object selection device further includes a bus 1903 for connecting the memory 1902 and the processor 1901.
- the embodiment of the present application also provides a computer-readable storage medium; the computer-readable storage medium stores computer-executable instructions, and when a processor executes the computer-executable instructions, the object selection method executed by the above object selection device is implemented.
- the disclosed device and method can be implemented in other ways.
- the device embodiments described above are only illustrative.
- the division of the modules is only a logical function division, and there may be other divisions in actual implementation; for example, multiple modules can be combined or integrated into another system, or some features can be ignored or not implemented.
- the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or modules, and may be in electrical, mechanical or other forms.
- the above-mentioned integrated module implemented in the form of a software function module may be stored in a computer readable storage medium.
- the above-mentioned software function module is stored in a storage medium and includes several instructions to make a computer device (which can be a personal computer, a server, or a network device, etc.) or a processor execute part of the steps of the methods in the various embodiments of this application.
- the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), etc.
- the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
- the steps of the method disclosed in combination with the present disclosure may be directly embodied as being executed and completed by a hardware processor, or executed and completed by a combination of hardware and software modules in the processor.
- the memory may include a high-speed RAM memory, or may also include a non-volatile memory (NVM), such as at least one disk storage; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, or an optical disk.
- NVM non-volatile memory
- the bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, or an Extended Industry Standard Architecture (EISA) bus, etc.
- ISA Industry Standard Architecture
- PCI Peripheral Component Interconnect
- EISA Extended Industry Standard Architecture
- the bus can be divided into address bus, data bus, control bus and so on.
- the buses in the drawings of this application are not limited to only one bus or one type of bus.
- the above-mentioned storage medium can be realized by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk; the storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
- SRAM static random access memory
- EEPROM electrically erasable programmable read-only memory
- EPROM erasable programmable read-only memory
- PROM programmable read only memory
- ROM read only memory
- a person of ordinary skill in the art can understand that all or part of the steps in the foregoing method embodiments can be implemented by a program instructing relevant hardware.
- the aforementioned program can be stored in a computer readable storage medium.
- when the program runs, the steps of the foregoing method embodiments are executed; the foregoing storage medium includes media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An object selection method and device. The method includes: in response to a first operation acting on a graphical user interface, obtaining a detection area corresponding to the first operation (S301); obtaining at least one game interaction object that overlaps the detection area (S302); controlling at least one selection control to be provided in a display area corresponding to the first operation, where each selection control corresponds to one game interaction object (S303); and, in response to a selection operation acting on at least one selection control, performing an interactive operation of the game interaction object corresponding to the selection control (S304). By obtaining the detection area corresponding to the first operation and obtaining at least one game interaction object overlapping the detection area, a selection control corresponding to each such game interaction object can be provided in the display area, and the selection of a game interaction object can be realized based on the selection controls, so that each game interaction object can be selected quickly and conveniently in one-handed operation.
Description
Cross-reference to related applications
This application claims priority to Chinese patent application No. 202010450283.5, filed on May 25, 2020 and entitled "Object Selection Method and Device", the entire contents of which are incorporated herein by reference.
The present disclosure relates to the field of computer technology, and in particular to an object selection method and device.
With the continuous development of games, being able to guarantee that a user can select the interactive objects in a game simply and efficiently in one-handed operation mode can effectively improve the user experience.
At present, when the prior art realizes the selection of interactive objects in a game while guaranteeing one-handed operation by the user, it usually categorizes and lays out the interactive objects of the game and places the categorized and laid-out game interactive objects in an area that is easy to operate with one hand, so that the user can click them with a single hand.
However, as the screen sizes of terminal devices become larger and larger, merely adjusting the positions of the game interactive objects cannot effectively realize the selection of all game interactive objects in one-handed operation.
Summary
The purpose of the present disclosure is to provide an object selection method and device, so as to overcome the problem that the selection of all game entry areas cannot be effectively realized in one-handed operation.
According to a first aspect of the present disclosure, an object selection method is provided, in which a graphical user interface is provided through a first terminal device, the graphical user interface includes a game screen, and the game screen includes at least one game interaction object; the method includes: in response to a first operation acting on the graphical user interface, obtaining a detection area corresponding to the first operation; obtaining at least one game interaction object that overlaps the detection area; controlling at least one selection control to be provided in a display area corresponding to the first operation, where each selection control corresponds to one game interaction object; and, in response to a selection operation acting on at least one selection control, performing an interactive operation of the game interaction object corresponding to the selection control.
According to a second aspect of the present disclosure, an object selection device is provided, in which a graphical user interface is provided through a first terminal device, the graphical user interface includes a game screen, and the game screen includes at least one game interaction object; the device includes: an acquisition module, configured to obtain, in response to a first operation acting on the graphical user interface, a detection area corresponding to the first operation, the acquisition module being further configured to obtain at least one game interaction object that overlaps the detection area; a providing module, configured to control at least one selection control to be provided in a display area corresponding to the first operation, where each selection control corresponds to one game interaction object; and a processing module, configured to perform, in response to a selection operation acting on at least one selection control, an interactive operation of the game interaction object corresponding to the selection control.
According to a third aspect of the present disclosure, an object selection device is provided, including: a memory for storing a program; and a processor for executing the program stored in the memory; when the program is executed, the processor is configured to perform the method described in the first aspect and any of its various possible designs.
According to a fourth aspect of the present disclosure, a computer-readable storage medium is provided, including instructions which, when run on a computer, cause the computer to execute the method described in the first aspect and any of its various possible designs.
In order to explain the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are some embodiments of this application, and for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1A is a schematic diagram of a horizontal-screen presentation mode provided by an embodiment of this application;
FIG. 1B is a schematic diagram of a vertical-screen presentation mode provided by an embodiment of this application;
FIG. 2 is a schematic diagram of a graphical user interface provided by an embodiment of this application;
FIG. 3 is a flowchart of an object selection method provided by one embodiment of this application;
FIG. 4 is a schematic diagram of a detection area provided by an embodiment of this application;
FIG. 5 is a schematic diagram of an implementation of a selection control provided by an embodiment of this application;
FIG. 6 is a schematic diagram of a selection operation provided by an embodiment of this application;
FIG. 7 is a flowchart of an object selection method provided by another embodiment of this application;
FIG. 8 is a schematic diagram of a first control provided by an embodiment of this application;
FIG. 9 is a schematic diagram of a sliding area provided by an embodiment of this application;
FIG. 10 is a schematic diagram of determining a detection area based on a horizontal sliding area provided by an embodiment of this application;
FIG. 11 is a schematic diagram of determining a detection area based on a longitudinal sliding area provided by an embodiment of this application;
FIG. 12 is a schematic diagram of an implementation of displaying a first control provided by an embodiment of this application;
FIG. 13 is a schematic diagram of a detection line provided by an embodiment of this application;
FIG. 14 is another schematic diagram of a detection area provided by an embodiment of this application;
FIG. 15 is a schematic diagram of the hot zone range for left-handed operation provided by an embodiment of this application;
FIG. 16 is a schematic diagram of the hot zone range for right-handed operation provided by an embodiment of this application;
FIG. 17 is a schematic diagram of a sliding area in a hidden state provided by an embodiment of this application;
FIG. 18 is a schematic structural diagram of an object selection device provided by an embodiment of this application;
FIG. 19 is a schematic diagram of the hardware structure of an object selection device provided by an embodiment of this application.
In order to make the purpose, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application will be described clearly and completely below in conjunction with the drawings in the embodiments of this application. Obviously, the described embodiments are some of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of this application.
The object selection method in the embodiments of this application can run on a terminal device or a cloud interactive system.
The cloud interactive system includes a cloud server and user equipment, and is used to run cloud applications. Cloud applications run independently.
In an optional implementation, cloud gaming refers to a gaming method based on cloud computing. In the operation mode of cloud gaming, the running body of the game program and the body presenting the game screen are separated; the storage and running of the object selection method are completed on the cloud game server, and the role of the cloud game client is to receive and send data and to present the game screen. For example, the cloud game client can be a display device close to the user side with a data transmission function, such as a mobile terminal, a TV, a computer, or a palmtop computer; however, the terminal device that processes the game data is the cloud game server in the cloud. When playing, the player operates the cloud game client to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses the game screen and other data, and returns them to the cloud game client through the network; finally, the cloud game client decodes and outputs the game screen.
In an optional implementation, the terminal device may be a local terminal device. The local terminal device stores the game program and is used to present the game screen. The local terminal device is used to interact with the player through a graphical user interface, that is, the game program is conventionally downloaded, installed, and run through an electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, it may be rendered and displayed on the display screen of the terminal, or provided to the player through holographic projection. For example, the local terminal device may include a display screen and a processor, where the display screen is used to present the graphical user interface, the graphical user interface includes a game screen, and the processor is used to run the game, generate the graphical user interface, and control the display of the graphical user interface on the display screen.
The background technology involved in this application is introduced in detail below:
At present, mobile games can have two presentation modes, horizontal screen and vertical screen, each with its own characteristics; their implementations can be, for example, as shown in FIG. 1A and FIG. 1B, where FIG. 1A is a schematic diagram of a horizontal-screen presentation mode provided by an embodiment of this application, and FIG. 1B is a schematic diagram of a vertical-screen presentation mode provided by an embodiment of this application.
Referring to FIG. 1A, a mobile game in the horizontal-screen presentation mode can, for example, have better visual impact and a larger operating space.
Referring to FIG. 1B, the operation mode of a mobile game in the vertical-screen presentation mode is more in line with the user's operating habits when using a terminal device; generally, only one hand is needed to perform most game operations, even in a complex environment.
In the actual implementation process, whether to adopt the horizontal-screen or the vertical-screen presentation mode can be chosen according to the type of the game, its operation mode, and so on.
In a possible implementation, because the vertical-screen presentation mode has the characteristic of being easy to operate with one hand, most current casual games and idle games adopt the vertical-screen presentation mode.
However, due to the various model sizes of terminal devices and the numerous functional and presentation requirements of games, it is inevitable in the game design process to place some game interactive objects in screen areas that are difficult to reach with one hand, which weakens the one-handed operability advantage of the vertical-screen presentation mode.
At present, in order to preserve the advantages of the vertical-screen presentation mode, the prior art usually categorizes and lays out multiple game interactive objects and places the categorized game interactive objects in an easy-to-operate area on the screen; by adjusting the positions of the game interactive objects to positions that are easy to operate with one hand, the user's one-handed operation is facilitated.
For example, referring to FIG. 1B, a plurality of categorized and laid-out game interactive objects are displayed in FIG. 1B. Taking the categorized game interactive object 101 as an example, in a possible implementation, when a click operation by the user on the categorized game interactive object 101 is received, the multiple game interactive objects in its category can be displayed, and the user then further selects a certain game interactive object, which facilitates the user's one-handed operation.
However, the screen sizes of current terminal devices are getting larger and larger, and merely adjusting the positions of the game interactive objects cannot effectively realize the selection of all game interactive objects in one-handed operation; moreover, with the growing complexity of the functions opened by games, different game interactive objects also need to be placed in different areas of the layout, to facilitate the user's understanding.
Based on the problems in the prior art, this application provides an object selection method to realize simple and efficient selection of each game interactive object in one-handed operation.
The application scenario of this application is described below with reference to FIG. 2, which is a schematic diagram of a graphical user interface provided by an embodiment of this application.
Specifically, this embodiment provides a graphical user interface through a first terminal device; the graphical user interface includes a game screen, and the game screen includes at least one game interaction object.
The first terminal device may be the aforementioned local terminal device, or the aforementioned cloud game client.
A graphical user interface refers to a user interface for computer operation that is displayed graphically, allowing players to use an input device to manipulate icons or menu controls on the screen. The input device may be, for example, a mouse or a touch screen, which is not limited in this embodiment. During the game, the player operates through the graphical user interface to interact with the client or the server.
In this embodiment, the graphical user interface includes a game screen, and the game screen includes at least one game interactive object, where a game interactive object may be an object in the game screen that can be interacted with.
In a possible implementation, the game interaction object may be, for example, a game entry; by interacting with the game entry, the game can be triggered to enter the corresponding functional module.
Alternatively, the game interaction object may also be, for example, a virtual object in the game, such as a game character controlled by the user's operations, a non-player character, or trees, houses, flowers, and plants in the game.
Alternatively, the game interaction object may also be, for example, a preset button or preset control in the game. This embodiment does not limit the specific implementation of the game interaction object; any object in the game that can be interacted with may serve as a game interaction object in this embodiment.
In a possible implementation, referring for example to FIG. 2, assume that there are six game interaction objects in the current game screen, namely A, B, C, D, E, and F. The current example does not limit what each game interaction object specifically is; it may be, for example, the game entry introduced above, or the virtual object introduced above, which is not limited here. This embodiment likewise does not limit the specific implementation of the shape, size, position, etc. of the game interaction objects, which can be chosen according to the specific game design.
On the basis of the graphical user interface introduced above, the object selection method provided by this application is introduced below with reference to FIG. 3 to FIG. 7, where FIG. 3 is a flowchart of an object selection method provided by one embodiment of this application, FIG. 4 is a schematic diagram of a detection area provided by an embodiment of this application, FIG. 5 is a schematic diagram of an implementation of a selection control provided by an embodiment of this application, and FIG. 6 is a schematic diagram of a selection operation provided by an embodiment of this application.
As shown in FIG. 3, the method includes:
S301: In response to a first operation acting on the graphical user interface, obtain a detection area corresponding to the first operation.
In this embodiment, the first operation may be, for example, a click operation, or the first operation may also be a long-press operation, a sliding operation, or the like; the first operation can act on any position in the graphical user interface, and this embodiment does not limit the implementation or the acting position of the first operation.
By responding to the first operation, the detection area corresponding to the first operation is obtained. In a possible implementation, for example, the detection area corresponding to the touch point of the first operation can be obtained.
For example, taking the touch point of the first operation as a starting point, an area with a preset length and/or preset width in a preset direction of the first operation can be determined as the detection area.
The current implementation is described below with reference to FIG. 4:
Assuming that the touch point of the current first operation is at the position shown at 401 in FIG. 4, the touch point 401 of the first operation can be taken as a starting point, and the area 402 of preset width and preset length above the first operation can be taken as the detection area.
In a possible implementation, the detection area can be displayed in a preset state, where the preset state may be, for example, the light beam state shown in FIG. 4, or any of the states such as highlighting or a floating window. By displaying the detection area in the preset state, the user can quickly and clearly determine the position of the detection area and make timely adjustments. This embodiment does not particularly limit the implementation of the preset state, as long as it can identify the detection area.
Alternatively, an area having a mapping relationship with the touch point of the first operation may be determined as the detection area; or an area within a preset range around the touch point of the first operation may be determined as the detection area, and so on.
This embodiment does not limit the specific relationship between the detection area and the first operation, which can be extended according to actual needs.
S302: Obtain at least one game interaction object overlapping the detection region.

In this embodiment, the detection region is a region in the graphical user interface, and each game interaction object likewise corresponds to its own region in the graphical user interface, so the detection region may overlap some of the game interaction objects.

Overlapping the detection region may mean, for example, that a game interaction object lies entirely within the detection region; or it may mean that part of a game interaction object lies within the detection region.

For example, referring to FIG. 4, suppose the current detection region is the region indicated by 402, and suppose there are currently six game interaction objects A, B, C, D, E, and F, each corresponding to its own region as shown in FIG. 4; a game interaction object may be of any shape, such as a square, a circle, or a triangle.

In other possible implementations, a game interaction object may also have an irregular shape. This embodiment does not limit the shape of game interaction objects, and the region corresponding to a game interaction object in the graphical user interface may be chosen according to the actual game settings.

Based on the example in FIG. 4, the game interaction objects currently overlapping the detection region are A, C, D, and F.

In this embodiment, any game interaction object that intersects the detection region, regardless of the size of the intersection, may be determined as a game interaction object overlapping the detection region.
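Step S302 reduces to an intersection test in which any non-empty intersection counts, regardless of its size. A minimal sketch, assuming rectangular object regions expressed as hypothetical (x, y, w, h) tuples:

```python
def overlaps(a, b) -> bool:
    """Axis-aligned rectangles as (x, y, w, h); any intersection counts."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

# Hypothetical object regions loosely following the FIG. 4 example.
objects = {"A": (150, 400, 60, 60), "B": (500, 100, 60, 60),
           "C": (180, 600, 60, 60), "F": (160, 750, 60, 60)}
detection = (160, 300, 80, 600)   # region obtained from the first operation
hits = [name for name, rect in objects.items() if overlaps(detection, rect)]
print(hits)   # -> ['A', 'C', 'F']
```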
S303: Control at least one selection control to be provided in a display region corresponding to the first operation, where each selection control corresponds to one game interaction object.

In this embodiment, by moving the first operation the user can determine the desired detection region from the touch point of the first operation, and this embodiment can obtain the at least one game interaction object overlapping the detection region; the at least one game interaction object obtained at this point includes the game interaction object the user intends to select.

To enable the user to select the desired game interaction object with a single hand, at least one selection control may be provided in the display region corresponding to the first operation, with each selection control corresponding to one game interaction object.

The game interaction objects corresponding to the selection controls provided in the display region are the game interaction objects, introduced above, that overlap the detection region; the selection controls therefore enable simple and efficient selection of the game interaction object the user needs.

In one possible implementation, the display region corresponding to the first operation and the detection region corresponding to the first operation introduced above may be the same region, the display region and the detection region merely serving different purposes; alternatively, the display region corresponding to the first operation may be a region different from the detection region, in which case the display region may be obtained in a manner similar to the way the detection region is obtained as introduced above, and details are not repeated here.

In one implementation, the selection controls may be provided in the display region once the first operation ends; alternatively, the selection controls may be provided in the display region in response to a trigger operation acting on the graphical user interface, where the trigger operation may be, for example, a tap operation or a long-press operation acting on any position in the graphical user interface, which is not restricted in this embodiment.

The following description takes the case in which the detection region and the display region are the same region:

Continuing the above example, the game interaction objects currently overlapping the detection region are A, C, D, and F. Referring to FIG. 5, four selection controls 501, 502, 503, and 504 are provided in the display region 500 corresponding to the first operation, where selection control 501 corresponds to game interaction object A, selection control 502 corresponds to game interaction object C, selection control 503 corresponds to game interaction object D, and selection control 504 corresponds to game interaction object F.

By providing in the display region a selection control for each game interaction object overlapping the detection region, the user can quickly and conveniently select the desired game interaction object in single-handed mode.

In one possible implementation, after the game interaction objects overlapping the detection region are obtained, the object identifier of each overlapping game interaction object may also be obtained. The object identifier may be, for example, the object's name, such as A, B, and C shown in FIG. 4; or it may be the object's short name, abbreviation, code, numeric identifier, or the like. This embodiment does not limit the implementation of the object identifier, as long as the object identifier uniquely indicates one game interaction object.

In this embodiment, the object identifier of the game interaction object corresponding to each selection control may also be displayed around that selection control. For example, referring to FIG. 5, the object identifier of the corresponding game interaction object may be displayed within each selection control; or it may be displayed above, below, or beside each selection control, or at a position having a mapping relationship with it. This embodiment does not limit the position at which the object identifier of the corresponding game interaction object is displayed.

Displaying the object identifier of the corresponding game interaction object around a selection control lets the user determine quickly and efficiently which game interaction object the selection control corresponds to, improving the efficiency of game operation while enhancing the user's gaming experience.

In the implementation introduced above with FIG. 5, the selection controls are pentagonal; in practice, the shape, color, position, size, arrangement, and the like of the selection controls may all be chosen according to actual needs. This embodiment does not restrict how the selection controls are displayed, as long as each selection control can correspond to a game interaction object overlapping the detection region.
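Purely as an illustration of step S303, the sketch below evenly spaces one labeled control per overlapping object inside the display region (cf. FIG. 5); the layout function and its spacing rule are hypothetical, since the embodiment deliberately leaves shape, position, and arrangement open:

```python
def layout_controls(hits, display_region):
    """Evenly space one control per overlapping object inside the display
    region and label each control with the object's identifier."""
    x, y, w, h = display_region
    step = h / max(len(hits), 1)                 # vertical slot per control
    return [{"object_id": obj, "x": x + w / 2, "y": y + step * (i + 0.5)}
            for i, obj in enumerate(hits)]

# Controls for the objects A, C, D, F overlapping the detection region.
for control in layout_controls(["A", "C", "D", "F"], (160, 300, 80, 600)):
    print(control)
```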
S304: In response to a selection operation acting on at least one selection control, execute the interaction operation of the game interaction object corresponding to the selection control.

The selection operation acting on a selection control may be, for example, a tap operation acting on one selection control, a long-press operation acting on one selection control, or a sliding selection operation acting on multiple selection controls. This embodiment does not limit the specific implementation of the selection operation, as long as the selection operation can select at least one selection control.

In this embodiment, in response to the selection operation, the interaction operation of the game interaction object corresponding to the selection control may be executed.

For example, if the current game interaction object is a game entry, the interaction operation may be entering the game module corresponding to that game entry; or, if the current game interaction object is a virtual game object, the interaction operation may be selecting that virtual game object, or controlling it to perform an action. The interaction operation of a game interaction object may be set according to the specific type of the game interaction object, which is not restricted here.

For example, if the game interaction object is a game entry, then referring to FIG. 6, assume that the current selection operation is a tap operation acting on the selection control corresponding to game interaction object F; it is then determined that the game entry corresponding to F is selected, after which the game may, for example, be controlled to enter the game module corresponding to "F".

In this embodiment, the game interaction objects are placed at different positions, which helps the user understand the game. Without the positions of the game interaction objects having to change, a game interaction object can be selected through the overlap between the detection region and the game interaction object, thereby effectively enabling single-handed operation in the portrait mode.
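Step S304 can be pictured as a dispatch on the selected object's type. A minimal sketch with a hypothetical object registry; the kinds and actions below are placeholders for whatever the specific game defines:

```python
# Hypothetical registry mapping object identifiers to object kinds.
GAME_OBJECTS = {"F": {"kind": "entry", "module": "module_F"},
                "A": {"kind": "virtual_object"}}

def on_control_selected(object_id: str) -> str:
    """Run the interaction operation matching the selected object's type."""
    obj = GAME_OBJECTS[object_id]
    if obj["kind"] == "entry":
        return f"entering game module {obj['module']}"
    if obj["kind"] == "virtual_object":
        return f"selecting virtual object {object_id}"
    return "no-op"

print(on_control_selected("F"))   # a tap on F's control enters its module
```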
The object selection method provided by the embodiments of the present application includes: in response to a first operation acting on the graphical user interface, obtaining a detection region corresponding to the first operation; obtaining at least one game interaction object overlapping the detection region; controlling at least one selection control to be provided in a display region corresponding to the first operation, where each selection control corresponds to one game interaction object; and, in response to a selection operation acting on at least one selection control, executing the interaction operation of the game interaction object corresponding to the selection control. By obtaining the detection region corresponding to the first operation and the at least one game interaction object overlapping that detection region, a selection control corresponding to each such game interaction object can be provided in the display region, and a game interaction object is selected via the selection controls, so that each game interaction object can be selected quickly and conveniently under single-handed operation.
On the basis of the above embodiments, the object selection method provided by the present application may further provide a first control in the graphical user interface, in which case the detection region corresponding to the first operation may be obtained based on the first control. The first control may, for example, be pre-displayed in the graphical user interface; alternatively, the first control may not be displayed, or not exist, in advance and be displayed only after being triggered by a relevant operation. The implementation of obtaining the detection region based on the first control is described below with reference to specific embodiments.

First, the implementation in which the first control is pre-displayed in the graphical user interface is described with reference to FIG. 7 and FIG. 8. In one possible implementation, the first control may be pre-displayed at any position in the graphical user interface. FIG. 7 is a flowchart of the object selection method according to a further embodiment of the present application, and FIG. 8 is a schematic diagram of the first control according to an embodiment of the present application.

In one possible implementation, obtaining the detection region corresponding to the touch point of the first operation may include:

S701: Obtain the touch point of the first operation and determine that the touch point acts on the response region of the first control.

In this embodiment, the response region of the first control may be a full-screen response region, meaning that the entire graphical user interface is the response region of the first control; alternatively, the response region of the first control may be a partial region of the graphical user interface, for example the region around the first control or the edge region corresponding to the first control. This embodiment does not restrict the specific implementation of the response region, as long as the response region corresponds to the first control.

Referring to FIG. 8, the graphical user interface includes a first control 801, which is pre-displayed at an arbitrary position in the graphical user interface, and the first control 801 can move to any position in the graphical user interface according to the user's first operation; in practice, the size, initial position, shape, and the like of the first control 801 may all be chosen according to actual needs.

Suppose the response region of the first control is a full-screen response region, and the touch point of the current first operation moves from the position 802 to the position 803 in FIG. 8; the obtained touch point of the first operation is then the position corresponding to 803 in FIG. 8, and it can be determined that the touch point 803 acts on the response region of the first control.

S702: Obtain the detection region of the first control.

In one possible implementation, the position of the first control may be adjusted according to the movement of the touch point, and the corresponding detection region may be determined according to the position of the first control.

For example, referring to FIG. 8, the position of the first control is adjusted from 802 to 803 according to the movement of the touch point; a region of preset length and/or preset width in the preset direction corresponding to the first control may then be obtained and determined as the detection region, for example the region 804 shown in FIG. 8.

By adjusting the position of the first control according to the movement of the touch point and determining the detection region according to the movement of the first control, the user is given a more intuitive process for determining the detection region, which makes the detection region easier to determine.
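A minimal sketch of this embodiment, assuming a full-screen response region: the first control simply tracks the touch point, and the detection region is recomputed from the control's position. All names and preset values here are hypothetical:

```python
class FirstControl:
    """Hypothetical draggable first control; the detection region follows it."""
    def __init__(self, x: float, y: float):
        self.x, self.y = x, y

    def on_touch_move(self, touch_x: float, touch_y: float) -> None:
        self.x, self.y = touch_x, touch_y   # control tracks the touch point

    def detection_region(self, width: float = 80.0, length: float = 600.0):
        # Region of preset width/length in the control's preset direction (up).
        return (self.x - width / 2, self.y - length, width, length)

ctrl = FirstControl(100, 800)
ctrl.on_touch_move(260, 820)      # drag, e.g., from position 802 toward 803
print(ctrl.detection_region())    # region recomputed from the new position
```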
On the basis of the above embodiments, the graphical user interface in the present application may further include a sliding region, which may be located at an edge region of the game screen. In another possible implementation, the first control may be pre-displayed within the sliding region of the graphical user interface. The implementation of obtaining the detection region when the first control is displayed in the sliding region is described below with reference to specific embodiments.

It should be noted here that the sliding region in the present application is not necessarily the response region; rather, the response region in the present application includes the sliding region.

The description is given with reference to FIG. 9 to FIG. 11. FIG. 9 is a schematic diagram of a sliding region according to an embodiment of the present application, FIG. 10 is a schematic diagram of determining a detection region based on a horizontal sliding region according to an embodiment of the present application, and FIG. 11 is a schematic diagram of determining a detection region based on a vertical sliding region according to an embodiment of the present application.

In this embodiment, the sliding region includes a horizontal sliding region and/or a vertical sliding region, where the horizontal sliding region may be located at a horizontal edge region of the graphical user interface and the vertical sliding region may be located at a vertical edge region of the graphical user interface.

For example, referring to FIG. 9, the graphical user interface may include three sliding regions: the vertical sliding regions 901 and 902 and the horizontal sliding region 903 in FIG. 9. Each sliding region may correspond to its own first control, and each first control can slide within its corresponding sliding region.

In this embodiment, the step of adjusting the position of the first control according to the movement of the touch point may be:

adjusting the position of the first control within the sliding region according to the movement of the touch point.

Taking the sliding region 902 as an example, the sliding region 902 includes a first control 904; the first control 904 can be controlled, according to the movement of the touch point, to slide up and down within the sliding region 902, thereby adjusting the position of the first control 904 within the sliding region 902.

After the position of the first control is determined, the corresponding detection region may be determined according to the position of the first control, where the detection region may be a detection line perpendicular to the sliding region, or a region perpendicular to the sliding region whose width value is a preset width.

In one possible implementation, referring to FIG. 10, suppose the first operation acts on the horizontal sliding region 1000, and suppose the touch point 1001 of the first operation moves from a first position 1002 to a second position 1003; the position of the first control within the sliding region can then be adjusted according to the movement of the touch point. As shown in FIG. 10, the position of the first control within the sliding region is correspondingly adjusted from the first position 1002 to the second position 1003.

With the first operation located in the horizontal sliding region 1000, the step of obtaining the detection region corresponding to the first operation may be, for example:

obtaining a detection line perpendicular to the horizontal sliding region, or a region perpendicular to the horizontal sliding region whose width value is a preset width.

For example, referring to FIG. 10, obtaining the region perpendicular to the horizontal sliding region 1000 whose width value is a preset width yields the detection region 1005.

In another possible implementation, referring to FIG. 11, suppose the first operation acts on the vertical sliding region 1101, and suppose the position of the first control within the sliding region is adjusted according to the movement of the touch point and is determined to be the position 1102.

With the first operation located in the vertical sliding region 1101, the step of obtaining the detection region corresponding to the first operation may be, for example:

obtaining a detection line perpendicular to the vertical sliding region, or a region perpendicular to the vertical sliding region whose width value is a preset width.

For example, referring to FIG. 11, obtaining the region perpendicular to the vertical sliding region 1101 whose width value is a preset width yields the detection region 1103.

Alternatively, a detection line perpendicular to the sliding region may be obtained and used as the detection region; its implementation is similar to what has been introduced above.
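For illustration, the perpendicular-region geometry of FIG. 10 and FIG. 11 might be computed as below; the screen dimensions, preset width, and function names are hypothetical assumptions:

```python
SCREEN_W, SCREEN_H = 1080, 1920   # hypothetical portrait screen size
PRESET_WIDTH = 80                 # hypothetical preset width

def detection_from_slide(orientation: str, control_pos: float):
    """Detection region perpendicular to the sliding region, centered on the
    first control's position along that region; returned as (x, y, w, h)."""
    if orientation == "horizontal":   # control slides left-right along an edge
        return (control_pos - PRESET_WIDTH / 2, 0, PRESET_WIDTH, SCREEN_H)
    if orientation == "vertical":     # control slides up-down along an edge
        return (0, control_pos - PRESET_WIDTH / 2, SCREEN_W, PRESET_WIDTH)
    raise ValueError(orientation)

print(detection_from_slide("horizontal", 400))  # vertical strip, cf. FIG. 10
print(detection_from_slide("vertical", 1100))   # horizontal strip, cf. FIG. 11
```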
It should be noted that the sliding region and the first control in this embodiment may be implemented, for example, as shown in the figures above; alternatively, the number, placement, and the like of the sliding regions may be chosen according to actual needs, and the implementation of the first control may likewise be chosen according to actual needs, none of which this embodiment limits.

In the object selection method provided by this embodiment of the present application, by providing a sliding region, the position of the first control within the sliding region can be adjusted according to the movement of the touch point, making the movement of the first control more controllable.
On the basis of the above embodiments, in yet another possible implementation, the first control may not be displayed, or not exist, in advance and be displayed only after being triggered by a relevant operation; for example, the first control may be displayed in the graphical user interface in response to an operation acting on the sliding region. A possible implementation of providing the first control is described below with reference to a specific embodiment.

The description is given with reference to FIG. 12, which is a schematic diagram of an implementation of displaying the first control according to an embodiment of the present application.

Referring to FIG. 12, the graphical user interface may include three sliding regions: the vertical sliding regions 1201 and 1202 and the horizontal sliding region 1203 in FIG. 12. As shown in part 12a of FIG. 12, the first control is not displayed, or does not exist, in advance; at this point only the sliding regions are present.

In one possible implementation, the first control may be controlled to be displayed in the graphical user interface in response to a second operation acting on the sliding region, where the second operation is an operation continuous with the first operation.

Assuming the first operation is a slide operation, the second operation may be a slide operation continuous with the first operation. As shown in part 12b of FIG. 12, in response to the second operation acting on the horizontal sliding region 1203, the first control 1204 is controlled to be displayed in the graphical user interface; the first control may, for example, be displayed in the horizontal sliding region 1203, or at some other position, which is not restricted in this embodiment.

After the first control is displayed, the corresponding detection region may be determined according to the position of the first control in a manner similar to what has been introduced above; details are not repeated here.
On the basis of the above embodiments, the first operation may be an operation acting on the sliding region, in which case the detection region can be obtained directly from the position of the first operation within the sliding region, without a first control having to be provided. This is described below with specific embodiments with reference to FIG. 13 and FIG. 14, where FIG. 13 is a schematic diagram of a detection line according to an embodiment of the present application and FIG. 14 is another schematic diagram of a detection region according to an embodiment of the present application.

For example, referring to FIG. 13, the graphical user interface may include three sliding regions: the vertical sliding regions 1301 and 1302 and the horizontal sliding region 1303 in FIG. 13.

In this embodiment, the first operation is an operation acting on the sliding region and is at least one of the following operations: a tap operation and a slide operation, where the detection region may be a detection line perpendicular to the sliding region, or a region perpendicular to the sliding region whose width value is a preset width.

In one possible implementation, referring to FIG. 13, suppose the first operation acts on the horizontal sliding region 1303; the step of obtaining the detection region corresponding to the first operation may then be, for example:

obtaining a detection line perpendicular to the horizontal sliding region, or a region perpendicular to the horizontal sliding region whose width value is a preset width.
For example, referring to FIG. 13, the detection line 1304 perpendicular to the horizontal sliding region 1303 is obtained, and the detection line 1304 serves as the detection region.
In another possible implementation, referring to FIG. 14, suppose the first operation 1402 acts on the vertical sliding region 1401; the step of obtaining the detection region corresponding to the first operation 1402 may then be, for example:

obtaining a detection line perpendicular to the vertical sliding region, or a region perpendicular to the vertical sliding region whose width value is a preset width.

For example, referring to FIG. 14, obtaining the region perpendicular to the vertical sliding region 1401 whose width value is a preset width yields the detection region 1403.

It should be noted that FIG. 14 differs from FIG. 11 in that the implementation of FIG. 14 does not require a first control to be provided.

In the object selection method provided by this embodiment of the present application, the detection region corresponding to the first operation is obtained directly from the first operation acting on the sliding region, without a first control having to be provided, which improves the simplicity of operation.
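An end-to-end sketch of this control-free variant: a tap or slide position on a sliding region maps directly to a detection region, whose overlapping objects would then be offered as selection controls. Names and constants are again hypothetical:

```python
SCREEN_W, SCREEN_H, PRESET_WIDTH = 1080, 1920, 80   # hypothetical values

def overlaps(a, b) -> bool:
    """Axis-aligned rectangles as (x, y, w, h); any intersection counts."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def select_via_slide(orientation: str, touch_pos: float, objects):
    """The first operation on a sliding region yields a perpendicular
    detection region directly, with no first control involved."""
    if orientation == "horizontal":
        region = (touch_pos - PRESET_WIDTH / 2, 0, PRESET_WIDTH, SCREEN_H)
    else:
        region = (0, touch_pos - PRESET_WIDTH / 2, SCREEN_W, PRESET_WIDTH)
    return [name for name, rect in objects.items() if overlaps(region, rect)]

objs = {"A": (150, 400, 60, 60), "B": (500, 100, 60, 60)}
print(select_via_slide("horizontal", 180, objs))   # -> ['A']
```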
On the basis of the above embodiments, the step in the present application of controlling at least one selection control to be provided in the display region corresponding to the first operation may include:

obtaining the positional relationship, in the game screen, of the at least one game interaction object overlapping the detection region; and

providing at least one selection control in the display region according to the positional relationship.

For example, at least one selection control may be provided in the display region following the positional relationship of the at least one game interaction object in the game screen; or at least one selection control may be provided surrounding the touch point of the first operation according to the positional relationship; the various possible implementations may be further extended according to actual needs, as sketched below.
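One hypothetical way to honor the positional relationship is to sort the controls by each object's on-screen coordinate, so that the controls' order in the display region mirrors the objects' order in the game screen:

```python
def layout_preserving_order(hits, display_region):
    """Order controls by each object's vertical position so the controls
    in the display region mirror the objects' top-to-bottom arrangement."""
    x, y, w, h = display_region
    ordered = sorted(hits.items(), key=lambda kv: kv[1][1])  # sort by object y
    step = h / max(len(ordered), 1)
    return [(name, (x + w / 2, y + step * (i + 0.5)))
            for i, (name, _) in enumerate(ordered)]

hits = {"F": (160, 750, 60, 60), "A": (150, 400, 60, 60),
        "C": (180, 600, 60, 60)}
print(layout_preserving_order(hits, (160, 300, 80, 600)))  # A, C, F order
```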
From what has been introduced above it can be determined that the sliding region in this embodiment may be located at an edge region of the game screen; further, the sliding region in this embodiment may be located within the hot-zone range for single-handed operation of the edge region. The hot-zone range for single-handed operation is described below with reference to FIG. 15 and FIG. 16, where FIG. 15 is a schematic diagram of the hot-zone range for left-handed operation according to an embodiment of the present application and FIG. 16 is a schematic diagram of the hot-zone range for right-handed operation according to an embodiment of the present application.

As shown in FIG. 15, for left-handed operation the easy range, the average range, and the difficult range are as shown in FIG. 15, where the hot-zone range may be, for example, the easy range, or the union of the easy range and the average range.

As shown in FIG. 16, for right-handed operation the easy range, the average range, and the difficult range are as shown in FIG. 16, where the hot-zone range may be, for example, the easy range, or the union of the easy range and the average range.

In one possible implementation, the sliding region may be placed at the edge region by combining the hot-zone range for left-handed operation with the hot-zone range for right-handed operation.

Placing the sliding region within the hot-zone range for single-handed operation of the edge region facilitates single-handed operation, improving the convenience of selecting each game entry region.
On the basis of the above embodiments, when the graphical user interface includes sliding regions, in one possible implementation each sliding region may be set to a hidden state when no operation on it is detected. This is described below with reference to FIG. 17, which is a schematic diagram of a sliding region in a hidden state according to an embodiment of the present application.

The hidden state may be, for example, the transparency state of the sliding region 1701 shown in FIG. 17, the non-display state of the sliding region 1702 shown in FIG. 17, or the dashed-line state of the sliding region 1703 shown in FIG. 17. This embodiment does not limit the specific implementation of the hidden state, as long as the hidden state de-emphasizes the display of the sliding region.

In this embodiment, setting a sliding region to the hidden state when no operation on it is detected streamlines the game screen, improving the user's gaming experience.
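A minimal sketch of the hidden-state behavior, assuming a hypothetical idle timeout; the embodiment itself does not prescribe when or how the region is de-emphasized:

```python
import time

class SlideRegion:
    """Hypothetical sliding region that fades out when left untouched."""
    HIDE_AFTER_S = 3.0   # assumed idle timeout, not specified by the method

    def __init__(self):
        self.last_touch = time.monotonic()

    def on_operation(self) -> None:
        self.last_touch = time.monotonic()   # any operation makes it visible

    def display_state(self) -> str:
        idle = time.monotonic() - self.last_touch
        return "hidden" if idle > self.HIDE_AFTER_S else "visible"

region = SlideRegion()
print(region.display_state())   # "visible" right after an operation
```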
FIG. 18 is a schematic structural diagram of an object selection apparatus according to an embodiment of the present application. As shown in FIG. 18, the apparatus 180 includes: an obtaining module 1801, a providing module 1802, and a processing module 1803.

The obtaining module 1801 is configured to, in response to a first operation acting on the graphical user interface, obtain a detection region corresponding to the first operation;

the obtaining module 1801 is further configured to obtain at least one game interaction object overlapping the detection region;

the providing module 1802 is configured to control at least one selection control to be provided in a display region corresponding to the first operation, where each selection control corresponds to one game interaction object;

the processing module 1803 is configured to, in response to a selection operation acting on at least one selection control, execute the interaction operation of the game interaction object corresponding to the selection control.

In one possible design, the obtaining module 1801 is specifically configured to:

obtain the detection region corresponding to the touch point of the first operation.

In one possible design, the providing module 1802 is further configured to:

provide a first control in the graphical user interface;

where the obtaining module 1801 is specifically configured to:

obtain the touch point of the first operation and determine that the touch point acts on the response region of the first control; and

obtain the detection region of the first control.

In one possible design, the obtaining module 1801 is specifically configured to:

adjust the position of the first control according to the movement of the touch point; and

determine the corresponding detection region according to the position of the first control.

In one possible design, the graphical user interface includes a sliding region located at an edge region of the game screen, and the obtaining module 1801 is specifically configured to:

adjust the position of the first control within the sliding region according to the movement of the touch point.

In one possible design, the providing module 1802 is specifically configured to:

in response to a second operation acting on the sliding region, control the first control to be displayed in the graphical user interface, where the second operation is an operation continuous with the first operation.

In one possible design, the graphical user interface includes a sliding region, the first operation is an operation acting on the sliding region and is at least one of the following operations: a tap operation and a slide operation, and the detection region is a detection line perpendicular to the sliding region or a region perpendicular to the sliding region whose width value is a preset width.

In one possible design, the sliding region includes a horizontal sliding region and/or a vertical sliding region, the horizontal sliding region being located at a horizontal edge region of the graphical user interface and the vertical sliding region at a vertical edge region of the graphical user interface; the obtaining module 1801 is configured to perform at least one of the following steps:

when it is determined that the first operation is located in the horizontal sliding region, obtaining a detection line perpendicular to the horizontal sliding region or a region perpendicular to the horizontal sliding region whose width value is a preset width; and

when it is determined that the first operation is located in the vertical sliding region, obtaining a detection line perpendicular to the vertical sliding region or a region perpendicular to the vertical sliding region whose width value is a preset width.

In one possible design, the providing module 1802 is specifically configured to:

obtain the positional relationship, in the game screen, of the at least one game interaction object overlapping the detection region; and

provide at least one selection control in the display region according to the positional relationship.

In one possible design, the detection region and the display region are the same region.

In one possible design, the obtaining module 1801 is further configured to:

after the at least one game interaction object overlapping the detection region is obtained, obtain the object identifier of the at least one game interaction object;

and the providing module 1802 is further configured to:

display, around each selection control, the object identifier of the game interaction object corresponding to that selection control.

In one possible design, the providing module 1802 is further configured to:

set the sliding region to a hidden state when no operation on the sliding region is detected.

The apparatus provided by this embodiment may be used to execute the technical solutions of the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
FIG. 19 is a schematic diagram of the hardware structure of an object selection device according to an embodiment of the present application. As shown in FIG. 19, the object selection device 190 of this embodiment includes: a processor 1901 and a memory 1902, where

the memory 1902 is configured to store computer-executable instructions; and

the processor 1901 is configured to execute the computer-executable instructions stored in the memory to implement the steps performed by the object selection method in the above embodiments. For details, refer to the relevant descriptions in the foregoing method embodiments.

Optionally, the memory 1902 may be independent or integrated with the processor 1901.

When the memory 1902 is provided independently, the object selection device further includes a bus 1903 for connecting the memory 1902 and the processor 1901.

An embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the object selection method performed by the object selection device above.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative; for instance, the division into modules is only a division by logical function, and other divisions are possible in actual implementation, for example multiple modules may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or modules, and may be electrical, mechanical, or in other forms.

The above integrated modules implemented in the form of software functional modules may be stored in a computer-readable storage medium. The software functional modules are stored in a storage medium and include several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute some of the steps of the methods described in the embodiments of the present application.

It should be understood that the above processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in this disclosure may be directly embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.

The memory may include high-speed RAM and may also include non-volatile memory (NVM), for example at least one magnetic disk memory; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, an optical disk, or the like.

The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, the buses in the drawings of the present application are not limited to only one bus or one type of bus.

The above storage medium may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The storage medium may be any available medium accessible by a general-purpose or special-purpose computer.

A person of ordinary skill in the art will understand that all or some of the steps implementing the above method embodiments may be completed by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when the program is executed, the steps of the above method embodiments are performed; and the aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.

Finally, it should be noted that the above embodiments are merely intended to illustrate, rather than limit, the technical solutions of the present application. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some or all of their technical features can be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
Claims (15)
- An object selection method, wherein a graphical user interface is provided by a first terminal device, the graphical user interface includes a game screen, and the game screen includes at least one game interaction object, the method comprising: in response to a first operation acting on the graphical user interface, obtaining a detection region corresponding to the first operation; obtaining at least one game interaction object overlapping the detection region; controlling at least one selection control to be provided in a display region corresponding to the first operation, wherein each selection control corresponds to one game interaction object; and in response to a selection operation acting on at least one selection control, executing an interaction operation of the game interaction object corresponding to the selection control.
- The method according to claim 1, wherein the obtaining the detection region corresponding to the first operation comprises: obtaining the detection region corresponding to a touch point of the first operation.
- The method according to claim 2, wherein the method further comprises: providing a first control in the graphical user interface; wherein the obtaining the detection region corresponding to the touch point of the first operation comprises: obtaining the touch point of the first operation and determining that the touch point acts on a response region of the first control; and obtaining a detection region of the first control.
- The method according to claim 3, wherein the step of obtaining the detection region of the first control comprises: adjusting a position of the first control according to movement of the touch point; and determining the corresponding detection region according to the position of the first control.
- The method according to claim 4, wherein the graphical user interface includes a sliding region located at an edge region of the game screen, and the step of adjusting the position of the first control according to the movement of the touch point is adjusting the position of the first control within the sliding region according to the movement of the touch point.
- The method according to claim 5, wherein the step of providing the first control in the graphical user interface comprises: in response to a second operation acting on the sliding region, controlling the first control to be displayed in the graphical user interface, wherein the second operation is an operation continuous with the first operation.
- The method according to claim 1, wherein the graphical user interface includes a sliding region, the first operation is an operation acting on the sliding region and is at least one of the following operations: a tap operation and a slide operation, and the detection region is a detection line perpendicular to the sliding region or a region perpendicular to the sliding region whose width value is a preset width.
- The method according to claim 7, wherein the sliding region includes a horizontal sliding region and/or a vertical sliding region, the horizontal sliding region being located at a horizontal edge region of the graphical user interface and the vertical sliding region at a vertical edge region of the graphical user interface; and the step of obtaining the detection region corresponding to the first operation includes at least one of the following steps: when it is determined that the first operation is located in the horizontal sliding region, obtaining a detection line perpendicular to the horizontal sliding region or a region perpendicular to the horizontal sliding region whose width value is a preset width; and when it is determined that the first operation is located in the vertical sliding region, obtaining a detection line perpendicular to the vertical sliding region or a region perpendicular to the vertical sliding region whose width value is a preset width.
- The method according to claim 1, wherein the step of controlling at least one selection control to be provided in the display region corresponding to the first operation comprises: obtaining a positional relationship, in the game screen, of the at least one game interaction object overlapping the detection region; and providing at least one selection control in the display region according to the positional relationship.
- The method according to claim 1, wherein the detection region and the display region are the same region.
- The method according to claim 1, wherein after the obtaining the at least one game interaction object overlapping the detection region, the method further comprises: obtaining an object identifier of the at least one game interaction object; and the method further comprises: displaying, around each selection control, the object identifier of the game interaction object corresponding to that selection control.
- The method according to claim 5, wherein the method further comprises: setting the sliding region to a hidden state when no operation on the sliding region is detected.
- An object selection apparatus, wherein a graphical user interface is provided by a first terminal device, the graphical user interface includes a game screen, and the game screen includes at least one game interaction object, the apparatus comprising: an obtaining module configured to, in response to a first operation acting on the graphical user interface, obtain a detection region corresponding to the first operation; the obtaining module being further configured to obtain at least one game interaction object overlapping the detection region; a providing module configured to control at least one selection control to be provided in a display region corresponding to the first operation, wherein each selection control corresponds to one game interaction object; and a processing module configured to, in response to a selection operation acting on at least one selection control, execute an interaction operation of the game interaction object corresponding to the selection control.
- An object selection device, comprising: a memory configured to store a program; and a processor configured to execute the program stored in the memory, wherein when the program is executed, the processor is configured to perform the method according to any one of claims 1 to 12.
- A computer-readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 12.
Priority Applications (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022525832A JP7416931B2 (ja) | 2020-05-25 | 2020-08-07 | Object selection method and apparatus |
| US17/759,376 US20230066930A1 (en) | 2020-05-25 | 2020-08-07 | Object selection method and apparatus |

Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010450283.5 | 2020-05-25 | | |
| CN202010450283.5A CN111603759B (zh) | 2020-05-25 | 2020-05-25 | Object selection method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021237942A1 true WO2021237942A1 (zh) | 2021-12-02 |
Family
ID=72196765
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2020/107948 WO2021237942A1 (zh) | Object selection method and apparatus | 2020-05-25 | 2020-08-07 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230066930A1 (zh) |
JP (1) | JP7416931B2 (zh) |
CN (1) | CN111603759B (zh) |
WO (1) | WO2021237942A1 (zh) |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140315636A1 (en) * | 2013-04-18 | 2014-10-23 | Screenovate Technologies Ltd. | Method for enabling usage of a computerized mobile device as a game graphical user interface for an application game |
| CN105867684A (zh) * | 2016-03-25 | 2016-08-17 | 乐视控股(北京)有限公司 | Object control method and apparatus |
| CN109771941A (zh) * | 2019-03-13 | 2019-05-21 | 网易(杭州)网络有限公司 | Method, apparatus, device, and medium for selecting virtual objects in a game |
| CN110052021A (zh) * | 2019-04-12 | 2019-07-26 | 网易(杭州)网络有限公司 | Game object processing method, mobile terminal device, electronic device, and storage medium |
Family Cites Families (10)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5798532B2 (ja) | 2012-08-23 | 2015-10-21 | 株式会社Nttドコモ | User interface device, user interface method, and program |
| JP5563633B2 (ja) * | 2012-08-31 | 2014-07-30 | 株式会社スクウェア・エニックス | Video game processing device and video game processing program |
| JP6053500B2 (ja) | 2012-12-21 | 2016-12-27 | 京セラ株式会社 | Portable terminal, and user interface control program and method |
| US10671275B2 (en) * | 2014-09-04 | 2020-06-02 | Apple Inc. | User interfaces for improving single-handed operation of devices |
| CN106708399A (zh) | 2015-11-17 | 2017-05-24 | 天津三星通信技术研究有限公司 | Touch control method and device for an electronic terminal with curved screens on both sides |
| CN105487805B (zh) * | 2015-12-01 | 2020-06-02 | 小米科技有限责任公司 | Object operation method and apparatus |
| US10572054B2 (en) | 2016-10-28 | 2020-02-25 | Nanning Fugui Precision Industrial Co., Ltd. | Interface control method for operation with one hand and electronic device thereof |
| CN108733275A (zh) | 2018-04-28 | 2018-11-02 | 维沃移动通信有限公司 | Object display method and terminal |
| CN109847355B (zh) * | 2019-03-11 | 2020-01-07 | 网易(杭州)网络有限公司 | Game object selection method and apparatus |
| CN116459506A (zh) * | 2019-07-04 | 2023-07-21 | 网易(杭州)网络有限公司 | Game object selection method and apparatus |
Family timeline (2020):

- 2020-05-25: CN application CN202010450283.5A filed; granted as CN111603759B (zh), status Active
- 2020-08-07: US application US17/759,376 filed; published as US20230066930A1 (en), status Pending
- 2020-08-07: PCT application PCT/CN2020/107948 filed as WO2021237942A1 (zh), status Application Filing
- 2020-08-07: JP application JP2022525832A filed; granted as JP7416931B2 (ja), status Active
Also Published As
Publication number | Publication date |
---|---|
JP2023500123A (ja) | 2023-01-04 |
CN111603759A (zh) | 2020-09-01 |
US20230066930A1 (en) | 2023-03-02 |
JP7416931B2 (ja) | 2024-01-17 |
CN111603759B (zh) | 2021-06-22 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20937298; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022525832; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20937298; Country of ref document: EP; Kind code of ref document: A1 |