WO2020098444A1 - Object display method and device, storage medium, and electronic device - Google Patents

Object display method and device, storage medium, and electronic device

Info

Publication number
WO2020098444A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
response
display
game
game interface
Prior art date
Application number
PCT/CN2019/111635
Other languages
English (en)
French (fr)
Inventor
林孔伟
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2020098444A1
Priority to US17/078,059 (US11400375B2)
Priority to US17/848,293 (US20220314116A1)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92 Video game devices specially adapted to be hand-held while playing

Definitions

  • This application relates to the field of computers, and in particular, to an object display technology.
  • Embodiments of the present application provide an object display method, device, storage medium, and electronic device, to at least solve the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of the display space of the game interface.
  • An object operation control method, including: displaying a first object in an object display area in a game interface; and, if a first operation performed on the first object is detected, displaying a second object in the game interface in response to the first operation, wherein the first object is used to control a game character to perform a first target operation in response to a second operation performed on the first object, and the second object is used to control the game character to perform a second target operation in response to a third operation performed on the second object.
  • An object operation control device, including: a first display module, configured to display a first object in an object display area in a game interface; a detection module, configured to detect a first operation performed on the first object, wherein the first object is used to control a game character to perform a first target operation in response to a second operation performed on the first object; and a second display module, configured to, if the detection module detects the first operation performed on the first object, display a second object in the game interface in response to the first operation, wherein the second object is used to control the game character to perform a second target operation in response to a third operation performed on the second object.
  • A storage medium in which a computer program is stored, wherein the computer program is configured to perform, when run, the method described in any one of the above.
  • An electronic device including a memory and a processor, wherein the memory stores a computer program, and the processor is configured to perform, by means of the computer program, the method described in any one of the above.
  • A computer program product including instructions which, when run on a computer, cause the computer to perform the method described in any one of the above.
  • In the embodiments of the present application, the first object is displayed in the object display area in the game interface, and if a first operation performed on the first object is detected, the second object is displayed in the game interface in response to the first operation. The first object is used to control the game character to perform the first target operation in response to the second operation performed on the first object, and the second object is used to control the game character to perform the second target operation in response to the third operation performed on the second object. The game interface is divided into an object display area and a non-object display area, and not all objects need to be displayed in the object display area: the second object is hidden and only the first object is displayed there. When the first operation performed on the first object is detected, the hidden second object is displayed, which saves the space used to display objects in the game interface and reduces the impact that the objects used to control the game character have on the display of the game screen, so that the game interface has more room to display the game screen, other icons, and other information. This achieves the technical effect of saving the display space occupied by objects and improving the utilization of the display space of the game interface, and thereby solves the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of that display space.
  • FIG. 1 is a schematic diagram of an optional object display method according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an application environment of an optional object display method according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of an optional object display method according to an optional embodiment of the present application.
  • FIG. 4 is a schematic diagram of another alternative object display method according to an alternative embodiment of the present application.
  • FIG. 5 is a schematic diagram of another optional object display method according to an optional embodiment of the present application.
  • FIG. 6 is a schematic diagram of another alternative object display method according to an alternative embodiment of the present application.
  • FIG. 7 is a schematic diagram of another optional object display method according to an optional embodiment of the present application.
  • FIG. 8 is a schematic diagram of an optional object display device according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an application scenario of an optional object display method according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of an optional electronic device according to an embodiment of the present application.
  • According to an aspect of the embodiments of the present application, a method for displaying an object is provided. The method can be applied to a terminal, which may be a smart terminal such as a smart phone, a computer, a personal digital assistant (PDA), or a tablet computer.
  • As shown in FIG. 1, the method includes:
  • S102: The terminal displays the first object in the object display area in the game interface.
  • S104: If the terminal detects the first operation performed on the first object, the terminal displays the second object in the game interface in response to the first operation.
  • The first object is used to control the game character to perform the first target operation in response to the second operation performed on the first object, and the second object is used to control the game character to perform the second target operation in response to the third operation performed on the second object.
  • the above object display method may be applied to the hardware environment composed of the terminal 202 shown in FIG. 2.
  • The game client 204 is installed on the terminal 202. The user taps the game client 204 to enter the game, and the game interface 206 is displayed on the full screen of the terminal 202. The game interface 206 displays the game scene, the game character, the icons of the operable first objects (object A and object B), and so on. The area of the game interface where the icons of the first objects (object A and object B) are displayed is the object display area, and the remaining area is the non-object display area. The client 204 displays the first objects (object A and object B) in the object display area (area 1) of the game interface, and if it detects the first operation performed on object A among the first objects, it displays the second objects (object C, object D, and object E) in the game interface in response to the first operation. For example, the second objects may be displayed in the non-object display area (area 2) of the game interface.
  • the above object display method may be, but not limited to, applied to a scene in which objects are displayed on a game interface.
  • The above client may be, but is not limited to, any of various types of game applications, game applets, game websites, and so on, for example side-scrolling action games, casual puzzle games, action shooting games, sports and racing games, chess and board games, business strategy games, or role-playing games. Specifically, the method may be, but is not limited to being, applied to a scene in which objects are displayed on the game interface of a side-scrolling action game or of a role-playing game, to save the display space of the game interface occupied by objects and improve the utilization of that display space. The above is only an example, and this embodiment does not impose any limitation on it.
  • In this embodiment, the objects displayed on the game interface by default may be referred to as first objects, the area of the game interface used to display the first objects may be referred to as the object display area, and all or part of the area of the game interface outside the object display area may be referred to as the non-object display area.
  • The first object and the second object may be, but are not limited to, objects with the same kind of function; for example, they may be different skills of the game character, or different pieces of equipment, props, and so on, of the game character.
  • The first object may include, but is not limited to, one or more objects. Taking skill icons as an example, the game interface may, but is not limited to, display only the single most frequently used skill icon, or display two or more frequently used skill icons; the remaining skill icons are second objects.
  • The first operation may be an operation for triggering the display of the second object. The type of the first operation may be, but is not limited to, a click operation (for example, a click on a touch screen), a touch operation (for example, a touch on a touch screen), a swipe operation (for example, a swipe on a touch screen), a key operation (for example, pressing a key on a keyboard or gamepad), a joystick operation (for example, operating the joystick on a gamepad or a laptop computer), and so on.
  • The second operation may be an operation that triggers controlling the game character to perform the first target operation, and the third operation may be an operation that triggers controlling the game character to perform the second target operation. The types of the second operation and the third operation may be, but are not limited to, click operations, touch operations, swipe operations, key operations, joystick operations, and so on.
  • The second operation and the third operation may be operations of the same type, and the first operation may be of a type different from the type of the second and third operations. For example, the first operation is a touch operation while the second operation and the third operation are click operations: when the terminal detects a touch operation performed on the first object, it displays the second object in the game interface in response to the touch operation, and when the terminal detects a click operation performed on the first object, it controls the game character to perform the first target operation.
  • In some cases the first operation may instead be of the same type as the second and third operations. For example, the first operation, the second operation, and the third operation are all touch operations, but the duration of the first operation differs from that of the second and third operations: when the terminal detects that the duration of a touch operation performed on the first object satisfies a first threshold, it displays the second object in the game interface in response to the touch operation, and when the terminal detects that a touch operation performed on the first object satisfies a second threshold, it controls the game character to perform the first target operation.
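  • The duration-based distinction described above can be pictured with a small sketch. This is only an illustration under assumed values; the 1-second thresholds, the class name, and the callback names are invented here for clarity and are not specified by the embodiment.

```python
import time

# Assumed thresholds (seconds); the embodiment only requires that the two
# durations satisfy different conditions, it does not fix concrete values.
SHOW_SECOND_OBJECT_AFTER = 1.0      # long press: treat as the first operation
TRIGGER_FIRST_TARGET_WITHIN = 1.0   # short press: treat as the second operation


class FirstObjectButton:
    """Tracks one touch on the first object and decides which operation it was."""

    def __init__(self, show_second_object, perform_first_target_operation):
        self._show_second_object = show_second_object
        self._perform_first_target = perform_first_target_operation
        self._touch_started_at = None
        self._expanded = False

    def on_touch_down(self):
        self._touch_started_at = time.monotonic()
        self._expanded = False

    def on_touch_frame(self):
        # Called every frame while the finger stays on the first object.
        if self._expanded or self._touch_started_at is None:
            return
        if time.monotonic() - self._touch_started_at >= SHOW_SECOND_OBJECT_AFTER:
            self._expanded = True
            self._show_second_object()              # first operation detected

    def on_touch_up(self):
        if self._touch_started_at is None:
            return
        held = time.monotonic() - self._touch_started_at
        if not self._expanded and held < TRIGGER_FIRST_TARGET_WITHIN:
            self._perform_first_target()            # second operation detected
        self._touch_started_at = None
```

  • In this sketch the same touch resolves either into "display the hidden second object" or "perform the first target operation" purely by how long it lasts, mirroring the two thresholds mentioned above.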
  • The first target operation and the second target operation are different operations of the same type. Taking objects that are skill icons displayed on the game interface as an example, the first object and the second object are different skills of the game character, and controlling the game character to perform the first target operation or the second target operation may be controlling the game character to display different skill effects (including displaying a skill animation, changing character attributes, and so on).
  • In an optional implementation, taking skills of a game character displayed in a touch-screen game interface as an example, as shown in FIG. 3 the first object is skill A and the second objects are skill B and skill C; as shown in FIG. 4, the object display process includes the following steps:
  • Step 1. The terminal detects an operation of pressing and holding a skill ICON, that is, the terminal detects a touch operation on skill A shown in FIG. 3.
  • Step 2. The terminal pops up the other extended skill ICONs around it, that is, the terminal pops up the ICONs of skill B and skill C around skill A.
  • Step 3. The terminal detects whether the finger is released on the current ICON, that is, whether the above touch operation ends on the ICON of skill A. If yes, step 10 is performed; otherwise step 4 is performed.
  • Step 4. The terminal continues to display the other extended skills, that is, if the terminal does not detect that the touch operation ends on the ICON of skill A, it keeps displaying skill B and skill C.
  • Step 5. The terminal detects whether the finger is swiped away from the current ICON, that is, whether the touch operation has turned into a swipe operation. If yes, step 6 is performed; otherwise the flow returns to step 4.
  • Step 6. The terminal detects whether there is an extended skill ICON at the position where the finger stays, that is, whether the swipe operation has moved to the display position of skill B or skill C. If yes, step 7 is performed; otherwise step 8 is performed.
  • Step 7. The terminal selects the skill ICON at the finger position, that is, the skill corresponding to the position to which the swipe has moved is selected as the second object, for example the icon of skill C, and the flow continues with step 9.
  • Step 8. The terminal selects the skill ICON closest to the finger, that is, the terminal compares the distance between the position reached by the swipe operation and skill B and skill C; for example, if the position is closer to skill C, skill C is selected.
  • Step 9. The terminal detects whether the finger is lifted, that is, whether the swipe operation ends at the current position. If yes, step 10 is performed; otherwise the flow returns to step 4.
  • Step 10. The current skill is used, that is, the terminal controls the game character to release skill C.
  • Step 11. The terminal replaces the ICON on the UI with the skill that was used, that is, the terminal replaces the icon of skill A displayed in the object display area of the game interface with the icon of skill C.
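  • Read as an algorithm, steps 1 to 11 amount to a small long-press, drag, and release flow. The sketch below is one hypothetical way to organize it; the geometry helper, the hit radius, and the callback names are assumptions made for illustration only.

```python
import math
from dataclasses import dataclass


@dataclass
class SkillIcon:
    name: str
    x: float
    y: float


def nearest_icon(icons, x, y):
    """Step 8: pick the icon closest to the finger position."""
    return min(icons, key=lambda icon: math.hypot(icon.x - x, icon.y - y))


class SkillExpander:
    """Long-press skill A -> expand B/C -> drag -> release the selected skill."""

    def __init__(self, main_icon, expanded_icons, release_skill, replace_main_icon):
        self.main_icon = main_icon
        self.expanded_icons = expanded_icons
        self.release_skill = release_skill          # step 10
        self.replace_main_icon = replace_main_icon  # step 11
        self.selected = main_icon                   # step 3: releasing in place uses skill A

    def on_long_press(self):
        # Steps 1-2: show the extended skill icons around the pressed one.
        return self.expanded_icons

    def on_drag(self, x, y, hit_radius=40.0):
        # Steps 5-8: if the finger rests on an icon, select it; otherwise
        # fall back to the nearest extended icon.
        candidate = nearest_icon([self.main_icon] + self.expanded_icons, x, y)
        if math.hypot(candidate.x - x, candidate.y - y) <= hit_radius:
            self.selected = candidate
        else:
            self.selected = nearest_icon(self.expanded_icons, x, y)

    def on_release(self):
        # Steps 9-11: use the selected skill and promote its icon to the fixed slot.
        self.release_skill(self.selected)
        if self.selected is not self.main_icon:
            self.replace_main_icon(self.selected)
```

  • A caller would create `SkillExpander(skill_a, [skill_b, skill_c], ...)` when the long press is recognized, feed it finger coordinates while the finger moves, and call `on_release` when the finger lifts.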
  • It can be seen that, through the above steps, the game interface on the terminal is divided into an object display area and a non-object display area, and not all objects need to be displayed in the object display area: the second object is hidden and only the first object is displayed there. When the terminal detects the first operation performed on the first object, the hidden second object is displayed in the game interface. This saves the space used to display objects in the game interface and reduces the impact that the objects used to control the game character have on the display of the game screen, so that the game interface has more room to display the game screen, other icons, and other information. The display space occupied by objects is thus saved and the utilization of the game interface's display space is improved, which solves the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of that display space.
  • the terminal may display the second object on the non-object display area in the game interface in response to the first operation.
  • the terminal may also display the second object in the object display area, for example, replacing the first object affected by the first operation with the second object.
  • As an optional solution, the terminal detecting the first operation performed on the first object includes:
  • S1: The terminal detects the operation performed on the first object.
  • S2: If the terminal detects that one of the following operations is performed on the first object, it determines that the first operation is detected: a first touch operation, a first click operation, or a first swipe operation, where the first touch operation is a touch operation performed on the first object whose duration satisfies a first condition, the first click operation is a click operation performed on the first object that triggers a first number of clicks within a first time period, and the first swipe operation is a swipe operation performed on the first object that starts from the first object and swipes in a first direction.
  • The terminal may configure different kinds of operations as the first operation for triggering the display of the second object, for example: a touch operation on the screen that lasts for a period of time, an operation that triggers a certain number of clicks within a certain time period (a single click, double click, triple click, and so on), a swipe operation starting from the first object, and so on.
  • For example, the terminal detects a click operation performed on the first object and displays the second object, or the terminal detects that a touch operation performed on the first object has lasted for 2 seconds and displays the second object, and so on.
  • For first objects of different categories, the first operation for triggering the display of the second object may be a different operation. For example, several kinds of first objects are displayed on the game interface: skill icons, settings icons, prop icons, and so on. The terminal may display one icon of each kind in the object display area as a first object, hide the other icons as second objects, and configure, for each kind of icon, an operation for triggering the display of its second objects. For example, when the terminal detects a click operation on the skill icon it displays the second objects among the skill icons; when it detects a double-click operation on the settings icon it displays the second objects among the settings icons; and when it detects a touch operation on the prop icon lasting 2 seconds it displays the second objects among the prop icons. The terminal may also configure the same triggering operation for first objects of different categories.
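  • One way to picture this per-category configuration is a lookup from icon category to the gesture that expands its hidden objects. The categories and gesture names below are assumptions for illustration, not values given in the application.

```python
# Hypothetical mapping: which gesture reveals the hidden (second) objects
# for each category of first object shown in the object display area.
EXPAND_GESTURE_BY_CATEGORY = {
    "skill": "single_click",
    "settings": "double_click",
    "prop": ("touch_hold", 2.0),   # hold for 2 seconds
}


def is_expand_gesture(category, gesture, hold_seconds=0.0):
    """Return True if this gesture should reveal the category's hidden objects."""
    expected = EXPAND_GESTURE_BY_CATEGORY.get(category)
    if isinstance(expected, tuple):
        kind, min_hold = expected
        return gesture == kind and hold_seconds >= min_hold
    return gesture == expected


assert is_expand_gesture("prop", "touch_hold", hold_seconds=2.5)
assert not is_expand_gesture("skill", "double_click")
```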
  • The terminal displaying the second object in the non-object display area of the game interface includes:
  • The terminal obtains, from the display objects and hidden objects that have a correspondence, the hidden object corresponding to the first object as the second object, where the display objects include the first object, the hidden objects include the second object, and a display object is an object displayed in the object display area of the game interface;
  • The terminal determines, in the non-object display area, a target area for displaying the hidden object corresponding to the first object; and
  • The terminal displays the second object in the target area.
  • The objects that can be displayed in the game interface can be divided into two types, display objects and hidden objects, and the terminal can configure the correspondence between them, that is, the terminal determines which hidden objects to display when it detects the first operation performed on a given display object. The display objects include the first object, and the second object is a hidden object corresponding to the first object. The correspondence between objects may be configured as described above. Alternatively, the terminal may configure a correspondence between object display areas and objects, with each display area corresponding to one or more objects: the terminal displays one of those objects in the corresponding display area as the display object and hides the others as hidden objects, and when the terminal detects the first operation, all objects corresponding to that display area can be displayed, including the first object.
  • For example, object 1 and object 2 are first objects, and object 3 to object 5 are second objects. The terminal may configure object 1 to correspond to object 3 and object 4, and object 2 to correspond to object 5; the game interface displays object 1 and object 2, and when the terminal detects the first operation performed on object 1, it displays object 3 and object 4. Alternatively, the terminal may set the object display area to include area 1 and area 2, configure area 1 to correspond to object 1, object 3, and object 4, and area 2 to correspond to object 2 and object 5, and display object 1 in area 1 and object 2 in area 2; when the terminal detects the first operation performed on object 1, it displays object 1, object 3, and object 4, and when it detects the first operation performed on object 2, it displays object 2 and object 5.
  • The display objects and hidden objects that have a correspondence may be stored, but are not limited to being stored, as a table of display object identifiers and hidden object identifiers. If the terminal detects the first operation performed on the first object, it obtains, in response to the first operation, the identifier of the first object, determines from the table the identifier of the hidden object corresponding to the identifier of the first object, obtains from the storage space the display icon of the object identified by the hidden object identifier, and displays the obtained icon in the non-object display area of the game interface as the icon of the second object; operating on the object identified by the hidden object identifier, as the second object, can then control the game character to perform the second target operation.
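  • A minimal sketch of this table-style correspondence: look up the first object's identifier, fetch the hidden objects' icons from storage, and draw them in the non-object display area. The identifiers, icon paths, and the drawing callback are hypothetical names introduced here.

```python
# Hypothetical correspondence table: displayed object id -> hidden object ids.
HIDDEN_BY_DISPLAYED = {
    "skill_A": ["skill_B", "skill_C"],
    "object_2": ["object_5"],
}

# Hypothetical icon store standing in for "the storage space" in the text.
ICON_STORE = {
    "skill_B": "icons/skill_B.png",
    "skill_C": "icons/skill_C.png",
    "object_5": "icons/object_5.png",
}


def second_objects_for(first_object_id):
    """Resolve the hidden objects (second objects) bound to a displayed first object."""
    hidden_ids = HIDDEN_BY_DISPLAYED.get(first_object_id, [])
    return [(hid, ICON_STORE[hid]) for hid in hidden_ids]


def on_first_operation(first_object_id, draw_in_non_object_area):
    # Display each hidden object's icon in the non-object display area;
    # the drawn icons then respond to the third operation.
    for hidden_id, icon_path in second_objects_for(first_object_id):
        draw_in_non_object_area(hidden_id, icon_path)
```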
  • the terminal determining the target area for displaying the hidden object corresponding to the first object on the non-object display area includes:
  • the terminal determines the area on the non-object display area whose distance to the first object falls within the target threshold range as the target area.
  • the terminal displaying the second object on the target area includes:
  • the terminal obtains the number of objects of the second object
  • the terminal divides a corresponding area for each of the second objects on the target area according to the number of objects to display.
  • The area of the non-object display area whose distance from the first object falls within the target threshold range may be a circular area centred on the position where the first object is displayed, with a radius within the target threshold range, or a partial circular area lying in a certain direction from the first object (for example, above it). The terminal divides the target area according to the number of second objects to be displayed, and then displays the second objects in the corresponding sub-areas after the division.
  • For example, a first object is displayed on the game interface, and the area displaying the first object serves as the object display area; the area of the game interface other than this area is the non-object display area. The one-third circle area near the first object is determined as the target area. The number of second objects is two, namely object 1 and object 2, so the terminal divides the above one-third circle area into two parts and displays one object in each.
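  • The arc division in this example can be computed by splitting a one-third circle (120 degrees) evenly among the second objects and placing one icon at the centre of each slice. The radius and starting angle below are illustrative assumptions.

```python
import math


def place_second_objects(center_x, center_y, radius, count,
                         arc_start_deg=90.0, arc_span_deg=120.0):
    """Return (x, y) display positions for `count` second objects on an arc
    around the first object; each object gets an equal slice of the arc."""
    positions = []
    slice_deg = arc_span_deg / count
    for i in range(count):
        # Centre of the i-th slice.
        angle = math.radians(arc_start_deg + (i + 0.5) * slice_deg)
        positions.append((center_x + radius * math.cos(angle),
                          center_y + radius * math.sin(angle)))
    return positions


# Example: two second objects (object 1 and object 2) around the first object.
print(place_second_objects(0.0, 0.0, radius=100.0, count=2))
```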
  • It should be noted that the position where the second object is displayed is not necessarily in the vicinity of the first object; it may be any position in the non-object display area of the game interface, which is not limited in this embodiment. For example, if the first object is displayed in the lower right corner of the game interface, the second object can be displayed in the upper left corner of the game interface, which can also facilitate operating the game with two hands.
  • If the terminal detects the second operation performed on the first object, it controls the game character to perform the first target operation in response to the second operation; and if the terminal detects the third operation performed on the second object, it controls the game character to perform the second target operation in response to the third operation. When the first object and the second object are displayed on the game interface at the same time, the user can select the object to operate on, and the terminal executes the corresponding process in response to the user's operation. For example, after the terminal detects the first operation performed on the first object and the second object is displayed as a result, if the terminal detects the second operation performed on the first object it controls the game character to release the skill corresponding to the first object (the first target operation), and if the terminal detects the third operation performed on the second object it controls the game character to release the skill corresponding to the second object (the second target operation).
  • Optionally, each object may also have a cooldown after it is triggered, and an object within its cooldown may be set so that it cannot be triggered again before the cooldown ends, while still being displayed in the game interface in this state. If the terminal detects the second operation performed on the first object, the first object enters its cooldown while the terminal controls the game character to perform the first target operation in response to the second operation.
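  • A small sketch of the cooldown bookkeeping described above: once the target operation fires, the object enters its cooldown and further triggers are ignored until the cooldown ends. The 5-second duration is an assumed value, not one stated in the application.

```python
import time


class CooldownGate:
    """Blocks an object from being triggered again until its cooldown expires."""

    def __init__(self, cooldown_seconds=5.0):
        self.cooldown_seconds = cooldown_seconds
        self._ready_at = 0.0

    def try_trigger(self, perform_target_operation):
        now = time.monotonic()
        if now < self._ready_at:
            return False                      # still cooling down; ignore the operation
        perform_target_operation()
        self._ready_at = now + self.cooldown_seconds
        return True
```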
  • If the terminal detects the third operation performed on the second object, controlling the game character to perform the second target operation in response to the third operation includes: the terminal detects the operation performed on the second object; if the terminal detects that one of the following operations is performed on the second object, it determines that the third operation is detected: a second touch operation, a second click operation, or a second swipe operation, where the second touch operation is a touch operation performed on the second object whose duration satisfies a second condition, the second click operation is a click operation performed on the second object that triggers a second number of clicks within a second time period, and the second swipe operation is a swipe operation performed on the second object that starts from the first object and ends at the second object, or a swipe operation that starts from the second object and swipes in a second direction; and the terminal controls the game character to perform the second target operation in response to the third operation.
  • The third operation for triggering the function of the second object may thus take various forms, such as a second touch operation, a second click operation, or a second swipe operation. The second touch operation is a touch operation performed on the second object whose duration satisfies the second condition (for example, a touch operation on the second object lasting at least 2 seconds and at most 5 seconds; if the touch lasts more than 5 seconds, the trigger on the second object can be cancelled). The second click operation is a click operation performed on the second object that triggers a second number of clicks within the second time period (for example, a single click, double click, triple click, and so on). The second swipe operation is a swipe operation performed on the second object that starts from the first object and ends at the second object (for example, swiping from the first object to the second object), or a swipe operation that starts from the second object and swipes in the second direction.
  • the second object includes multiple objects, and in the case that the terminal detects the third operation performed on the second object, controlling the game character to perform the second target operation in response to the third operation includes:
  • the terminal determines the positional relationship between the operation position of the third operation and multiple objects;
  • the terminal determines an object whose positional relationship among the multiple objects satisfies the target positional relationship as the target object;
  • the terminal controls the game character to perform the second target operation corresponding to the target object.
  • The terminal may determine the target object that responds to the third operation according to the positional relationship between the operation position of the third operation and the display positions of the multiple objects included in the second object. For example, as shown in FIG. 7, a first object (the icon of skill 1) is displayed in the object display area of the game interface. The terminal detects that the user's touch operation on skill 1 has lasted for 2 seconds (the black dot in FIG. 7), and in response to the touch operation displays the second objects (skill 2 as object 2 and skill 3 as object 3). The terminal then detects that the user's finger, without leaving the screen, moves from the position of the skill 1 icon to the position of the skill 2 icon and then leaves the screen (the white dot in FIG. 7); it therefore determines skill 2 as the target object and controls the game character to release skill 2.
  • The terminal determining, as the target object, an object among the multiple objects whose positional relationship satisfies the target positional relationship includes one of the following: the terminal determines, as the target object, the object corresponding to the target area of the non-object display area in which the operation position falls; or, the terminal determines, as the target object, the object closest to the operation position among the multiple objects. That is, the terminal may determine as the target object the object corresponding to the area within which the operation position of the third operation falls.
  • the terminal may also determine the object closest to the operation position of the third operation as the target object.
  • the manner in which the terminal determines the target object according to the position relationship is not limited to this, and this embodiment does not limit this.
  • the terminal may also determine an object through which a connection line between the operation position of the third operation and the first object passes as a target object.
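  • The two main selection rules above (the sub-area the operation position falls in, or the nearest object) can be sketched as follows; the coordinates, radii, and object names are illustrative assumptions.

```python
import math


def target_by_area(release_pos, areas):
    """Rule 1: return the object whose target area contains the release position.
    `areas` maps object id -> ((cx, cy), radius) of its sub-area."""
    x, y = release_pos
    for object_id, ((cx, cy), radius) in areas.items():
        if math.hypot(cx - x, cy - y) <= radius:
            return object_id
    return None


def target_by_distance(release_pos, object_positions):
    """Rule 2: fall back to the object closest to the release position."""
    x, y = release_pos
    return min(object_positions,
               key=lambda oid: math.hypot(object_positions[oid][0] - x,
                                          object_positions[oid][1] - y))


areas = {"skill_2": ((120.0, 40.0), 35.0), "skill_3": ((80.0, 100.0), 35.0)}
positions = {oid: center for oid, (center, _) in areas.items()}
release = (150.0, 55.0)
target = target_by_area(release, areas) or target_by_distance(release, positions)
print(target)  # skill_2: the release point falls inside its sub-area
```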
  • The method further includes: the terminal replaces the first object displayed in the object display area of the game interface with the second object. That is, after the second object is used, the terminal may replace the first object displayed in the object display area with the selected second object, so that the function of the second object can be triggered again conveniently. The game records the correspondence between the first object displayed in the object display area and the first target operation that the game character is controlled to perform by performing the second operation on the first object; this recorded correspondence can be replaced with the correspondence between the second object and the second target operation that the game character is controlled to perform by performing the third operation on the second object. That is to say, the terminal treats the second object as a new display object and treats the related information of the second object as the information of the display object.
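  • The replacement can be pictured as swapping entries in the correspondence record: the used second object becomes the new display object and the former first object joins its hidden group. The data layout and names below are assumptions for illustration.

```python
def promote_second_object(display_slot, used_object, hidden_by_displayed):
    """Put `used_object` into the fixed display slot and fold the old first
    object back into the hidden group, keeping the correspondence table valid."""
    old_first = display_slot["object"]
    if used_object == old_first:
        return display_slot                       # nothing to swap
    hidden = hidden_by_displayed.pop(old_first, [])
    hidden = [old_first] + [h for h in hidden if h != used_object]
    hidden_by_displayed[used_object] = hidden     # used for later second/third operations
    display_slot["object"] = used_object
    return display_slot


slot = {"object": "skill_A"}
table = {"skill_A": ["skill_B", "skill_C"]}
promote_second_object(slot, "skill_C", table)
print(slot, table)   # skill_C is now displayed; skill_A and skill_B are hidden
```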
  • The method further includes one of the following: if the terminal detects the first operation performed on the second object, the terminal displays the first object in the non-object display area of the game interface in response to the first operation; or, if the terminal detects the third operation performed on the second object, the terminal controls the game character to perform the second target operation in response to the third operation. In this case, the first object may be configured as a hidden object corresponding to the second object: if the terminal detects, on the second object, the first operation used to trigger the display of hidden objects, it may display the first object in the non-object display area, and if it then detects the second operation performed on the first object, it may control the game character to perform the first target operation in response to the second operation. The second object displayed by the terminal in the object display area may also directly control the game character to perform the second target operation in response to a detected third operation performed on it.
  • The first operation is an operation for triggering the display of hidden objects on the game interface, so the first operation performed on the second object triggers the display of the first object, which is now a hidden object, on the game interface. The operation performed on the second object to trigger the display of hidden objects may also be different from the first operation performed on the first object for the same purpose; for example, performing a click operation on the first object can trigger the display of the second object, while performing a touch operation on the second object can trigger the display of the first object.
  • an object display device for implementing the above object display method. As shown in FIG. 8, the device includes:
  • The first display module 82 is configured to display the first object in the object display area in the game interface; the detection module 84 is configured to detect the first operation performed on the first object, wherein the first object is used to control the game character to perform the first target operation in response to the second operation performed on the first object; and the second display module 86 is configured to, if the detection module 84 detects the first operation performed on the first object, display the second object in the game interface in response to the first operation, wherein the second object is used to control the game character to perform the second target operation in response to the third operation performed on the second object.
  • Through the above modules, the game interface is divided into an object display area and a non-object display area, and not all objects need to be displayed in the object display area: the second object is hidden and only the first object is displayed there. When the first operation performed on the first object is detected, the hidden second object is displayed in the game interface, which saves the space used to display objects and reduces the impact that the objects used to control the game character have on the display of the game screen, so that the game interface has more room to display the game screen, other icons, and other information. This achieves the technical effect of saving the display space occupied by objects and improving the utilization of the game interface's display space, and further solves the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of that display space.
  • the second display module is configured to display the second object on a non-object display area in the game interface in response to the first operation.
  • the detection module includes:
  • a first detection unit configured to detect the operation performed on the first object
  • The first determining unit is configured to determine that the first operation is detected if one of the following operations is performed on the first object: a first touch operation, a first click operation, or a first swipe operation, wherein the first touch operation is a touch operation performed on the first object whose duration satisfies a first condition, the first click operation is a click operation performed on the first object that triggers a first number of clicks within a first time period, and the first swipe operation is a swipe operation performed on the first object that starts from the first object and swipes in a first direction.
  • the second display module includes:
  • An obtaining unit, configured to obtain, in response to the first operation, the hidden object corresponding to the first object as the second object from the display objects and hidden objects that have a correspondence, wherein the display objects include the first object, the hidden objects include the second object, and a display object is an object displayed in the object display area of the game interface;
  • a determining unit configured to determine a target area for displaying the hidden object corresponding to the first object on the non-object display area
  • the display unit is used to display the second object on the target area.
  • the determining unit includes:
  • the first determining subunit is used to determine an area on the non-object display area whose distance to the first object falls within the target threshold range as the target area.
  • the display unit includes:
  • The dividing subunit is configured to divide the target area, according to the number of objects, into a corresponding sub-area for each object in the second object for display.
  • the device also includes one of the following:
  • the first control module is configured to control the game character to execute the first target operation in response to the second operation when it is detected that the second operation is performed on the first object;
  • the second control module is configured to control the game character to perform the second target operation in response to the third operation when it is detected that the third operation is performed on the second object.
  • the second control module includes:
  • a second detection unit configured to detect the operation performed on the second object
  • The second determining unit is configured to determine that the third operation is detected if one of the following operations is performed on the second object: a second touch operation, a second click operation, or a second swipe operation, where the second touch operation is a touch operation performed on the second object whose duration satisfies a second condition, the second click operation is a click operation performed on the second object that triggers a second number of clicks within a second time period, and the second swipe operation is a swipe operation performed on the second object that starts from the first object and ends at the second object, or a swipe operation that starts from the second object and swipes in a second direction; and
  • the first control unit is configured to control the game character to execute the second target operation in response to the third operation.
  • the second object includes multiple objects, where the second control module includes:
  • a third determining unit configured to determine the positional relationship between the operation position of the third operation and the plurality of objects when it is detected that the third operation is performed on the second object;
  • a fourth determining unit configured to determine an object whose positional relationship among the plurality of objects satisfies the target positional relationship as the target object
  • the second control unit is configured to control the game character to execute the second target operation corresponding to the target object in response to the third operation.
  • the fourth determining unit includes one of the following:
  • a second determining subunit configured to determine the object corresponding to the target area where the operation position falls on the non-object display area as the target object
  • the third determining subunit is used to determine the object closest to the operation position among the plurality of objects as the target object.
  • the device further includes:
  • the replacement module is used to replace the first object displayed on the object display area in the game interface with the second object.
  • the device also includes one of the following:
  • a third display module configured to display the first object on the non-object display area in the game interface in response to the first operation when it is detected that the first operation is performed on the second object;
  • the third control module is configured to control the game character to perform the second target operation in response to the third operation when it is detected that the third operation is performed on the second object.
  • the application environment in the embodiment of the present application may be, but not limited to, refer to the application environment in the foregoing embodiment, which will not be repeated in this embodiment.
  • The embodiments of the present application further provide an optional specific application example for implementing the above object display method.
  • the above object display method may be, but not limited to, applied to a scene in which objects are displayed in a game interface as shown in FIG. 9.
  • In this scenario, a single-skill UI is provided, and multiple skills are intelligently expanded from it according to the player's operation, to meet the player's need to use skills in battle under the limitation on the number of UI elements in a mobile game.
  • When the player presses and holds the ICON on the fixed skill UI in the game with a finger, the terminal pops up several different skill ICONs around it according to the system settings or the player's own settings. The player can then swipe the finger toward the desired skill ICON and lift the finger on the skill ICON to be used, at which point the terminal controls the player character in the game to release the skill corresponding to that ICON. When the player lifts the finger, the skill ICONs that popped up around the fixed ICON disappear, and the ICON on the fixed skill UI is replaced with the ICON of the released skill, so that the player can quickly use it again. In other words, the terminal expands all the skill buttons folded under the current skill button, activates the sub-layer skills according to the position to which the finger slides, and decides which skill to release according to the skill activation area in which the finger is lifted.
  • Specifically, the terminal detects the coordinate range of the press. If the press coordinates fall within the detection range of skill button A, the terminal divides the area around skill button A, outside the button and within a radius R, into several sub-areas, displays the sub-skill panel S and its sub-skill icons, and binds each sub-skill to its sub-area. The terminal then cyclically detects the coordinate position of the player's finger.
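  • A compact sketch of this detection: hit-test the press against skill button A's detection range, and on each cycle map the finger coordinate to the sub-area, within radius R, to which a sub-skill is bound. The angular layout, radii, and sub-skill names are assumptions made for illustration.

```python
import math


def hit_skill_button(press, button_center, button_radius):
    """Is the press inside skill button A's detection range?"""
    return math.hypot(press[0] - button_center[0],
                      press[1] - button_center[1]) <= button_radius


def active_sub_skill(finger, button_center, radius_r, sub_skill_angles):
    """Map the current finger position to the bound sub-skill, if any.
    `sub_skill_angles` maps skill name -> (start_deg, end_deg) of its sub-area."""
    dx, dy = finger[0] - button_center[0], finger[1] - button_center[1]
    if math.hypot(dx, dy) > radius_r:
        return None                                  # outside the sub-skill panel S
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    for skill, (start, end) in sub_skill_angles.items():
        if start <= angle < end:
            return skill
    return None


center = (0.0, 0.0)
if hit_skill_button((5.0, 5.0), center, button_radius=30.0):
    panel = {"sub_skill_1": (60.0, 120.0), "sub_skill_2": (120.0, 180.0)}
    # Cyclic detection would call this every frame with the latest coordinate.
    print(active_sub_skill((10.0, 50.0), center, radius_r=80.0, sub_skill_angles=panel))
```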
  • This operation mode fully matches the player's usual habits of using a terminal such as a mobile phone, and an unlimited number of expanded skills can be accommodated.
  • An electronic device for implementing the above object display method is further provided. As shown in FIG. 10, the electronic device includes: one or more processors 1002 (only one is shown in the figure), a memory 1004, a sensor 1006, an encoder 1008, and a transmission device 1010.
  • a computer program is stored in the memory, and the processor is configured to execute the steps in any one of the foregoing method embodiments through the computer program.
  • the above-mentioned electronic device may be located in at least one network device among multiple network devices of the computer network.
  • The foregoing processor may be configured to perform the following steps through the computer program: displaying the first object in the object display area in the game interface; and, if the first operation performed on the first object is detected, displaying the second object in the game interface in response to the first operation, wherein the first object is used to control the game character to perform the first target operation in response to the second operation performed on the first object, and the second object is used to control the game character to perform the second target operation in response to the third operation performed on the second object.
  • The structure shown in FIG. 10 is only illustrative, and the electronic device may also be a terminal device such as a smart phone (for example, an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), or a PAD. FIG. 10 does not limit the structure of the above electronic device; for example, the electronic device may further include more or fewer components than those shown in FIG. 10 (such as a network interface or a display device), or have a configuration different from that shown in FIG. 10.
  • The memory 1002 may be used to store software programs and modules, such as the program instructions/modules corresponding to the object display method and device in the embodiments of the present application. The processor 1004 runs the software programs and modules stored in the memory 1002 to execute various functional applications and data processing, that is, to implement the above object display method.
  • the memory 1002 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 1002 may further include memories remotely provided with respect to the processor 1004, and these remote memories may be connected to the terminal through a network. Examples of the above network include but are not limited to the Internet, intranet, local area network, mobile communication network, and combinations thereof.
  • the transmission device 1010 described above is used to receive or transmit data via a network.
  • Specific examples of the aforementioned network may include a wired network and a wireless network.
  • the transmission device 1010 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices and routers through a network cable to communicate with the Internet or a local area network.
  • the transmission device 1010 is a radio frequency (RF) module, which is used to communicate with the Internet in a wireless manner.
  • the memory 1002 is used to store application programs.
  • An embodiment of the present application further provides a storage medium in which a computer program is stored, wherein the computer program is configured to execute any of the steps in the above method embodiments during runtime.
  • the storage medium is further configured to store a computer program for performing the steps included in the method in the foregoing embodiment, which will not be repeated in this embodiment.
  • The storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • If the integrated unit in the above embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the above computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the existing technology, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for enabling one or more computer devices (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application.
  • the disclosed client may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • For example, the division into units is only a division by logical function, and there may be other ways of dividing them in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, units or modules, and may be in electrical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or software functional unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An object display method and device, a storage medium, and an electronic device. The method includes: displaying a first object in an object display area in a game interface (S102); and, if a first operation performed on the first object is detected, displaying a second object in the game interface in response to the first operation (S104). The first object is used to control a game character to perform a first target operation in response to a second operation performed on the first object, and the second object is used to control the game character to perform a second target operation in response to a third operation performed on the second object. This solves the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of the display space of the game interface.

Description

Object display method and device, storage medium, and electronic device
This application claims priority to Chinese Patent Application No. 201811361960.5, filed with the Chinese Patent Office on November 15, 2018 and entitled "Object display method and device, storage medium, and electronic device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of computers, and in particular to an object display technology.
Background
In existing game interface designs, the relationship between an object in the game interface (for example, a skill of a game character) and its icon in the operation interface (UI-ICON) is one-to-one: clicking an ICON on the UI performs a fixed operation, for example releasing a fixed skill. With this approach, in which the ICON at one object's UI position corresponds to one fixed operation, the number of operations that can be brought into a single game session is limited by two aspects of the experience: the screen size and the trigger range. If more UI elements are added without changing the size of the UI and ICONs, the actual visible area of the game becomes smaller; but if the UI and ICONs are shrunk to accommodate more UI elements, the trigger range of the object ICONs is shrunk along with them.
Summary
Embodiments of the present application provide an object display method and device, a storage medium, and an electronic device, to at least solve the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of the display space of the game interface.
According to one aspect of the embodiments of the present application, an object operation control method is provided, including: displaying a first object in an object display area in a game interface; and, if a first operation performed on the first object is detected, displaying a second object in the game interface in response to the first operation, wherein the first object is used to control a game character to perform a first target operation in response to a second operation performed on the first object, and the second object is used to control the game character to perform a second target operation in response to a third operation performed on the second object.
According to another aspect of the embodiments of the present application, an object operation control device is further provided, including: a first display module, configured to display a first object in an object display area in a game interface; a detection module, configured to detect a first operation performed on the first object, wherein the first object is used to control a game character to perform a first target operation in response to a second operation performed on the first object; and a second display module, configured to, if the detection module detects the first operation performed on the object, display a second object in the game interface in response to the first operation, wherein the second object is used to control the game character to perform a second target operation in response to a third operation performed on the second object.
According to another aspect of the embodiments of the present application, a storage medium is further provided, in which a computer program is stored, wherein the computer program is configured to perform, when run, the method described in any one of the above.
According to another aspect of the embodiments of the present application, an electronic device is further provided, including a memory and a processor, wherein the memory stores a computer program and the processor is configured to perform, by means of the computer program, the method described in any one of the above.
According to another aspect of the embodiments of the present application, a computer program product is further provided, including instructions which, when run on a computer, cause the computer to perform the method described in any one of the above.
In the embodiments of the present application, a first object is displayed in an object display area in a game interface, and if a first operation performed on the first object is detected, a second object is displayed in the game interface in response to the first operation, wherein the first object is used to control a game character to perform a first target operation in response to a second operation performed on the first object, and the second object is used to control the game character to perform a second target operation in response to a third operation performed on the second object. The game interface is divided into an object display area and a non-object display area, and not all objects need to be displayed in the object display area: the second object is hidden and only the first object is displayed there. When the first operation performed on the first object is detected, the hidden second object is displayed, which saves the space used to display objects in the game interface and reduces the impact that the objects used to control the game character have on the display of the game screen, so that the game interface has more room to display the game screen, other icons, and other information. This achieves the technical effect of saving the display space occupied by objects and improving the utilization of the display space of the game interface, and thereby solves the technical problem in the related art that objects occupy a large amount of display space in the game interface, resulting in low utilization of the display space of the game interface.
Brief Description of the Drawings
The drawings described here are used to provide a further understanding of the present application and constitute a part of the present application. The illustrative embodiments of the present application and their descriptions are used to explain the present application and do not constitute an improper limitation on it. In the drawings:
FIG. 1 is a schematic diagram of an optional object display method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an application environment of an optional object display method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an optional object display method according to an optional implementation of the present application;
FIG. 4 is a schematic diagram of another optional object display method according to an optional implementation of the present application;
FIG. 5 is a schematic diagram of another optional object display method according to an optional implementation of the present application;
FIG. 6 is a schematic diagram of another optional object display method according to an optional implementation of the present application;
FIG. 7 is a schematic diagram of another optional object display method according to an optional implementation of the present application;
FIG. 8 is a schematic diagram of an optional object display device according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an application scenario of an optional object display method according to an embodiment of the present application; and
FIG. 10 is a schematic diagram of an optional electronic device according to an embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first", "second", and the like in the specification, the claims, and the above drawings of the present application are used to distinguish similar objects, and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present application described here can be implemented in an order other than those illustrated or described here. In addition, the terms "include" and "have" and any variants of them are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units clearly listed, but may include other steps or units that are not clearly listed or that are inherent to that process, method, product, or device.
根据本申请实施例的一个方面,提供了一种对象的显示方法,该方法可以应用于终端,终端可以是智能终端例如智能手机、计算机、个人数字助理(Personal Digital Assistant,简称PDA)、平板电脑等设备。
如图1所示,该方法包括:
S102,终端在游戏界面中的对象显示区域上显示第一对象;
S104,若终端检测到对第一对象执行的第一操作,响应第一操作,在游戏界面中显示第二对象。
其中,第一对象用于响应对第一对象执行的第二操作控制游戏角色执行第一目标操作;第二对象用于响应对第二对象执行的第三操作控制游戏角色执行第二目标操作。
可选地,在本实施例中,上述对象的显示方法可以应用于如图2所示的终端202所构成的硬件环境中。如图2所示,终端202上安装了游戏客户端204,用户点击游戏客户端204进入游戏,在终端202的全屏幕上显示游戏界面206,游戏界面206上显示了游戏场景、游戏角色、可操作的第一对象(对象A和对象B)的图标等等,在游戏界面上第一对象(对象A和对象B)的图标显示的区域为对象显示区域,其他的区域为非对象显示区域。客户端204在游戏界面中的对象显示区域(区域1)上显示第一对象(对象A和对象B),在检测到对第一对象中的对象A执行的第一操作的情况下,响应第一操作在游戏界面中显示第二对象(对象C、对象D和对象E)。例如,可以在游戏界面中的非对象显示区域(区域2)显示第二对象。
可选地,在本实施例中,上述对象的显示方法可以但不限于应用于在游戏界面上显示对象的场景中。其中,上述客户端可以但不限于为各种类型的游戏应用、游戏小程序、游戏网站等等,例如,横版动作类游戏、休闲益智类游戏、动作射击类游戏、体育竞速类游戏、棋牌桌游类游戏、经营策略类游戏、角色扮演类游戏等等。具体的,可以但不限于应用于在上述横版动作类游戏的游戏界面上显示对象的场景中,还可以但不限于应用于在上述角色扮演类游戏的游戏界面上显示对象的场景中,以节约对象占用的游戏界面显示空间,提高游戏界面显示空间的利用率。上述仅是一种示例,本实施例中对此不做任何限定。
可选地,在本实施例中,默认显示在游戏界面上的对象均可以称为上述第一对象,游戏界面上用于显示第一对象的区域可以称为对象显示区域,游戏界面上除对象显示区域外的全部或者部分区域可以称为非对象显示区域。
可选地,在本实施例中,第一对象和第二对象可以但不限于是具有相同功能的对象,比如:第一对象和第二对象可以但不限于为游戏角色不同的技能。或者,第一对象和第二对象还可以但不限于为游戏角色不同的装备、道具等等。
可选地,在本实施例中,第一对象可以但不限于包括一个或者多个对象,以对象为技能图标为例,游戏界面上可以但不限于只显示一个使用频率最高的技能图标或者显示两个或者两个以上使用频率较高的图标。其余的技能图标为第二对象。
可选地,在本实施例中,第一操作可以是用于触发第二对象显示的操作,第一操作的类型可以但不限于为点击操作(比如:对触摸屏的点击操作)、触摸操作(比如:对触摸屏的触摸操作)、划动操作(比如:对触摸屏的划动操作)、按键操作(比如:对键盘或者游戏手柄上按键的操作)、摇杆操作(比如:对游戏手柄或者笔记本电脑上摇杆的操作)等等。
可选地,在本实施例中,第二操作可以是用于触发对游戏角色执行第一目标操作的控制的操作,第三操作可以是用于触发对游戏角色执行第二目标操作的控制的操作。第二操作和第三操作的类型可以但不限于为点击操作、触摸操作、划动操作、按键操作、摇杆操作等等。其中,第二操作和第三操作可以为同一种类型的操作,第一操作可以为与第二操作和第三操作所属于的操作类型不同的操作。比如:第一操作为触摸操作,第二操作和第三操作为单击操作。当终端检测到对第一对象执行触摸操作时,响应该触摸操作在游戏界面中显示第二对象,当终端检测到对第一对象执行单击操作时,控制游戏角色执行第一目标操作。当然,在一些情况下,第一操作可以为与第二操作和第三操作所属于的操作类型相同的操作。比如:第一操作、第二操作和第三操作都为触摸操作,但是第一操作与第二操作和第三操作的持续时间不同,当终端检测到对第一对象执行的触摸操作的持续时间满足第一阈值时,响应该触摸操作在游戏界面中显示第二对象,当终端检测到对第一对象执行的触摸操作满足第二阈值时,控制游戏角色执行第一目标操作。
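为便于理解上述按操作类型或持续时间区分第一操作与第二操作的方式,下面给出一段仅作示意的 TypeScript 代码草图;其中的阈值、类型名与函数名均为说明而假设,并非本申请限定的实现:

```typescript
// 示意草图:按触摸持续时间区分第一操作与第二操作(阈值为假设值)
interface TouchRecord {
  startTime: number; // 按下时刻(毫秒)
  endTime: number;   // 抬起时刻(毫秒)
}

const SHOW_HIDDEN_THRESHOLD_MS = 500; // 假设:持续达到该阈值则视为第一操作
const TAP_THRESHOLD_MS = 200;         // 假设:持续小于该阈值则视为第二操作

type OperationKind = 'first' | 'second' | 'none';

function classifyOperation(touch: TouchRecord): OperationKind {
  const duration = touch.endTime - touch.startTime;
  if (duration >= SHOW_HIDDEN_THRESHOLD_MS) {
    return 'first';  // 触发:在游戏界面中显示第二对象
  }
  if (duration <= TAP_THRESHOLD_MS) {
    return 'second'; // 触发:控制游戏角色执行第一目标操作
  }
  return 'none';
}
```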
可选地,在本实施例中,第一目标操作和第二目标操作为相同类型的不同操作。以对象为游戏界面上显示的技能图标为例,第一对象和第二对象分别为游戏角色的不同的技能,控制游戏角色执行第一目标操作或者第二目标操作可以是控制游戏角色展示不同的技能效果(包括展示技能画面、改变角色属性等等)。在一个可选的实施方式中,以触摸屏的游戏界面中显示游戏角色的技能为例,如图3所示,第一对象为技能A,第二对象为技能B和技能C,如图4所示,对象的显示流程包括以下步骤:
步骤1,终端检测到按住技能ICON的操作,即终端检测到对图3所示的技能A的触摸操作。
步骤2,终端在周边弹出其它扩展技能ICON,即终端在技能A的周边弹出技能B和技能C的ICON。
步骤3,终端检测是否在当前ICON上松开手指,即终端检测是否在技能A的ICON上结束上述触摸操作,如果是,则执行步骤10,否则执行步骤4。
步骤4,终端持续显示其它扩展技能,即终端没有检测到在技能A的ICON上结束上述触摸操作,则持续显示技能B和技能C。
步骤5,终端检测是否将手指从当前ICON上划走,即终端检测上述触摸操作是否转换为划动操作,如果是,则执行步骤6,否则返回步骤4。
步骤6,终端检测手指停留的位置是否有扩展技能ICON,即终端检测划动操作是否划动到了技能B或者技能C的显示位置,如果是,则执行步骤7,否则执行步骤8。
步骤7,选定手指位置的技能ICON,即将划动操作所划动到的位置对应的技能确定为第二对象,例如选定技能C的图标,继续执行步骤9。
步骤8,终端选定与手指距离最近的技能ICON,即终端检测划动操作所划动到的位置与技能B和技能C之间的距离,比如其与技能C之间的距离更近,则选定技能C。
步骤9,终端检测手指是否抬起,即终端检测划动操作是否在当前位置结束,如果是,则执行步骤10,否则返回步骤4。
步骤10,使用当前技能,即终端控制游戏角色释放技能C。
步骤11,终端将UI上的ICON替换为使用的技能,即终端将游戏界面中对象显示区域上显示的技能A的图标替换为技能C的图标。
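结合上述步骤1至步骤11,下面给出一段示意性的 TypeScript 草图,用于说明“按住—展开—划动—抬起”这一交互流程的一种可能组织方式;其中的类名、字段与方法均为便于说明而假设,并非本申请的实际实现:

```typescript
// 示意草图:按住技能A展开扩展技能,划动选择后抬起释放(非本申请的实际实现)
interface Point { x: number; y: number; }
interface SkillIcon { id: string; center: Point; radius: number; }

function distance(a: Point, b: Point): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

class SkillExpander {
  private expanded: SkillIcon[] = [];

  constructor(private primary: SkillIcon, private extensions: SkillIcon[]) {}

  // 步骤1、2:检测到按住技能ICON,在周边弹出其它扩展技能ICON
  onPress(): void {
    this.expanded = this.extensions;
  }

  // 步骤3至步骤11:根据抬起位置决定使用哪个技能
  onRelease(at: Point): string {
    if (this.expanded.length === 0 ||
        distance(at, this.primary.center) <= this.primary.radius) {
      this.expanded = [];
      return this.primary.id; // 在当前ICON上松开:使用技能A
    }
    let chosen = this.expanded[0];
    for (const icon of this.expanded) {
      if (distance(at, icon.center) < distance(at, chosen.center)) {
        chosen = icon; // 选定手指位置的或与手指距离最近的技能ICON
      }
    }
    this.expanded = [];
    return chosen.id; // 使用该技能,并由调用方将UI上的ICON替换为该技能
  }
}
```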
可见,通过上述步骤,终端上的游戏界面划分为对象显示区域和非对象显示区域,在对象显示区域中可以不必显示全部的对象,将第二对象隐藏起来,只在对象显示区域中显示第一对象,当终端检测到对第一对象执行的第一操作时,将隐藏的第二对象在游戏界面中进行显示,从而节省了游戏界面的用于显示对象的空间,能够降低用于控制游戏角色执行目标操作的对象对游戏画面显示的影响,使得游戏界面中能够有更加充足的空间显示游戏画面或其他图标等等信息,从而实现了节约对象占用的游戏界面显示空间,提高游戏界面显示空间的利用率的技术效果,进而解决了相关技术中对象在游戏界面中占用的显示空间较大导致游戏界面显示空间的利用率较低的技术问题。
需要说明的是,在本申请实施例中,终端可以响应于第一操作,在游戏界面中的非对象显示区域上显示第二对象。当然终端也可以在对象显示区域显示第二对象,例如将第一操作所作用的第一对象替换为该第二对象。
作为一种可选的方案,终端检测到对第一对象执行的第一操作包括:
S1,终端检测对第一对象执行的操作;
S2,若终端检测到对第一对象执行以下操作之一,确定检测到第一操作:第一触摸操作、第一点击操作、第一划动操作,其中,第一触摸操作为对第一对象执行的持续时间满足第一条件的触摸操作,第一点击操作为对第一对象执行的在第一时间段内触发了第一点击次数的点击操作,第一划动操作为对第一对象执行的以第一对象为起点向第一方向划动的划动操作。
可选地,在本实施例中,终端可以配置不同种类的操作作为用于触发显示第二对象的第一操作,比如:在屏幕上持续一段时间的第一触摸操作,在一定时间段内触发的几次点击的操作(单击、双击、三击等等),从第一对象开始的划动操作等等。
例如:终端检测到对第一对象执行的单击操作,则显示第二对象,或者,终端检测到对第一对象执行的触摸操作持续了2秒,则显示第二对象等等。
可选地,在本实施例中,对于不同类别的第一对象,用于触发显示第二对象的第一操作可以为不同的操作,比如:游戏界面上显示有多种第一对象:技能图标、设置图标、道具图标等等,终端可以将每种图标中的一个作为第一对象显示在对象显示区域上,其他的图标作为第二对象隐藏起来,终端为每种图标配置一种用于触发显示第二对象的操作,比如:终端检测到对技能图标的单击操作时,显示技能图标中的第二对象,终端检测到对设置图标的双击操作时,显示设置图标中的第二对象,终端检测到对道具图标的持续2秒时间的触摸操作时,显示道具图标中的第二对象。或者,终端也可以为不同类别的第一对象设置相同的用于触发显示第二图标的操作。
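针对不同类别的第一对象配置不同的触发操作,可以理解为维护一张“对象类别—触发操作类型”的配置表;下面的 TypeScript 草图仅用于示意,其中的类别名与操作类型均为假设:

```typescript
// 示意草图:为不同类别的显示对象配置不同的"第一操作"(用于触发显示隐藏对象)
type TriggerKind = 'singleTap' | 'doubleTap' | 'longPress2s';
type ObjectCategory = 'skill' | 'setting' | 'item';

const firstOperationConfig: Record<ObjectCategory, TriggerKind> = {
  skill: 'singleTap',   // 单击技能图标 -> 显示技能图标中的第二对象
  setting: 'doubleTap', // 双击设置图标 -> 显示设置图标中的第二对象
  item: 'longPress2s',  // 长按道具图标2秒 -> 显示道具图标中的第二对象
};

function shouldShowHidden(category: ObjectCategory, trigger: TriggerKind): boolean {
  return firstOperationConfig[category] === trigger;
}
```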
作为一种可选的方案,终端响应第一操作,在游戏界面中的非对象显示区域上显示第二对象包括:
S1,终端响应第一操作,从具有对应关系的显示对象和隐藏对象中获取第一对象所对应的隐藏对象作为第二对象,其中,显示对象包括第一对象,隐藏对象包括第二对象,显示对象为在游戏界面中的对象显示区域上显示的对象;
S2,终端在非对象显示区域上确定用于显示第一对象所对应的隐藏对象的目标区域;
S3,终端在目标区域上显示第二对象。
可选地,在本实施例中,可以将游戏界面中可显示的对象分为显示对象和隐藏对象两类,终端可以配置显示对象和隐藏对象之间的对应关系,即终端确定检测到对某个显示对象执行的第一操作时,显示哪些隐藏对象。其中,显示对象包括第一对象,第二对象为第一对象对应的隐藏对象。
可选地,在本实施例中,可以如上述配置对象之间的对应关系。终端也可以配置对象显示区域与对象之间的对应关系。显示对象的显示区域对应一个或者多个对象,终端将其中的一个对象作为显示对象显示在其对应的显示区域上,将其他对象作为隐藏对象隐藏起来,终端在检测到对显示对象执行的第一操作时,可以显示该显示区域对应的全部对象,包括第一对象在内。
在一个可选的实施方式中,如图5所示,对象1和对象2为第一对象,对象3至对象5为第二对象,终端可以配置对象1对应对象3和对象4,对象2对应对象5,在游戏界面上,显示对象1和对象2,在终端检测到对对象1执行第一操作时,显示对象3和对象4。
在另一个可选的实施方式中,或者终端可以设置对象显示区域中包括区域1和区域2,配置区域1对应对象1、对象3和对象4,区域2对应对象2和对象5,在区域1上显示对象1,在区域2上显示对象2,终端在检测到对对象1执行的第一操作时,显示对象1、对象3和对象4,终端在检测到对对象2执行的第一操作时,显示对象2和对象5。
可选地,在本实施例中,具有对应关系的显示对象和隐藏对象的存储方式可以但不限于是以表格的形式存储显示对象的标识和隐藏对象的标识,终端在检测到对第一对象执行第一操作的情况下,响应该第一操作,获取到第一对象的标识,从表格中确定第一对象的标识所对应的隐藏对象的标识,再在存储空间中获取到隐藏对象的标识所对应的显示图标,终端将获取到的显示图标作为第二对象的图标显示在游戏界面的非对象显示区域中,将确定出的隐藏对象的标识所对应的操作作为第二对象能够控制游戏角色执行的第二目标操作。
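上述“显示对象—隐藏对象”的对应关系可以以标识映射表的形式存储;下面的 TypeScript 草图示意了一种可能的查表过程,其中的数据结构、标识与函数名均为说明而假设:

```typescript
// 示意草图:根据显示对象的标识查找其对应的隐藏对象标识
const hiddenObjectTable: Map<string, string[]> = new Map([
  ['object1', ['object3', 'object4']], // 对象1 对应 对象3、对象4
  ['object2', ['object5']],            // 对象2 对应 对象5
]);

function getHiddenObjects(displayObjectId: string): string[] {
  // 检测到对显示对象执行第一操作时,取出其对应的隐藏对象作为第二对象
  return hiddenObjectTable.get(displayObjectId) ?? [];
}
```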
作为一种可选的方案,终端在非对象显示区域上确定用于显示第一对象所对应的隐藏对象的目标区域包括:
终端将非对象显示区域上与第一对象的距离落入目标阈值范围的区域确定为目标区域。
作为一种可选的方案,终端在目标区域上显示第二对象包括:
S1,终端获取第二对象的对象数量;
S2,终端根据对象数量在目标区域上为第二对象中的每个对象划分对应的区域进行显示。
可选地,在本实施例中,为了方便对第一对象和第二对象的后续操作,非对象显示区域上与第一对象的距离落入目标阈值范围的区域可以是以第一对象的显示区域的中心为圆心、半径落入目标阈值范围内的一个圆环区域,或者位于第一对象某个方向(比如上方)上的部分圆环区域。终端根据待显示的第二对象的数量对该目标区域进行划分,再将第二对象显示在划分后对应的区域上。
例如:如图6所示,游戏界面上显示了一个第一对象,显示第一对象的区域作为对象显示区域,除该区域以外的游戏界面上的区域为非对象显示区域,终端将第一对象附近的三分之一个圆环区域确定为目标区域,第二对象的对象数量为两个,分别是对象1和对象2,终端将上述三分之一个圆环区域分成两份,分别显示对象1和对象2。
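将第一对象附近的部分圆环区域按第二对象的数量均分,可以通过简单的角度划分计算出每个第二对象的显示位置;下面的 TypeScript 草图仅作示意,其中的半径与弧度范围均为假设参数:

```typescript
// 示意草图:在第一对象附近的部分圆环区域内,为 count 个第二对象计算显示位置
interface Point { x: number; y: number; }

function layoutSecondObjects(
  center: Point,      // 第一对象显示区域的中心
  radius: number,     // 假设的圆环半径
  startAngle: number, // 弧段起始角(弧度)
  endAngle: number,   // 弧段结束角(弧度)
  count: number       // 第二对象的数量
): Point[] {
  const positions: Point[] = [];
  if (count <= 0) {
    return positions;
  }
  const step = (endAngle - startAngle) / count; // 按对象数量均分弧段
  for (let i = 0; i < count; i++) {
    const angle = startAngle + step * (i + 0.5); // 取每个子区域的中心角
    positions.push({
      x: center.x + radius * Math.cos(angle),
      y: center.y + radius * Math.sin(angle),
    });
  }
  return positions;
}
```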
需要说明的是,显示第二对象的位置不一定在第一对象的附近,也可以是游戏界面上非对象显示区域中的任意位置,本实施例对此不作限定。比如:第一对象在游戏界面的右下角显示,第二对象可以在游戏界面的左上角显示,这样还可以方便对游戏进行双手操作。
作为一种可选的方案,在终端响应第一操作,在游戏界面中的非对象显示区域上显示第二对象之后,终端在检测到对第一对象执行的第二操作的情况下,响应第二操作控制游戏角色执行第一目标操作。或者,终端在检测到对第二对象执行的第三操作的情况下,响应第三操作控制游戏角色执行第二目标操作。
例如,在游戏界面上同时显示有第一对象和第二对象,用户可以对需要操作的对象进行选择,终端响应用户的操作执行相应的流程。以技能释放为例,终端在检测到对第一对象执行的第一操作的触发从而显示第二对象之后,如果终端检测到对第一对象执行的第二操作,则控制游戏角色释放第一对象对应的技能(第一目标操作),如果终端检测到对第二对象执行的第三操作,则控制游戏角色释放第二对象对应的技能(第二目标操作)。
可选地,在本实施例中,各个对象还可以具有触发后的冷却时间,处于冷却时间内的对象可以被设置为在冷却时间结束前不允许被触发,在游戏界面中显示此种状态下的对象时,可以但不限于在对象上显示一个半透明的图层,并可以在图层上以倒计时的方式显示剩余的冷却时间。在倒计时结束后,不再显示该图层,对象恢复到允许被触发的状态,直至对象被再一次触发。
可选地,在本实施例中,终端在检测到对第一对象执行的第二操作的情况下,在响应第二操作控制游戏角色执行第一目标操作的同时进入第一对象的冷却时间。
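对象触发后进入冷却时间并以倒计时图层的方式显示,可以用一个简单的冷却状态结构来示意;下面的 TypeScript 草图中的字段名与时间单位均为假设:

```typescript
// 示意草图:对象的冷却状态与剩余冷却时间的计算
interface CooldownState {
  triggeredAt: number; // 触发时刻(毫秒)
  cooldownMs: number;  // 冷却时长(毫秒)
}

function remainingCooldown(state: CooldownState, now: number): number {
  const elapsed = now - state.triggeredAt;
  return Math.max(0, state.cooldownMs - elapsed); // 为 0 时不再显示半透明图层
}

function canTrigger(state: CooldownState, now: number): boolean {
  return remainingCooldown(state, now) === 0; // 冷却结束后对象恢复为允许被触发
}
```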
作为一种可选的方案,终端在检测到对第二对象执行的第三操作的情况下,响应第三操作控制游戏角色执行第二目标操作包括:
S1,终端检测对第二对象执行的操作;
S2,若终端检测到对第二对象执行以下操作之一,确定检测到第三操作:第二触摸操作、第二点击操作、第二划动操作,其中,第二触摸操作为对第二对象执行的持续时间满足第二条件的触摸操作,第二点击操作为对第二对象执行的在第二时间段内触发了第二点击次数的点击操作,第二划动操作为对第二对象执行的以第一对象为起点第二对象为终点的划动操作、或者、以第二对象为起点向第二方向划动的划动操作;
S3,终端响应第三操作控制游戏角色执行第二目标操作。
可选地,在本实施例中,用于触发第二对象功能的第三操作可以有多种形式,比如:第二触摸操作、第二点击操作、第二划动操作,其中,第二触摸操作为对第二对象执行的持续时间满足第二条件的触摸操作(例如:对第二对象执行的持续时间在2秒以上5秒以下的触摸操作,如果该触摸操作持续时间超过了5秒可以取消对第二对象的触发),第二点击操作为对第二对象执行的在第二时间段内触发了第二点击次数的点击操作(例如:单击、双击、三击等等),第二划动操作为对第二对象执行的以第一对象为起点第二对象为终点的划动操作(例如:从第一对象划动到第二对象的划动操作)、或者、以第二对象为起点向第二方向划动的划动操作(从第二对象开始向某一方向的划动操作,该方向可以表示第二目标操作的操作方向)。
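以上述“持续时间在2秒以上5秒以下触发、超过5秒取消”的例子为例,第二触摸操作是否满足第二条件可以按如下方式示意性地判定;阈值仅为该示例中的假设值:

```typescript
// 示意草图:判定对第二对象执行的第二触摸操作是否满足第二条件
const MIN_HOLD_MS = 2000; // 假设:至少持续2秒
const MAX_HOLD_MS = 5000; // 假设:超过5秒则取消触发

type HoldResult = 'trigger' | 'cancel' | 'ignore';

function evaluateHold(durationMs: number): HoldResult {
  if (durationMs > MAX_HOLD_MS) {
    return 'cancel';  // 取消对第二对象的触发
  }
  if (durationMs >= MIN_HOLD_MS) {
    return 'trigger'; // 响应第三操作,控制游戏角色执行第二目标操作
  }
  return 'ignore';
}
```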
作为一种可选的方案,第二对象包括多个对象,其中,终端在检测到对第二对象执行的第三操作的情况下,响应第三操作控制游戏角色执行第二目标操作包括:
S1,终端在检测到对第二对象执行第三操作的情况下,确定第三操作的操作位置与多个对象之间的位置关系;
S2,终端将多个对象中位置关系满足目标位置关系的对象确定为目标对象;
S3,终端响应第三操作控制游戏角色执行目标对象所对应的第二目标操作。
可选地,在本实施例中,终端可以根据第三操作的操作位置与第二对象中多个对象的显示位置之间的位置关系来确定用于响应第三操作的目标对象。
在一个可选的实施方式中,以游戏角色技能释放为例,如图7所示,在游戏界面中的对象显示区域上显示了第一对象(技能1的图标),终端检测到用户对技能1的触摸操作持续了2秒(如图7中黑色圆点所示),响应该触摸操作在技能1的周围显示第二对象(技能2和技能3,技能2即对象2,技能3即对象3),终端检测到用户手指尚未离开屏幕,用户手指由技能1图标所在的位置移动到技能2图标所在的位置后离开了屏幕(如图7中白色圆点所示),则将技能2确定为目标对象,控制游戏角色释放技能2。
作为一种可选的方案,终端将多个对象中位置关系满足目标位置关系的对象确定为目标对象包括以下之一:
终端将操作位置在非对象显示区域上所落入的目标区域所对应的对象确定为目标对象;或者,终端将多个对象中与操作位置距离最近的对象确定为目标对象。
可选地,在本实施例中,终端可以将第三操作的操作位置所落入的对象所在的区域对应的对象确定为目标对象。或者,终端还可以将距离第三操作的操作位置最近的对象确定为目标对象。
需要说明的是,终端根据位置关系确定目标对象的方式不止于此,本实施例对此不作限定。比如,终端还可以是将第三操作的操作位置与第一对象的连线所经过的对象确定为目标对象等等。
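确定目标对象时,可以先判断第三操作的操作位置是否落入某个第二对象的目标区域,否则再取与操作位置距离最近的对象;下面的 TypeScript 草图仅作示意,其中的数据结构与命中半径均为假设:

```typescript
// 示意草图:根据操作位置与多个第二对象之间的位置关系确定目标对象
interface Point { x: number; y: number; }
interface DisplayedObject { id: string; center: Point; hitRadius: number; }

function dist(a: Point, b: Point): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function pickTargetObject(touch: Point, candidates: DisplayedObject[]): DisplayedObject | null {
  if (candidates.length === 0) {
    return null;
  }
  // 方式一:操作位置落入某个对象所对应的目标区域
  for (const obj of candidates) {
    if (dist(touch, obj.center) <= obj.hitRadius) {
      return obj;
    }
  }
  // 方式二:取与操作位置距离最近的对象
  return candidates.reduce((nearest, obj) =>
    dist(touch, obj.center) < dist(touch, nearest.center) ? obj : nearest
  );
}
```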
作为一种可选的方案,在终端响应第三操作控制游戏角色执行第二目标操作之后,方法还包括:
终端将游戏界面中的对象显示区域上显示的第一对象替换为第二对象。
可选地,在本实施例中,终端可以将游戏界面中的对象显示区域显示的第一对象替换为被选中的第二对象。以方便第二对象的功能的再次触发。
可选地,在本实施例中,游戏记录了显示在对象显示区域的第一对象以及通过对第一对象执行第二操作来控制游戏角色执行的第一目标操作的对应关系,终端在将游戏界面中的对象显示区域上显示的第一对象的图标替换为第二对象的图标的同时,可以将游戏中记录的上述对应关系替换为第二对象以及通过对第二对象执行第三操作来控制游戏角色执行的第二目标操作之间的对应关系。也就是说,终端将第二对象作为新的显示对象,同时将第二对象的相关信息作为显示对象的信息。
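将对象显示区域上的第一对象替换为被选中的第二对象,并同步更新“显示对象—隐藏对象”的记录,可以示意如下;其中的数据结构与字段名均为便于说明的假设:

```typescript
// 示意草图:用被选中的第二对象替换对象显示区域中的第一对象
interface SlotState {
  displayedId: string; // 当前显示在对象显示区域上的对象标识
  hiddenIds: string[]; // 该显示区域对应的隐藏对象标识
}

function swapDisplayedObject(slot: SlotState, selectedId: string): SlotState {
  if (!slot.hiddenIds.includes(selectedId)) {
    return slot; // 选中的不是该区域对应的隐藏对象,保持原状
  }
  return {
    displayedId: selectedId, // 第二对象成为新的显示对象
    hiddenIds: slot.hiddenIds
      .filter(id => id !== selectedId)
      .concat(slot.displayedId), // 原第一对象改为该显示区域对应的隐藏对象
  };
}
```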
作为一种可选的方案,在终端将游戏界面中的对象显示区域上显示的第一对象替换为第二对象之后,方法还包括以下之一:
终端在检测到对第二对象执行第一操作的情况下,响应第一操作在游戏界面中的非对象显示区域上显示第一对象;或者,终端在检测到对第二对象执行第三操作的情况下,响应第三操作控制游戏角色执行第二目标操作。
可选地,在本实施例中,终端将第一对象替换为第二对象后,可以将第一对象配置为第二对象所对应的隐藏对象,如果终端检测到对第二对象执行用于触发显示隐藏对象的第一操作,可以在非对象显示区域显示第一对象,如果终端检测到对第一对象执行第二操作,则可以响应第二操作控制游戏角色执行第一目标操作。
可选地,在本实施例中,终端在对象显示区域中显示的第二对象可以直接响应检测到的对其执行的第三操作控制游戏角色执行第二目标操作。
可选地,在本实施例中,第一操作是用于触发游戏界面上显示隐藏对象的操作,当游戏界面上的显示对象为第二对象时,同样可以通过对第二对象执行的第一操作来触发游戏界面上显示作为隐藏对象的第一对象。
可选地,在本实施例中,对第二对象执行的用于触发游戏界面上显示隐藏对象的操作也可以与对第一对象执行的用于触发游戏界面上显示隐藏对象的第一操作不同。比如:对第一对象执行单击操作可以触发显示第二对象,对第二对象执行触摸操作可以触发显示第一对象。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本申请所必须的。
根据本申请实施例的另一个方面,还提供了一种用于实施上述对象的显示方法的对象的显示装置,如图8所示,该装置包括:
第一显示模块82,用于在游戏界面中的对象显示区域上显示第一对象;
检测模块84,用于检测对第一对象执行的第一操作,其中,第一对象用于响应对第一对象执行的第二操作控制游戏角色执行第一目标操作;
第二显示模块86,用于若所述检测模块84检测到对第一对象执行第一操作,响应第一操作在游戏界面中显示第二对象,其中,第二对象用于响应对第二对象执行的第三操作控制游戏角色执行第二目标操作。
可见,通过上述装置,游戏界面划分为对象显示区域和非对象显示区域,在对象显示区域中可以不必显示全部的对象,将第二对象隐藏起来,只在对象显示区域中显示第一对象,当检测到对第一对象执行的第一操作时,将隐藏的第二对象在游戏界面中进行显示,从而节省了游戏界面的用于显示对象的空间,能够降低用于控制游戏角色执行目标操作的对象对游戏画面显示的影响,使得游戏界面中能够有更加充足的空间显示游戏画面或其他图标等等信息,从而实现了节约对象占用的游戏界面显示空间,提高游戏界面显示空间的利用率的技术效果,进而解决了相关技术中对象在游戏界面中占用的显示空间较大导致游戏界面显示空间的利用率较低的技术问题。
作为一种可选的方案,第二显示模块用于响应所述第一操作,在所述游戏界面中的非对象显示区域上显示所述第二对象。
作为一种可选的方案,检测模块包括:
第一检测单元,用于检测对第一对象执行的操作;
第一确定单元,用于若检测到对所述第一对象执行以下操作之一,确定检测到所述第一操作:第一触摸操作、第一点击操作、第一划动操作,其中,所述第一触摸操作为对所述第一对象执行的持续时间满足第一条件的触摸操作,所述第一点击操作为对所述第一对象执行的在第一时间段内触发了第一点击次数的点击操作,所述第一划动操作为对所述第一对象执行的以所述第一对象为起点向第一方向划动的划动操作。
作为一种可选的方案,第二显示模块包括:
获取单元,用于响应第一操作从具有对应关系的显示对象和隐藏对象中获取第一对象所对应的隐藏对象作为第二对象,其中,显示对象包括第一对象,隐藏对象包括第二对象,显示对象为在游戏界面中的对象显示区域上显示的对象;
确定单元,用于在非对象显示区域上确定用于显示第一对象所对应的隐藏对象的目标区域;
显示单元,用于在目标区域上显示第二对象。
作为一种可选的方案,确定单元包括:
第一确定子单元,用于将非对象显示区域上与第一对象的距离落入目标阈值范围的区域确定为目标区域。
作为一种可选的方案,显示单元包括:
获取子单元,用于获取第二对象的对象数量;
划分子单元,用于根据对象数量在目标区域上为第二对象中的每个对象划分对应的区域进行显示。
作为一种可选的方案,装置还包括以下之一:
第一控制模块,用于在检测到对第一对象执行第二操作的情况下,响应第二操作控制游戏角色执行第一目标操作;
第二控制模块,用于在检测到对第二对象执行第三操作的情况下,响应第三操作控制游戏角色执行第二目标操作。
作为一种可选的方案,第二控制模块包括:
第二检测单元,用于检测对第二对象执行的操作;
第二确定单元,用于若检测到对第二对象执行以下操作之一,确定检测到第三操作:第二触摸操作、第二点击操作、第二划动操作,其中,第二触摸操作为对第二对象执行的持续时间满足第二条件的触摸操作,第二点击操作为对第二对象执行的在第二时间段内触发了第二点击次数的点击操作,第二划动操作为对第二对象执行的以第一对象为起点第二对象为终点的划动操作、或者、以第二对象为起点向第二方向划动的划动操作;
第一控制单元,用于响应第三操作控制游戏角色执行第二目标操作。
作为一种可选的方案,第二对象包括多个对象,其中,第二控制模块包括:
第三确定单元,用于在检测到对第二对象执行第三操作的情况下,确定第三操作的操作位置与多个对象之间的位置关系;
第四确定单元,用于将多个对象中位置关系满足目标位置关系的对象确定为目标对象;
第二控制单元,用于响应第三操作控制游戏角色执行目标对象所对应的第二目标操作。
作为一种可选的方案,第四确定单元包括以下之一:
第二确定子单元,用于将操作位置在非对象显示区域上所落入的目标区域所对应的对象确定为目标对象;
第三确定子单元,用于将多个对象中与操作位置距离最近的对象确定为目标对象。
作为一种可选的方案,装置还包括:
替换模块,用于将游戏界面中的对象显示区域上显示的第一对象替换为第二对象。
作为一种可选的方案,装置还包括以下之一:
第三显示模块,用于在检测到对第二对象执行第一操作的情况下,响应第一操作在游戏界面中的非对象显示区域上显示第一对象;
第三控制模块,用于在检测到对第二对象执行第三操作的情况下,响应第三操作控制游戏角色执行第二目标操作。
本申请实施例的应用环境可以但不限于参照上述实施例中的应用环境,本实施例中对此不再赘述。本申请实施例提供了用于实施上述对象的显示方法的一种可选的具体应用示例。
作为一种可选的实施例,上述对象的显示方法可以但不限于应用于如图9所示的在游戏界面中显示对象的场景中。
手机游戏的最大难点在于,受到屏幕大小以及触摸区域大小的限制,无法在游戏单次战斗中使用过多的UI来满足设计中多技能点选的诉求。市面上现有的游戏当中,绝大部分技能ICON与技能为一一对应关系,即一个技能的UI位置上的ICON,对应一个固定的技能。那么单次游戏能够带入到战斗中的技能数量就会受到屏幕大小和触发范围这两方面体验的限制。在不改变UI和ICON大小的情况下加入更多的UI,游戏实际的可视范围就更小;如果缩小UI和ICON来满足加入更多UI的需求,那么技能ICON的触发范围也会被一并缩小,极大地影响玩家的操作体验。
在本场景中,提供了一个使用单技能UI,并根据玩家操作进行多技能的智能扩展的方式,来满足在手机游戏UI数量的限制下,玩家对于战斗中能够使用技能的需求。
当玩家使用手指按住游戏中固定技能UI上的ICON时,终端会根据系统设定或者玩家自己的设定,向四周弹出不同的多个技能ICON;此时玩家可以向所需要的技能ICON方向划动手指,在欲使用的技能ICON上抬起手指,此时终端控制游戏中的玩家角色释放当前ICON所对应的技能。在玩家抬起手指的同时,终端向四周弹出的多个技能ICON消失,同时固定技能UI上的ICON将被替换为所释放的技能ICON,以满足玩家再次快速使用的需求。
在本场景中,当手指点击位置在技能按钮上时,终端展开当前技能按钮折叠的所有技能按钮;终端根据手指滑动所在位置,激活子层技能;终端根据手指弹起位置所在技能激活区域判断释放哪个技能。
玩家触摸屏幕,终端检测点击坐标范围,当点击坐标范围在技能按钮A侦测范围内时:
1、终端围绕技能按钮A,在半径R范围内将技能按钮外层区域划分若干子区域,展示子技能面板S及其子技能图标,并绑定其子技能;
2、终端循环检测玩家手指所在坐标位置;
3、当玩家手指在屏幕滑动,滑动位置坐标超出技能A按钮范围,进入半径R范围:
a)玩家手指滑动,终端根据手指所在坐标位置,高亮显示1中划分子区域;
b)玩家手指离开屏幕,终端释放当前激活子层技能B,并将父层技能按钮A替换为B,释放B技能,同时技能开始冷却倒计时;
c)终端隐藏子层技能面板S;
4、玩家手指离开屏幕,终端判断离开时手指所在屏幕坐标:
a)在技能A图标范围内,终端直接释放技能A,同时技能开始冷却倒计时;
b)在R半径外,终端取消技能释放;
c)终端隐藏子层技能面板S。
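上述流程中根据手指抬起时的坐标决定直接释放技能A、释放激活的子技能还是取消释放,可以用一段示意性的判定逻辑表示;下面的 TypeScript 草图中,半径R、按钮侦测范围等均为假设参数,并非本申请的实际实现:

```typescript
// 示意草图:根据手指抬起位置判定释放技能A、释放子技能或取消释放
interface Point { x: number; y: number; }

type ReleaseDecision =
  | { kind: 'castA' }
  | { kind: 'castSub'; subSkillId: string }
  | { kind: 'cancel' };

function decideOnRelease(
  lift: Point,                    // 手指抬起时的屏幕坐标
  buttonACenter: Point,           // 技能按钮A的中心
  buttonARadius: number,          // 技能按钮A的侦测半径
  expandRadius: number,           // 子技能面板S所在的半径R
  activeSubSkillId: string | null // 当前被高亮激活的子层技能
): ReleaseDecision {
  const d = Math.hypot(lift.x - buttonACenter.x, lift.y - buttonACenter.y);
  if (d <= buttonARadius) {
    return { kind: 'castA' };     // 在技能A图标范围内抬起:直接释放技能A
  }
  if (d <= expandRadius && activeSubSkillId !== null) {
    return { kind: 'castSub', subSkillId: activeSubSkillId }; // 释放激活的子技能并替换父层按钮
  }
  return { kind: 'cancel' };      // 在半径R以外抬起:取消技能释放
}
```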
通过上述方式,操作方式完全符合玩家平时使用终端(例如手机)的习惯,并且能够在有限的UI位置上容纳数量不受限制的扩展技能。
根据本申请实施例的又一个方面,还提供了一种用于实施上述对象的显示方法的电子装置,如图10所示,该电子装置包括:一个或多个(图中仅示出一个)处理器1002、存储器1004、传感器1006、编码器1008以及传输装置1010,该存储器中存储有计算机程序,该处理器被设置为通过计算机程序执行上述任一项方法实施例中的步骤。
可选地,在本实施例中,上述电子装置可以位于计算机网络的多个网络设备中的至少一个网络设备。
可选地,在本实施例中,上述处理器可以被设置为通过计算机程序执行以下步骤:
S1,在游戏界面中的对象显示区域上显示第一对象;
S2,若检测到对第一对象执行的第一操作,响应所述第一操作,在所述游戏界面中显示第二对象。
其中,第一对象用于响应对第一对象执行的第二操作控制游戏角色执行第一目标操作;第二对象用于响应对第二对象执行的第三操作控制游戏角色执行第二目标操作。
可选地,本领域普通技术人员可以理解,图10所示的结构仅为示意,电子装置也可以是智能手机(如Android手机、iOS手机等)、平板电脑、掌上电脑以及移动互联网设备(Mobile Internet Devices,MID)、PAD等终端设备。图10并不对上述电子装置的结构造成限定。例如,电子装置还可包括比图10中所示更多或者更少的组件(如网络接口、显示装置等),或者具有与图10所示不同的配置。
其中,存储器1002可用于存储软件程序以及模块,如本申请实施例中的对象的显示方法和装置对应的程序指令/模块,处理器1004通过运行存储在存储器1002内的软件程序以及模块,从而执行各种功能应用以及数据处理,即实现上述的对象的显示方法。存储器1002可包括高速随机存储器,还可以包括非易失性存储器,如一个或者多个磁性存储装置、闪存、或者其他非易失性固态存储器。在一些实例中,存储器1002可进一步包括相对于处理器1004远程设置的存储器,这些远程存储器可以通过网络连接至终端。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
上述的传输装置1010用于经由一个网络接收或者发送数据。上述的网络具体实例可包括有线网络及无线网络。在一个实例中,传输装置1010包括一个网络适配器(Network Interface Controller,NIC),其可通过网线与其他网络设备和路由器相连,从而可与互联网或局域网进行通讯。在一个实例中,传输装置1010为射频(Radio Frequency,RF)模块,其用于通过无线方式与互联网进行通讯。
其中,具体地,存储器1002用于存储应用程序。
本申请的实施例还提供了一种存储介质,该存储介质中存储有计算机程序,其中,该计算机程序被设置为运行时执行上述任一项方法实施例中的步骤。
可选地,存储介质还被设置为存储用于执行上述实施例中的方法中所包括的步骤的计算机程序,本实施例中对此不再赘述。
可选地,在本实施例中,本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令终端设备相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,存储介质可以包括:闪存盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁盘或光盘等。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
上述实施例中的集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在上述计算机可读取的存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在存储介质中,包括若干指令用以使得一台或多台计算机设备(可为个人计算机、服务器或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。
在本申请的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的几个实施例中,应该理解到,所揭露的客户端,可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
以上所述仅是本申请的优选实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本申请原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本申请的保护范围。

Claims (16)

  1. 一种对象的显示方法,所述方法应用于终端,包括:
    在游戏界面中的对象显示区域上显示第一对象;
    若检测到对所述第一对象执行的第一操作,响应所述第一操作,在所述游戏界面中显示第二对象,其中,所述第一对象用于响应对所述第一对象执行的第二操作控制游戏角色执行第一目标操作;所述第二对象用于响应对所述第二对象执行的第三操作控制所述游戏角色执行第二目标操作。
  2. 根据权利要求1所述的方法,所述响应所述第一操作,在所述游戏界面中显示第二对象,包括:
    响应所述第一操作,在所述游戏界面中的非对象显示区域上显示所述第二对象。
  3. 根据权利要求1所述的方法,检测到对所述第一对象执行的第一操作包括:
    检测对所述第一对象执行的操作;
    若检测到对所述第一对象执行以下操作之一,确定检测到所述第一操作:第一触摸操作、第一点击操作、第一划动操作,其中,所述第一触摸操作为对所述第一对象执行的持续时间满足第一条件的触摸操作,所述第一点击操作为对所述第一对象执行的在第一时间段内触发了第一点击次数的点击操作,所述第一划动操作为对所述第一对象执行的以所述第一对象为起点向第一方向划动的划动操作。
  4. 根据权利要求2所述的方法,所述响应所述第一操作,在所述游戏界面中的非对象显示区域上显示所述第二对象包括:
    响应所述第一操作,从具有对应关系的显示对象和隐藏对象中获取所述第一对象所对应的隐藏对象作为所述第二对象,其中,所述显示对象包括所述第一对象,所述隐藏对象包括所述第二对象,所述显示对象为在所述游戏界面中的所述对象显示区域上显示的对象;
    在所述非对象显示区域上确定用于显示所述第一对象所对应的隐藏对象的目标区域;
    在所述目标区域上显示所述第二对象。
  5. 根据权利要求4所述的方法,在所述非对象显示区域上确定用于显示所述第一对象所对应的隐藏对象的目标区域包括:
    将所述非对象显示区域上与所述第一对象的距离落入目标阈值范围的区域确定为所述目标区域。
  6. 根据权利要求4所述的方法,所述在所述目标区域上显示所述第二对象包括:
    获取所述第二对象的对象数量;
    根据所述对象数量,在所述目标区域上为所述第二对象中的每个对象划分对应的区域进行显示。
  7. 根据权利要求2所述的方法,在响应所述第一操作,在所述游戏界面中的非对象显示区域上显示第二对象之后,所述方法还包括以下之一:
    在检测到对所述第一对象执行所述第二操作的情况下,响应所述第二操作控制所述游戏角色执行所述第一目标操作;
    在检测到对所述第二对象执行所述第三操作的情况下,响应所述第三操作控制所述游戏角色执行所述第二目标操作。
  8. 根据权利要求7所述的方法,在检测到对所述第二对象执行所述第三操作的情况下,响应所述第三操作控制所述游戏角色执行所述第二目标操作包括:
    检测对所述第二对象执行的操作;
    若检测到对所述第二对象执行以下操作之一,确定检测到所述第三操作:第二触摸操作、第二点击操作、第二划动操作,其中,所述第二触摸操作为对所述第二对象执行的持续时间满足第二条件的触摸操作,所述第二点击操作为对所述第二对象执行的在第二时间段内触发了第二点击次数的点击操作,所述第二划动操作为对所述第二对象执行的以所述第一对象为起点、所述第二对象为终点的划动操作、或者、以所述第二对象为起点向第二方向划动的划动操作;
    响应所述第三操作控制所述游戏角色执行所述第二目标操作。
  9. 根据权利要求7所述的方法,所述第二对象包括多个对象,其中,在检测到对所述第二对象执行所述第三操作的情况下,响应所述第三操作控制所述游戏角色执行所述第二目标操作包括:
    在检测到对所述第二对象执行所述第三操作的情况下,确定所述第三操作的操作位置与所述多个对象之间的位置关系;
    将所述多个对象中所述位置关系满足目标位置关系的对象确定为目标对象;
    响应所述第三操作控制所述游戏角色执行所述目标对象所对应的所述第二目标操作。
  10. 根据权利要求9所述的方法,将所述多个对象中所述位置关系满足目标位置关系的对象确定为所述目标对象包括以下之一:
    将所述操作位置在所述非对象显示区域上所落入的目标区域所对应的对象确定为所述目标对象;
    将所述多个对象中与所述操作位置距离最近的对象确定为所述目标对象。
  11. 根据权利要求7所述的方法,在响应所述第三操作控制所述游戏角色执行所述第二目标操作之后,所述方法还包括:
    将游戏界面中的对象显示区域上显示的所述第一对象替换为所述第二对象。
  12. 根据权利要求11所述的方法,在将游戏界面中的对象显示区域上显示的所述第一对象替换为所述第二对象之后,所述方法还包括以下之一:
    在检测到对所述第二对象执行所述第一操作的情况下,响应所述第一操作在所述游戏界面中的所述非对象显示区域上显示所述第一对象;
    在检测到对所述第二对象执行所述第三操作的情况下,响应所述第三操作控制所述游戏角色执行所述第二目标操作。
  13. 一种对象的操作控制装置,包括:
    第一显示模块,用于在游戏界面中的对象显示区域上显示第一对象;
    检测模块,用于检测对所述第一对象执行的第一操作,其中,所述第一对象用于响应对所述第一对象执行的第二操作控制游戏角色执行第一目标操作;
    第二显示模块,用于若所述检测模块检测到对所述对象执行的第一操作,响应所述第一操作在所述游戏界面中显示第二对象,其中,所述第二对象用于响应对所述第二对象执行的第三操作控制所述游戏角色执行第二目标操作。
  14. 一种存储介质,所述存储介质中存储有计算机程序,其中,所述计算机程序被设置为运行时执行所述权利要求1至12中任一项所述的方法。
  15. 一种电子装置,包括存储器和处理器,所述存储器中存储有计算机程序,所述处理器被设置为通过所述计算机程序执行所述权利要求1至12中任一项所述的方法。
  16. 一种计算机程序产品,包括指令,当其在计算机上运行时,使得计算机执行如权利要求1-12中任一项所述的方法。
PCT/CN2019/111635 2018-11-15 2019-10-17 对象的显示方法、装置、存储介质及电子装置 WO2020098444A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/078,059 US11400375B2 (en) 2018-11-15 2020-10-22 Object display method and apparatus, storage medium, and electronic device
US17/848,293 US20220314116A1 (en) 2018-11-15 2022-06-23 Object display method and apparatus, storage medium, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811361960.5A CN109513208B (zh) 2018-11-15 2018-11-15 对象的显示方法、装置、存储介质及电子装置
CN201811361960.5 2018-11-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/078,059 Continuation US11400375B2 (en) 2018-11-15 2020-10-22 Object display method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2020098444A1 true WO2020098444A1 (zh) 2020-05-22

Family

ID=65777861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/111635 WO2020098444A1 (zh) 2018-11-15 2019-10-17 对象的显示方法、装置、存储介质及电子装置

Country Status (3)

Country Link
US (2) US11400375B2 (zh)
CN (1) CN109513208B (zh)
WO (1) WO2020098444A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109513208B (zh) 2018-11-15 2021-04-09 深圳市腾讯信息技术有限公司 对象的显示方法、装置、存储介质及电子装置
CN110215684B (zh) * 2019-04-26 2020-10-23 网易(杭州)网络有限公司 游戏对象控制方法及装置
JP2021029291A (ja) * 2019-08-16 2021-03-01 株式会社コロプラ ゲームプログラム、ゲーム方法、および情報処理装置
US11071906B2 (en) * 2019-10-08 2021-07-27 Zynga Inc. Touchscreen game user interface
CN111467798B (zh) * 2020-04-01 2021-09-21 腾讯科技(深圳)有限公司 游戏应用程序中的帧显示方法、装置、终端和存储介质
CN111760278B (zh) * 2020-07-10 2022-06-10 腾讯科技(深圳)有限公司 技能控件的显示方法、装置、设备及介质
CN112007362B (zh) * 2020-08-28 2022-06-24 腾讯科技(深圳)有限公司 虚拟世界中的显示控制方法、装置、存储介质及设备
CN112221123B (zh) * 2020-10-01 2022-08-09 腾讯科技(深圳)有限公司 一种虚拟对象切换方法、装置、计算机设备和存储介质
CN112169315A (zh) * 2020-10-26 2021-01-05 网易(杭州)网络有限公司 游戏中技能的控制方法、装置以及触控终端
CN112691366B (zh) * 2021-01-13 2023-09-15 腾讯科技(深圳)有限公司 虚拟道具的显示方法、装置、设备及介质
CN113769399A (zh) * 2021-09-14 2021-12-10 网易(杭州)网络有限公司 游戏界面中功能控件收纳的方法、装置、设备及存储介质
CN113893527B (zh) * 2021-11-01 2023-07-14 北京字跳网络技术有限公司 一种交互控制方法、装置、电子设备及存储介质
CN114344900A (zh) * 2021-12-31 2022-04-15 北京字跳网络技术有限公司 角色控制方法、终端设备及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870131A (zh) * 2012-12-14 2014-06-18 联想(北京)有限公司 一种控制电子设备的方法及电子设备
US20160103606A1 (en) * 2014-10-09 2016-04-14 Wistron Corporation Method, electronic device, and computer program product for displaying virtual button
US20170038946A1 (en) * 2015-08-03 2017-02-09 Lenovo (Beijing) Co., Ltd. Display Control Method and Device, and Electronic Apparatus
CN107930122A (zh) * 2017-12-14 2018-04-20 网易(杭州)网络有限公司 信息处理方法、装置及存储介质
CN109513208A (zh) * 2018-11-15 2019-03-26 深圳市腾讯信息技术有限公司 对象的显示方法、装置、存储介质及电子装置

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030011638A1 (en) * 2001-07-10 2003-01-16 Sun-Woo Chung Pop-up menu system
JP4169502B2 (ja) * 2001-10-11 2008-10-22 株式会社セガ 画像表示方法、画像生成装置及びプログラム
CN102043581A (zh) * 2010-12-02 2011-05-04 广东宝莱特医用科技股份有限公司 医疗设备触摸屏界面上控件按钮的处理方法
CN103593009A (zh) * 2011-02-10 2014-02-19 三星电子株式会社 包含触摸屏显示器的便携式设备以及控制它的方法
KR102061881B1 (ko) * 2012-10-10 2020-01-06 삼성전자주식회사 멀티 디스플레이 장치 및 그 디스플레이 제어 방법
KR20150126193A (ko) * 2014-05-02 2015-11-11 삼성전자주식회사 복수의 디스플레이를 이용한 컨텐츠 출력 방법 및 그 장치
CN106730810B (zh) * 2015-11-19 2020-02-18 网易(杭州)网络有限公司 一种移动智能终端的游戏按钮切换方法及装置
CN107132971B (zh) * 2016-02-29 2020-07-24 福建兑信科技有限公司 一种移动终端操作界面的控制方法、系统及移动终端
KR102452973B1 (ko) * 2016-04-12 2022-10-11 삼성전자주식회사 영상 처리 방법 및 이를 지원하는 전자 장치
CN106293461B (zh) * 2016-08-04 2018-02-27 腾讯科技(深圳)有限公司 一种交互式应用中的按键处理方法和终端以及服务器
CN107132979A (zh) * 2017-03-14 2017-09-05 网易(杭州)网络有限公司 在移动设备游戏中精确选择目标的交互方法、装置及计算机可读存储介质
CN108459811B (zh) * 2018-01-09 2021-03-16 网易(杭州)网络有限公司 虚拟道具的处理方法、装置、电子设备及存储介质
CN108579086B (zh) * 2018-03-27 2019-11-08 腾讯科技(深圳)有限公司 对象的处理方法、装置、存储介质和电子装置
CN109364478B (zh) * 2018-09-07 2022-08-23 深圳市腾讯信息技术有限公司 信息同步方法、装置及存储介质
CN110548288B (zh) * 2019-09-05 2020-11-10 腾讯科技(深圳)有限公司 虚拟对象的受击提示方法、装置、终端及存储介质
CN111589128B (zh) * 2020-04-23 2022-02-18 腾讯科技(深圳)有限公司 基于虚拟场景的操作控件显示方法及装置
CN111589133B (zh) * 2020-04-28 2022-02-22 腾讯科技(深圳)有限公司 虚拟对象控制方法、装置、设备及存储介质
CN113633964B (zh) * 2021-08-16 2024-04-02 腾讯科技(深圳)有限公司 虚拟技能的控制方法、装置、设备及计算机可读存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870131A (zh) * 2012-12-14 2014-06-18 联想(北京)有限公司 一种控制电子设备的方法及电子设备
US20160103606A1 (en) * 2014-10-09 2016-04-14 Wistron Corporation Method, electronic device, and computer program product for displaying virtual button
US20170038946A1 (en) * 2015-08-03 2017-02-09 Lenovo (Beijing) Co., Ltd. Display Control Method and Device, and Electronic Apparatus
CN107930122A (zh) * 2017-12-14 2018-04-20 网易(杭州)网络有限公司 信息处理方法、装置及存储介质
CN109513208A (zh) * 2018-11-15 2019-03-26 深圳市腾讯信息技术有限公司 对象的显示方法、装置、存储介质及电子装置

Also Published As

Publication number Publication date
CN109513208A (zh) 2019-03-26
US11400375B2 (en) 2022-08-02
US20210038987A1 (en) 2021-02-11
CN109513208B (zh) 2021-04-09
US20220314116A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
WO2020098444A1 (zh) 对象的显示方法、装置、存储介质及电子装置
US10507383B2 (en) Information processing method and apparatus, electronic device, and storage medium
US10500483B2 (en) Information processing method and apparatus, storage medium, and electronic device
US10702775B2 (en) Virtual character control method, apparatus, storage medium and electronic device
JP6529659B2 (ja) 情報処理方法、端末及びコンピュータ記憶媒体
EP3285156B1 (en) Information processing method and terminal, and computer storage medium
US10398977B2 (en) Information processing method, terminal, and computer storage medium
CN107930122B (zh) 信息处理方法、装置及存储介质
US20230037089A1 (en) Operation control method and apparatus, storage medium, and electronic device
CN107551537B (zh) 一种游戏中虚拟角色的控制方法及装置、存储介质、电子设备
WO2022121528A1 (zh) 互动信息处理方法、装置、终端、存储介质及程序产品
US20220155922A1 (en) Side-toolbar-display method, terminal, and storage medium
CN108366169B (zh) 一种通知消息的处理方法及移动终端
CN111773711A (zh) 游戏视角的控制方法、装置、存储介质和电子装置
WO2022001471A1 (zh) 游戏切换方法、装置、电子装置及存储介质
US20140223328A1 (en) Apparatus and method for automatically controlling display screen density
JP2017080313A (ja) プログラム及び情報処理装置
CN111840990B (zh) 输入控制方法、装置及电子设备
WO2023197788A1 (zh) 一种交互方法、装置、计算机设备及可读存储介质
WO2023226422A1 (zh) 内容编辑的控制方法、装置、电子设备及存储介质
CN113926186A (zh) 游戏中虚拟对象的选择方法、装置以及触控终端
CN114442820A (zh) 一种基于激光交互的控制方法和计算机设备
CN115658172A (zh) 应用账号切换方法、装置、电子设备及介质
CN113680051A (zh) 游戏的控制方法、装置、设备及存储介质
CN115779429A (zh) 游戏中虚拟角色的控制方法、装置以及电子终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19885838

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19885838

Country of ref document: EP

Kind code of ref document: A1