US11003261B2 - Information processing method, terminal, and computer storage medium - Google Patents

Information processing method, terminal, and computer storage medium

Info

Publication number
US11003261B2
US11003261B2 (application US16/560,772)
Authority
US
United States
Prior art keywords
skill
release
character object
range
virtual joystick
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/560,772
Other versions
US20190391676A1 (en)
Inventor
Haosu WANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to US16/560,772
Publication of US20190391676A1
Application granted
Publication of US11003261B2
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/25 Output arrangements for video game devices
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices by mapping the input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on specific properties of the displayed interaction object, using icons
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/0489 Interaction techniques using dedicated keyboard keys or combinations thereof
    • G06F3/04895 Guidance during keyboard input operation, e.g. prompting
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present disclosure relates to information exchange technologies and, more particularly, to an information processing method, terminal, and computer storage medium.
  • processors of intelligent terminals have increasingly high processing capability, so that many applications have been developed to realize operation and control on large screens or super screens based on man-machine interaction.
  • multiple users may run different interaction modes by creating groups in one-to-one, one-to-many, and many-to-many formats, so as to obtain different interaction results.
  • information exchange may be performed between the different groups, and different interaction results are obtained according to responses to the information exchange.
  • information exchange may also be performed among group members in a same group, and different interaction results are obtained according to responses to the information exchange.
  • release of a specific skill may be triggered to enrich a presentation form and content of information, and different presentation forms and content of information may finally lead to different interaction results.
  • however, existing ways of releasing a specific skill cannot accurately and rapidly locate the target object at which the specific skill is directed, easily causing mis-operation and slowing interaction processing due to inaccurate locating.
  • embodiments of the present invention provide an information processing method, terminal, and computer storage medium, so as to resolve at least one problem in the existing technology, such that a target object of a skill-release operation can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed through the improved locating accuracy.
  • An embodiment of the present invention provides an information processing method implemented by a computer system.
  • the method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation.
  • An embodiment of the present invention further provides a terminal, the terminal including: a display, a memory storing instructions, and a processor coupled to the memory.
  • When executing the instructions, the processor is configured for: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation.
  • An embodiment of the present invention further provides a non-transitory computer-readable storage medium.
  • the non-transitory computer-readable storage medium contains computer-executable instructions for, when executed by a processor, performing an information processing method.
  • the method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object.
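The claimed flow (render a halo-plus-joystick control on a trigger gesture, track dragging, then commit or cancel on release) can be sketched in a few lines. Everything below, including the class and method names and the nearest-candidate rule standing in for the unspecified "first preset policy", is illustrative rather than taken from the patent:

```python
from dataclasses import dataclass


@dataclass
class Vec2:
    x: float
    y: float


class SkillRelease:
    """Hypothetical sketch of the claimed control flow; names are illustrative."""

    def __init__(self, halo_center: Vec2, halo_radius: float, threshold: float):
        self.halo_center = halo_center   # skill-release control halo object
        self.halo_radius = halo_radius   # radiation range of the halo
        self.threshold = threshold       # threshold range deciding commit vs. cancel
        # The virtual joystick starts at the halo center.
        self.joystick = Vec2(halo_center.x, halo_center.y)

    def on_drag(self, pos: Vec2) -> None:
        # Dragging the joystick correspondingly adjusts the skill-release location.
        self.joystick = pos

    def on_release(self, candidates):
        # Commit only if the joystick stayed within the threshold range.
        dx = self.joystick.x - self.halo_center.x
        dy = self.joystick.y - self.halo_center.y
        if (dx * dx + dy * dy) ** 0.5 > self.threshold:
            return None  # out of the threshold range: no target selected
        # Stand-in for the "first preset policy": nearest candidate wins.
        return min(
            candidates,
            key=lambda c: (c.x - self.joystick.x) ** 2 + (c.y - self.joystick.y) ** 2,
            default=None,
        )
```

A caller would feed this object the trigger, drag, and release events from the touch screen and apply the returned target, if any, to the actual skill-release rendering.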
  • FIG. 1 is a schematic diagram of various hardware devices for performing information exchange according to an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of an information processing method according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a user interface (UI) according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a system setting interface for generating the UI effect in FIG. 3 according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of another UI according to an embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of another information processing method according to an embodiment of the present invention.
  • FIG. 7 is a schematic flowchart of another information processing method according to an embodiment of the present invention.
  • FIG. 8 to FIG. 10 are schematic diagrams of multiple UIs according to embodiments of the present invention.
  • FIG. 11 is a schematic diagram of an information processing terminal according to an embodiment of the present invention.
  • FIG. 12 is a schematic hardware structural diagram of an information processing terminal according to an embodiment of the present invention.
  • FIG. 13 is a schematic flowchart of implementation of a specific application scenario according to an embodiment of the present invention.
  • FIG. 14 is a schematic flowchart of implementation of another specific application scenario according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of various hardware devices in an operating environment for performing information exchange according to an embodiment of the present invention.
  • the operating environment includes one or more servers (a server 11 is shown as an example), terminal devices 21 to 25, and a network 31.
  • the network 31 includes network entities such as routers and gateways (not shown).
  • the terminal devices 21 to 25 perform information exchange with the server by using a wired network or a wireless network, so as to download an application, an application update data packet, and/or application related data information or service information from the server 11 .
  • Various types of the terminal devices are shown in FIG. 1 , including a mobile phone (terminal 23 ), a tablet computer or a PDA (terminal 25 ), a desktop computer (terminal 22 ), a PC (terminal 24 ), an all-in-one PC (terminal 21 ), and other types.
  • the applications include an application having an entertainment function, such as a video application, an audio play application, a game application, or reading software, and an application having a serving function, such as a map navigation application or a group purchasing application.
  • the terminal devices 21 to 25 download a game application, a game application update data packet, and/or game application related data information or service information from the server 11 according to a requirement.
  • When a skill-release trigger gesture is detected on at least one skill object located in at least one skill operation area in the game interface, a skill-release supplementary control object is obtained through rendering on the graphical user interface.
  • The skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • When a dragging operation on the virtual joystick object is detected, a skill-release location of the skill object is controlled to be correspondingly adjusted on the graphical user interface, and it is determined whether the virtual joystick object is out of a threshold range.
  • When the virtual joystick object is not out of the threshold range, a target character object satisfying a first preset policy is selected from at least one character object within a skill releasable range. Further, a skill-release operation is performed on the selected target character object.
  • Because the target character object can be selected and located by determining whether the virtual joystick object touched by the user's finger is out of a threshold range, the user can respond rapidly in the information exchange process, avoiding wasting the user's response time on searching the graphical user interface.
  • a target object for skill-release can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed due to the improved locating accuracy.
  • FIG. 1 is only an example of a system architecture for implementing the embodiments of the present invention, and the embodiments of the present invention are not limited to the system architecture in FIG. 1. Based on this system architecture, various embodiments of the present invention are provided in the following.
  • an information processing method is provided.
  • a software application is executed on a processor of a terminal and rendering is performed on a display of the terminal to obtain a graphical user interface.
  • the processor, the graphical user interface, and the software application are implemented in, for example, a game system. As shown in FIG. 2, the method includes the following.
  • Step 101: Performing rendering on the graphical user interface to obtain at least one virtual resource object.
  • the virtual resource object includes various types of objects on the graphical user interface.
  • For example: a user avatar icon for representing a user; an object for representing a building, a tree, a tower defense, or the like in a background;
  • an object for representing a status (such as a blood value or a vitality value) of the user; an object for representing a skill, equipment, or the like of the user;
  • a direction button object for controlling a change of a location of the user; a rendering object used during skill-release by the user; and the like.
  • Step 102: When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location.
  • Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, the user can respond rapidly in an information exchange process, avoiding wasting the user's response time searching the graphical user interface.
  • the fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
  • a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location with a touch or a slide of the finger.
  • Because the skill-release supplementary control object can appear at a changeable location following a touch or a slide of the finger, this mode well meets the requirement of a user accustomed to rapidly performing skill-release with a slide of the finger, which differs from the requirement of a user accustomed to performing skill-release at a fixed location and aiming within a stable control area. This mode also helps the user respond rapidly in an information exchange process, avoiding wasting the user's response time searching the graphical user interface.
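The two placement modes described above, a default fixed location versus a location following the first touch or slide, reduce to a small anchor-selection helper. The mode names and the function signature are assumptions made for illustration, not terms from the patent:

```python
def control_anchor(mode, preset, touch=None):
    """Return where to render the skill-release supplementary control object.

    mode "fixed"  -> always the preset (default) location;
    mode "follow" -> the first touch/slide location, falling back to preset.
    The mode names and this two-way split are illustrative assumptions.
    """
    if mode == "follow" and touch is not None:
        return touch
    return preset
```

A settings screen like the one in FIG. 4 could simply persist the chosen mode string and pass it here on each trigger gesture.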
  • a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
  • the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator.
  • the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object.
  • a skill-release controller object and a joystick object obtained by performing rendering in a graphical user interface may have, but are not limited to, a wheel shape, an annular shape, a ring shape, or other shapes, as long as the objects can be configured to implement skill control.
  • FIG. 4 is a schematic diagram of a system configuration interface.
  • the UI effect interface shown in FIG. 3 may be obtained by the configuration interface shown in FIG. 4 .
  • the UI shown in FIG. 3 is subsequently obtained by performing rendering.
  • the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged, and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located.
  • the skill indicator is a rendered indicator that assists the user in aiming or serves other purposes.
  • the skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
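The coverage condition described here, that the skill releasable range specified by the indicator completely covers the skill-release control halo object, becomes a simple circle-containment test if both regions are modeled as discs. That disc model is an assumption; the patent does not fix their geometry:

```python
import math


def indicator_covers_halo(ind_center, ind_radius, halo_center, halo_radius):
    # A disc of radius ind_radius fully covers the halo disc exactly when the
    # distance between the two centers plus the halo radius does not exceed it.
    return math.dist(ind_center, halo_center) + halo_radius <= ind_radius
```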
  • Step 103: When detecting a drag operation on the virtual joystick object, controlling the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
  • Step 104: Determining whether the virtual joystick object is out of a threshold range. When it is determined that the virtual joystick object is not out of the threshold range, selecting, based on a detected release operation of the drag operation and from at least one character object within a skill releasable range of the skill object, a target character object having the highest priority and satisfying a first preset policy, and performing a skill-release operation on the target character object.
  • FIG. 5 shows an example of the threshold range.
  • the virtual joystick object 42 partially overlaps a threshold range 44 , and is not out of the threshold range.
  • a user finger release gesture is obtained to perform a subsequent action, and a character object satisfying the first preset policy is selected, from the at least one character object within the skill releasable range, as the target character object having the highest priority.
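Selecting the highest-priority target under the first preset policy might look like the following sketch. The passage above leaves the policy unspecified, so this example assumes lowest remaining HP wins, with squared distance to the joystick as a tiebreaker; the candidate dict schema is likewise hypothetical:

```python
def select_target(candidates, joystick_pos):
    """Pick the highest-priority character object within the releasable range.

    The "first preset policy" is not pinned down in the text, so this sketch
    assumes lower remaining HP wins, with squared distance to the joystick as
    a tiebreaker. Each candidate is a dict with "hp" and "pos" keys.
    """
    def priority(c):
        dx = c["pos"][0] - joystick_pos[0]
        dy = c["pos"][1] - joystick_pos[1]
        return (c["hp"], dx * dx + dy * dy)

    return min(candidates, key=priority, default=None)
```

Any other ordering (nearest first, most dangerous first, and so on) drops in by changing the `priority` key function alone.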
  • performing a skill-release operation on the target character object may include: based on a release location and/or direction of the skill object obtained by the movement of the virtual joystick object along the dragging of the skill-release operation gesture, performing the skill-release operation on the target character object within the skill releasable range.
  • a target character object is selected and located by determining whether a virtual joystick object touched by a user finger is out of a threshold range.
  • a finger release gesture of the user is obtained when it is determined that the virtual joystick object touched by the user finger is not out of the threshold range; and a character object satisfying a first preset policy is selected from at least one character object within a skill releasable range as the target character object having the highest priority.
  • a skill-release operation is then performed on the target character object within the skill releasable range according to a release location and/or direction of the skill object obtained through movement of the virtual joystick object along with the skill-release operation gesture.
  • the target object for the skill-release can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed due to the improved locating accuracy.
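The threshold check and highest-priority target selection described in the steps above can be sketched in Python. This is an illustrative sketch only, not the patented implementation: the character dictionaries, coordinate tuples, and the use of a simple numeric `priority` field as the "first preset policy" are all assumptions introduced here.

```python
import math

def joystick_within_threshold(joystick_pos, wheel_center, threshold_radius):
    """True if the virtual joystick has not been dragged out of the threshold range."""
    dx = joystick_pos[0] - wheel_center[0]
    dy = joystick_pos[1] - wheel_center[1]
    return math.hypot(dx, dy) <= threshold_radius

def select_target(characters, player_pos, releasable_radius):
    """From the character objects inside the skill releasable range, pick the one
    with the highest priority (here, the largest `priority` value stands in for
    the 'first preset policy'). Returns None when no target is selectable."""
    in_range = [c for c in characters
                if math.hypot(c["x"] - player_pos[0],
                              c["y"] - player_pos[1]) <= releasable_radius]
    if not in_range:
        return None
    return max(in_range, key=lambda c: c["priority"])
```

On release of the drag operation inside the threshold range, `select_target` would be consulted; a `None` result corresponds to the no-target branch described later in this specification.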
  • another information processing method is provided.
  • the method is applied to an electronic device, the electronic device includes a display unit, and the display unit includes a display area.
  • the method includes the followings.
  • Step 201 Performing rendering in a graphical user interface to obtain at least one virtual resource object.
  • the virtual resource object includes various types of objects on the graphical user interface.
  • a user avatar icon for representing a user; an object for representing a building, a tree, a tower defense, or the like in a background;
  • an object for representing a status (such as a blood value or a vitality value) of the user; an object for representing a skill, equipment, or the like of the user;
  • a direction button object for controlling a change of a location of the user; a rendering object used during skill-release by the user; and the like.
  • Step 202 When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location.
  • Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, the user can respond rapidly in an information exchange process, thereby avoiding wasting the user's response time in searching the graphical user interface.
  • the fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
  • a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location with a touch or a slide of the finger.
  • Because the skill-release supplementary control object can appear at a changeable location following a touch or a slide of the finger, it can well meet the requirement of a user who habitually performs rapid skill-release with a slide of the finger, which differs from the requirement of a user who habitually performs skill-release at a fixed location and aims within a stable control area. It also facilitates a rapid response by the user in an information exchange process, thereby avoiding wasting the user's response time in searching the graphical user interface.
  • a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
  • the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator.
  • the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object.
  • a skill-release controller object and a joystick object obtained by performing rendering in a graphical user interface may have, but are not limited to, a wheel shape, an annular shape, a ring shape, or another shape, as long as the objects can be configured to implement skill control.
  • FIG. 4 is a schematic diagram of a system configuration interface.
  • the UI effect interface shown in FIG. 3 may be obtained by the configuration interface shown in FIG. 4 .
  • the UI shown in FIG. 3 is subsequently obtained by performing rendering.
  • the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged, and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located.
  • the skill indicator is an indicator rendered to assist the user in aiming or to achieve other purposes.
  • the skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
  • Step 203 When detecting a drag operation on the virtual joystick object, controlling the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
  • Step 204 When detecting that the virtual joystick object moves along with a skill-release operation gesture, determining whether the virtual joystick object is out of a threshold range. When it is determined that the virtual joystick object is not out of the threshold range, based on a detected release operation of the drag operation and from at least one character object within a skill releasable range of the skill object, selecting a target character object having the highest priority and satisfying a first preset policy. When no such target character object can be selected within the skill releasable range, discarding the skill-release operation on the target character object within the skill releasable range, and displaying a first prompt message.
  • the first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
  • FIG. 5 shows an example of the threshold range.
  • the virtual joystick object 42 partially overlaps a threshold range 44 , and is not out of the threshold range.
  • a user finger release gesture is obtained to perform a subsequent action, and a character object satisfying the first preset policy is selected, from the at least one character object within the skill releasable range, as the target character object having the highest priority.
  • a target character object is selected and located by determining whether a virtual joystick object touched by a user finger is out of a threshold range.
  • a finger release gesture of the user is obtained when it is determined that the virtual joystick object touched by the user finger is not out of the threshold range. If a character object satisfying a first preset policy cannot be selected from the at least one character object within a skill releasable range as the target character object, the skill-release operation on the target character object within the skill releasable range is discarded, and a first prompt message is displayed.
  • the first prompt message is used to indicate to the user that there is no selectable target character object within the skill releasable range.
  • a target object for skill-release can be located accurately and rapidly, and the skill-release operation is discarded if the target object is not found, which avoids mis-operation and allows a subsequent action to re-determine the target object, improving interaction processing speed due to the improved locating accuracy.
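The discard-and-prompt branch described above might be sketched as follows. The function name, the character dictionaries, and the English prompt string are illustrative assumptions, not identifiers from this specification.

```python
import math

def release_skill_or_prompt(characters, player_pos, releasable_radius):
    """Return (target, prompt): when no character object lies within the skill
    releasable range, the release is discarded and a first prompt message is
    returned instead of a target; otherwise the highest-priority target is
    returned with no prompt."""
    in_range = [c for c in characters
                if math.hypot(c["x"] - player_pos[0],
                              c["y"] - player_pos[1]) <= releasable_radius]
    if not in_range:
        return None, "No selectable target within the skill releasable range"
    target = max(in_range, key=lambda c: c["priority"])
    return target, None
```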
  • another information processing method is provided.
  • the method is applied to an electronic device, the electronic device includes a display unit, and the display unit includes a display area.
  • the method includes the following steps:
  • Step 301 Performing rendering on the graphical user interface to obtain at least one virtual resource object.
  • the virtual resource object includes various types of objects on the graphical user interface.
  • a user avatar icon for representing a user; an object for representing a building, a tree, a tower defense, or the like in a background;
  • an object for representing a status (such as a blood value or a vitality value) of the user; an object for representing a skill, equipment, or the like of the user;
  • a direction button object for controlling a change of a location of the user; a rendering object used during skill-release by the user; and the like.
  • Step 302 When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location.
  • Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, the user can respond rapidly in an information exchange process, thereby avoiding wasting the user's response time in searching the graphical user interface.
  • the fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
  • a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location with a touch or a slide of the finger.
  • Because the skill-release supplementary control object can appear at a changeable location following a touch or a slide of the finger, it can well meet the requirement of a user who habitually performs rapid skill-release with a slide of the finger, which differs from the requirement of a user who habitually performs skill-release at a fixed location and aims within a stable control area. It also facilitates a rapid response by the user in an information exchange process, thereby avoiding wasting the user's response time in searching the graphical user interface.
  • a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
  • the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator.
  • the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object.
  • a skill-release controller object and a joystick object obtained by performing rendering in a graphical user interface may have, but are not limited to, a wheel shape, an annular shape, a ring shape, or another shape, as long as the objects can be configured to implement skill control.
  • FIG. 4 is a schematic diagram of a system configuration interface.
  • the UI effect interface shown in FIG. 3 may be obtained by the configuration interface shown in FIG. 4 .
  • the UI shown in FIG. 3 is subsequently obtained by performing rendering.
  • the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged, and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located.
  • the skill indicator is an indicator rendered to assist the user in aiming or to achieve other purposes.
  • the skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
  • Step 303 When detecting a drag operation on the virtual joystick object, controlling the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
  • Step 304 Determining whether the virtual joystick object is out of a threshold range.
  • a supplementary skill indicator object is obtained by rendering on the graphical user interface, and a skill releasable range specified by the supplementary skill indicator object is used as the searching range for a first character object.
  • FIG. 8 shows an example of the threshold range.
  • the virtual joystick object 42 is located at an edge of the skill-release control halo object 41 , and is out of a threshold range 44 .
  • the supplementary skill indicator object includes a fan-shaped indicator object (other shapes, such as an annular shape or a square shape, can also be used).
  • the fan-shaped indicator object is different from the skill indicator object 43 , and has a smaller attack range, so as to specify a target character object having the highest priority within a skill releasable range specified by the fan-shaped indicator object (also referred to as a searching range or an enemy-searching range).
  • Step 305 Obtaining a user finger release gesture, selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as a target character object having the highest priority, and performing a skill-release operation on the selected target character object.
  • the skill-release operation is performed on the target character object.
  • a target character object is selected and located by determining whether a virtual joystick object touched by a user finger is out of a threshold range and, when it is determined that the virtual joystick object touched by the user finger is out of the threshold range, rendering is performed on the graphical user interface to obtain a supplementary skill indicator object.
  • the skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object.
  • a user finger release gesture is obtained, and a first character object satisfying a second preset policy as the target character object having the highest priority is selected from at least one first character object within the searching range; and the skill-release operation is performed on the target character object within the searching range.
  • the skill-release control halo object (such as a wheel) may have two threshold ranges, and the virtual joystick object (such as a joystick) may move within the two ranges, so as to implement skill-release. If the joystick moves within the innermost range, a release target is automatically searched within the skill releasable range centering around a player character object (or a release target is searched in a direction along a connection line from the center of the wheel to the joystick). When there is no target, the skill-release operation is not performed and a prompt is displayed.
  • a skill indicator having a fan shape or another shape is displayed using the connection line from the center of the wheel to the joystick as a centerline and the player character target as the center point.
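The two-range dispatch described above can be sketched as follows. The function name `dispatch_release` and the radian return value for the directional mode are illustrative assumptions, not terms from this specification.

```python
import math

def dispatch_release(joystick_pos, wheel_center, inner_radius):
    """Decide which targeting mode applies when the joystick is released.
    Inside the inner (threshold) range: automatically search for a target
    around the player character. Outside it: aim along the connection line
    from the wheel center to the joystick, returned here as an angle in
    radians."""
    dx = joystick_pos[0] - wheel_center[0]
    dy = joystick_pos[1] - wheel_center[1]
    if math.hypot(dx, dy) <= inner_radius:
        return "auto_search", None
    return "directional", math.atan2(dy, dx)
```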
  • the performing rendering on the graphical user interface to obtain a supplementary skill indicator object includes: obtaining a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, and generating, in the first relative direction, a first target selection direction line by using a connection line to a center point of the skill indicator object; and forming the supplementary skill indicator object based on positive and negative offsets of a preset angle centered on the first target selection direction line.
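A minimal geometric sketch of the fan construction just described, under stated assumptions: angles are measured in degrees, the fan is represented as a `(low, high)` angle pair around the first target selection direction line, and the helper names are hypothetical.

```python
import math

def fan_indicator(wheel_center, joystick_pos, preset_angle_deg):
    """Centerline direction from the wheel center toward the joystick,
    widened by positive and negative offsets of the preset angle."""
    dx = joystick_pos[0] - wheel_center[0]
    dy = joystick_pos[1] - wheel_center[1]
    center = math.degrees(math.atan2(dy, dx))
    return center - preset_angle_deg, center + preset_angle_deg

def in_fan(point, apex, fan, max_range):
    """True if `point` falls inside the fan-shaped searching range rooted at
    `apex` (e.g. the player character object)."""
    dx, dy = point[0] - apex[0], point[1] - apex[1]
    if math.hypot(dx, dy) > max_range:
        return False
    angle = math.degrees(math.atan2(dy, dx))
    lo, hi = fan
    # normalize the angular difference from the centerline to [-180, 180)
    diff = (angle - (lo + hi) / 2 + 180) % 360 - 180
    return abs(diff) <= (hi - lo) / 2
```

`in_fan` would then filter the first character objects down to those inside the searching range specified by the fan-shaped indicator object.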
  • a skill-release operation gesture applied on a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes the skill-release control halo object 41 and the virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers an area in which the skill-release control halo object 41 is located.
  • the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel;
  • the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick;
  • the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator.
  • Rendering is performed on the graphical user interface to obtain a supplementary skill indicator object when the virtual joystick object touched by a user finger is out of the threshold range, such as a fan-shaped indicator object 45 in FIG. 9 .
  • the fan-shaped indicator object 45 has a first target selection direction line 451 , which lies along the connection line from the virtual joystick object to the center of the skill-release control halo object.
  • centering around the first target selection direction line 451 , the fan-shaped indicator object 45 is formed based on positive and negative offsets of the preset angle 'a'.
  • a person f 1 is a member of the user's side.
  • the method further includes: each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, refreshing the first relative direction to a second relative direction, and generating, in the second relative direction, a second target selection direction line by using a connection line to the center point of the skill indicator object; and selecting a first character object with the shortest vertical distance to the second target selection direction line as the target character object having the highest priority and highlighting the target character object, so as to re-determine and refresh the target character object each time the virtual joystick object moves, and to highlight a new target character object obtained through refreshing.
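The refresh step above, in which the candidate with the shortest vertical (perpendicular) distance to the target selection direction line becomes the new highlighted target, might look like the following sketch; the names and data shapes are assumptions for illustration.

```python
import math

def refresh_target(candidates, apex, direction_deg):
    """Each time the joystick moves, recompute the target selection direction
    line (through `apex` at `direction_deg`) and pick the candidate with the
    shortest perpendicular distance to that line as the new highest-priority
    target. Returns None when there are no candidates."""
    rad = math.radians(direction_deg)
    ux, uy = math.cos(rad), math.sin(rad)   # unit vector along the line

    def perp_dist(c):
        dx, dy = c["x"] - apex[0], c["y"] - apex[1]
        # |cross product| of the offset with the unit vector gives the
        # perpendicular distance to the direction line
        return abs(dx * uy - dy * ux)

    return min(candidates, key=perp_dist) if candidates else None
```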
  • a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that a location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers an area in which the skill-release control halo object 41 is located.
  • the fan-shaped indicator object 45 has the second target selection direction line 452 obtained through refreshing. Centering around the second target selection direction line 452 , the fan-shaped indicator object 45 is formed based on positive and negative offsets of the preset angle a.
  • the person f 1 is a member of the user's side;
  • persons g 1 and g 2 are enemies (members of the opposing side).
  • different groups can be created by multiple users in one-to-one, one-to-many, and many-to-many formats to run different interaction modes. Therefore, different interaction results can be obtained.
  • the different interaction modes include a versus mode between multiple online users, and further include an offline versus mode without an Internet connection.
  • the versus mode between multiple users and the offline versus mode without an Internet connection are both applicable to the UIs shown in FIG. 3 , FIG. 5 , FIG. 8 , FIG. 9 , and FIG. 10 .
  • the UIs shown in FIG. 3 , FIG. 5 , FIG. 8 , FIG. 9 , and FIG. 10 are obtained through rendering by a processor of a terminal device, which is specifically an image processor. This is only a specific example.
  • the specific example is applicable to game scenarios in which the two parties deploy equal numbers of persons, for example, 1 versus 1, 3 versus 3, or 5 versus 5, and is also applicable to game scenarios in which the two parties deploy different numbers of persons, for example, 40 versus 20 or 30 versus 60, to run a game mode in which a skill-release location and direction are determined by using a wheel at a fixed location.
  • the multiple users are grouped into different groups, each group includes at least one group member, and the different groups are marked as at least a first group (for example, own group) and a second group (for example, an opponent group).
  • the selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object having the highest priority includes: when the searching range includes multiple first character objects, selecting a first character object with the shortest vertical distance to the first target selection direction line as the target character object having the highest priority, and highlighting the target character object.
  • the method further includes: when it is determined that the virtual joystick object touched by a user finger is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object, where a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object; and obtaining a user finger release gesture and, when no first character object can be found within the searching range, discarding the skill-release operation on the target character object within the searching range, and displaying a second prompt message, where the second prompt message is used to represent that there is no selectable target character object within the searching range.
  • a terminal is provided.
  • a software application is executed on a processor of the terminal and rendering is performed on a display of the terminal to obtain a graphical user interface.
  • the processor, the graphical user interface, and the software application are implemented in a game system.
  • the terminal further includes a first rendering unit 51 , a first detection unit 52 , a second detection unit 53 , and a skill-release unit 54 .
  • the first rendering unit 51 is configured to perform rendering on the graphical user interface to obtain at least one virtual resource object.
  • the first detection unit 52 is configured to: when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, perform rendering on the graphical user interface to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object including a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • the second detection unit 53 is configured to: when detecting a drag operation on the virtual joystick object, control the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
  • the skill-release unit 54 is configured to: determine whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, based on a detected release operation of the drag operation and from at least one character object within a skill releasable range of the skill object, select a target character object having the highest priority and satisfying a first preset policy, and perform a skill-release operation on the target character object.
  • the skill-release operation is performed on the target character object within the skill releasable range according to a release location and/or direction of the skill object obtained through movement of the virtual joystick object along with the dragging of the skill-release operation gesture.
  • the virtual resource object includes various types of objects on the graphical user interface.
  • a user avatar icon for representing a user; an object for representing a building, a tree, a tower defense, or the like in a background;
  • an object for representing a status (such as a blood value or a vitality value) of the user; an object for representing a skill, equipment, or the like of the user;
  • a direction button object for controlling a change of a location of the user; a rendering object used during skill-release by the user; and the like.
  • rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location.
  • Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, the user can respond rapidly in an information exchange process, thereby avoiding wasting the user's response time in searching the graphical user interface.
  • the fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
  • a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location with a touch or a slide of the finger.
  • Because the skill-release supplementary control object can appear at a changeable location following a touch or a slide of the finger, it can well meet the requirement of a user who habitually performs rapid skill-release with a slide of the finger, which differs from the requirement of a user who habitually performs skill-release at a fixed location and aims within a stable control area. It also facilitates a rapid response by the user in an information exchange process, thereby avoiding wasting the user's response time in searching the graphical user interface.
  • a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
  • the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator.
  • the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object.
  • a skill-release controller object and a joystick object obtained by performing rendering in a graphical user interface may have, but are not limited to, a wheel shape, an annular shape, a ring shape, or another shape, as long as the objects can be configured to implement skill control.
  • FIG. 4 is a schematic diagram of a system configuration interface.
  • the UI effect interface shown in FIG. 3 may be obtained by the configuration interface shown in FIG. 4 .
  • the UI shown in FIG. 3 is subsequently obtained by performing rendering.
  • the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged, and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located.
  • the skill indicator is an indicator rendered to assist the user in aiming or to achieve other purposes.
  • the skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
  • FIG. 5 shows an example of the threshold range.
  • the virtual joystick object 42 partially overlaps a threshold range 44 , and is not out of the threshold range.
  • a user finger release gesture is obtained to perform a subsequent action, and a character object satisfying the first preset policy is selected, from the at least one character object within the skill releasable range, as the target character object having the highest priority.
  • FIG. 8 shows an example of the threshold range.
  • the virtual joystick object 42 is located at an edge of the skill-release control halo object 41 , and is out of a threshold range 44 .
  • the supplementary skill indicator object includes a fan-shaped indicator object (other shapes, such as an annular shape or a square shape, can also be used).
  • the fan-shaped indicator object is different from the skill indicator object 43 , and has a smaller attack range, so as to specify a target character object having the highest priority within a skill releasable range specified by the fan-shaped indicator object (also referred to as a searching range or an enemy-searching range).
  • a skill-release operation gesture applied on a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object.
  • the skill-release supplementary control object includes the skill-release control halo object 41 and the virtual joystick object 42 .
  • a skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers an area in which the skill-release control halo object 41 is located.
  • the fan-shaped indicator object 45 has the second target selection direction line 452 obtained through refreshing. Centering around the second target selection direction line 452, the fan-shaped indicator object 45 is formed based on positive and negative offsets of the preset angle a.
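The construction of the fan around the target selection direction line — positive and negative offsets of the preset angle a — can be illustrated as follows. The function name, the coordinate convention, and the use of degrees are assumptions made only for this sketch:

```python
import math

def fan_sector(halo_center, joystick_pos, preset_angle_deg):
    """Compute the angular bounds of the fan-shaped indicator: the target
    selection direction line points from the halo center toward the joystick,
    and the fan spans positive/negative offsets of the preset angle around it."""
    direction = math.degrees(math.atan2(joystick_pos[1] - halo_center[1],
                                        joystick_pos[0] - halo_center[0]))
    return direction - preset_angle_deg, direction + preset_angle_deg

# A joystick dragged straight right with a 45-degree preset angle
# yields the 90-degree fan described in the application scenario.
print(fan_sector((0, 0), (10, 0), 45))  # (-45.0, 45.0)
```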
  • the terminal further includes a skill-release cancellation unit, which is configured to: when a determining result is that the virtual joystick object touched by a user finger is not out of the threshold range, obtain a user finger release gesture and, if the target character object is not found within the skill releasable range, discard the skill-release operation on the target character object within the skill releasable range, and display a first prompt message, where the first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
  • the terminal further includes a second rendering unit and a skill-release unit.
  • the second rendering unit is configured to, when it is determined that the virtual joystick object touched by a user finger is out of the threshold range, perform rendering on the graphical user interface to obtain a supplementary skill indicator object, where a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object.
  • the skill-release unit is configured to: obtain a user finger release gesture, and select, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object having the highest priority; and perform the skill-release operation on the target character object within the searching range.
  • the second rendering unit is further configured to: obtain a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, generate a first target selection direction line in the first relative direction by using a connection line to a center point of the skill indicator object; and, centering around the first target selection direction line, form the supplementary skill indicator object based on positive and negative offsets of a preset angle.
  • the skill-release unit is further configured to, when the searching range includes multiple first character objects, select a first character object with the shortest vertical distance to the first target selection direction line as the target character object having the highest priority, and to highlight the target character object.
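Selecting the first character object with the shortest vertical (perpendicular) distance to the target selection direction line might look like the following — a hypothetical sketch in which characters are reduced to 2-D points and the direction line to an origin plus a direction vector:

```python
import math

def perpendicular_distance(point, line_start, line_dir):
    """Perpendicular distance from a character's position to the target
    selection direction line (given as an origin point plus a direction)."""
    px, py = point[0] - line_start[0], point[1] - line_start[1]
    dx, dy = line_dir
    # |cross product| / |direction| gives the distance to the infinite line
    return abs(px * dy - py * dx) / math.hypot(dx, dy)

def pick_target(candidates, line_start, line_dir):
    """Select the candidate with the shortest vertical distance to the line
    as the target character object having the highest priority."""
    return min(candidates,
               key=lambda c: perpendicular_distance(c, line_start, line_dir))

# Two enemies near a rightward direction line: the closer one is selected.
print(pick_target([(5, 3), (6, 1)], (0, 0), (1, 0)))  # (6, 1)
```

The same selection would be re-run each time the joystick moves, which is what the refreshing unit described next performs with the second target selection direction line.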
  • the terminal further includes a refreshing unit, which is configured to: each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, refresh the first relative direction to a second relative direction and generate, in the second relative direction, a second target selection direction line by using a connection line to the center point of the skill indicator object; and select a first character object with the shortest vertical distance to the second target selection direction line as the target character object having the highest priority, and highlight the target character object, to re-determine and refresh the target character object each time the virtual joystick object moves, and highlight a new target character object obtained through refreshing.
  • the terminal further includes a skill-release unit, which is configured to: when it is determined that the virtual joystick object touched by a user finger is out of the threshold range, perform rendering on the graphical user interface to obtain a supplementary skill indicator object, where a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object; obtain a user finger release gesture and, if the first character object used as the target character object is not found from the searching range, discard the skill-release operation on the target character object within the searching range, and display a second prompt message, where the second prompt message is used to represent that there is no selectable target character object within the searching range.
  • a terminal is provided. As shown in FIG. 12 , the terminal includes: a display 61 and a processor 62 .
  • the display 61 is configured to present a graphical user interface obtained by executing a software application on the processor of the terminal and performing rendering on the software application.
  • the graphical user interface is configured to facilitate control processing in man-machine interaction.
  • the processor 62 is configured to perform the information processing method in the embodiments of the present invention.
  • the processor, the graphical user interface, and the software application are implemented in a game system.
  • the terminal further includes: a memory 63 , an input device 64 (for example, a peripheral device such as a collection device including a camera, a microphone, and a headset; a mouse, a joystick, or a desktop computer keyboard; or a physical keyboard or a touchscreen on a notebook computer or a tablet computer), an output device 65 (for example, an audio output device or a video output device including a speaker, a headset, and the like), a bus 66 , and a networking device 67 .
  • the processor 62, the memory 63, the input device 64, the display 61, and the networking device 67 are connected by using the bus 66, and the bus 66 is used for data transmission and communication between the processor 62, the memory 63, the input device 64, the display 61, and the networking device 67.
  • the input device 64 is mainly configured to obtain an input operation of a user, and the input device 64 may vary with the terminal.
  • when the terminal is a desktop computer, the input device 64 may be an input device such as a mouse or a keyboard; when the terminal is a portable device such as a smartphone or a tablet computer, the input device 64 may be a touchscreen.
  • the networking device 67 is used by multiple terminals and a server to connect over a network for uploading and downloading data, and used by multiple terminals to connect and perform data transmission over a network.
  • the server may be formed by a cluster system, and to implement the functions of the various units, the functions may be combined in one electronic device, or the functions of the units may be separately provided in an electronic device.
  • Either the terminal or the server at least includes a database for storing data and a processor for data processing, or includes a storage medium disposed in the server or a storage medium that is disposed separately.
  • for data processing, a microprocessor, a central processing unit (CPU), a digital signal processor (DSP), or a field programmable gate array (FPGA) may be used for implementation.
  • the storage medium includes an operation instruction, the operation instruction may be computer executable code, and steps in the procedure of the information processing method in the embodiments of the present invention are implemented by using the operation instruction.
  • a computer storage medium is provided.
  • a computer executable instruction is stored in the computer storage medium, and the computer executable instruction is configured to perform the information processing method in the embodiments of the present invention.
  • an application scenario of a game system may be implemented according to the embodiments of the present invention.
  • This application scenario is related to Multiplayer Online Battle Arena Games (MOBA).
  • related terms are as follows: 1) the UI layer, that is, icons in the graphical user interface; 2) a skill indicator: used to supplement a skill-release special effect, halo, or operation; 3) lens, which may be understood as a camera in the game; 4) mini map: a scaled-down version of a large map, which may be understood as a radar map, where information and locations of two parties are displayed in the map; 5) wheel: a halo displayed above a skill key when the skill key is pressed; and 6) virtual joystick: a control for an operation and locating on the wheel.
  • a flow chart of the application scenario includes the following, where the first circle is an indicator, the second circle is a secondary attack range, and the third circle is an enemy searching range.
  • a system invokes a supplementary spellcasting wheel, and detects whether a virtual joystick touched by a finger of the player is out of a movement threshold set by the system.
  • the joystick is not out of a threshold range (as shown in FIG. 5 )
  • the player releases the finger, and the system searches, within a skill releasable range according to a determined priority, for a target currently having the highest priority, and performs skill-release on the target (which is also referred to as an automatic enemy searching mechanism). If there is no target within the range, the system discards current spellcasting, and prompts the player that “there is no selectable target within the range”.
  • the joystick is out of the threshold range (as shown in FIG. 8 )
  • the following operations in (2) are performed.
  • a fan-shaped indicator is invoked and projected to the scene at a location of the joystick relative to a center of the wheel; a connection line to the center point is drawn in the relative direction as a target selection direction line, and a 90-degree fan shape is formed based on positive and negative 45-degree offsets centering around the direction line, as shown in FIG. 9 .
  • if there is a hero within the range, the hero is selected as a skill target and is highlighted. If there are multiple heroes within the range, a hero with the shortest vertical distance to the direction line is selected as the skill target and highlighted.
  • Target determination is refreshed each time the virtual joystick moves, as shown in FIG. 10 , and a new target is highlighted immediately. In this case, when the finger is released, the skill-release operation is performed on the highlighted target. If there is no target within the range, the player is prompted that “there is no selectable target within the range”, and skill-release is canceled.
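Putting the two branches together, the release handling described above — automatic enemy searching inside the threshold, fan-based searching outside it, and cancellation with a prompt when no target exists — can be sketched as below. All names, the squared-distance check, and the string return values are illustrative assumptions:

```python
def handle_release(joystick_pos, wheel_center, threshold_radius,
                   releasable_targets, searching_targets, pick_by_priority):
    """On finger release: choose the candidate set by the threshold check,
    then release the skill on the highest-priority target, or cancel
    spellcasting with a prompt when the set is empty."""
    out = ((joystick_pos[0] - wheel_center[0]) ** 2 +
           (joystick_pos[1] - wheel_center[1]) ** 2) > threshold_radius ** 2
    candidates = searching_targets if out else releasable_targets
    if not candidates:
        return "there is no selectable target within the range"
    return pick_by_priority(candidates)

# Joystick inside the threshold with no target in range: spellcasting
# is discarded and the player is prompted.
print(handle_release((5, 5), (0, 0), 50, [], ["hero"], lambda ts: ts[0]))
```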
  • FIG. 14 is a schematic flowchart of specific interaction in an information processing method in this application scenario.
  • a terminal 1 , a terminal 2 , and a server are included.
  • the user 1 performs triggering and control by using the terminal 1
  • the user 2 performs triggering and control by using the terminal 2 ; and the method includes the following.
  • step 11 to step 16 are included.
  • Step 11 The user 1 triggers the game system by using the terminal 1 , and registers identity authentication information, where the identity authentication information may be a user name and a password.
  • Step 12 The terminal 1 transmits the obtained identity authentication information to the server 3 , and the server 3 performs identity authentication, and returns a first graphical user interface to the terminal 1 after the identity authentication succeeds, where the first graphical user interface includes a virtual resource object.
  • Step 13 A specified virtual resource object (such as an SMS message object in FIG. 3 ) can respond based on a touch operation of the user 1 , and performs a series of virtual operations in step 14 to step 17 .
  • Step 14 Terminal 1 performs rendering on the graphical user interface to obtain a skill-release supplementary control object when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, the skill-release supplementary control object including a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • the skill releasable range specified by the skill indicator object completely covers the area in which the skill-release control halo object is located.
  • Step 15 When detecting a drag operation on the virtual joystick object, control a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
  • Step 16 Determine whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, according to a detected release operation of the drag operation, select a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object, and perform a skill-release operation on the target character object.
  • Step 17 Synchronizing an execution result obtained by performing step 14 to step 16 to the server, or instantly transferring the execution result to the terminal 2 by using the server, or directly forwarding the execution result to the terminal 2 , so that the user 2 that logs in to the game system by using the terminal 2 can respond to the virtual operation of the user 1 , so as to implement interaction between multiple terminals.
  • interaction between two terminals is used as an example, and during an actual operation, interaction between multiple terminals may not be limited to interaction between the two terminals in this example.
  • step 21 to step 26 are included.
  • Step 21 The user 2 triggers the game system by using the terminal 2 , and registers identity authentication information, where the identity authentication information may be a user name and a password.
  • Step 22 The terminal 2 transmits the obtained identity authentication information to the server 3 , and the server 3 performs identity authentication, and returns a second graphical user interface to the terminal 2 after the identity authentication succeeds, where the second graphical user interface includes a virtual resource object.
  • Step 23 A specified virtual resource object (such as an SMS message object in FIG. 3 ) can respond based on a touch operation of the user 2 , and performs a series of virtual operations in step 24 to step 27 .
  • Step 24 When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, terminal 2 performs rendering on the graphical user interface to obtain a skill-release supplementary control object, where the skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
  • the skill releasable range specified by the skill indicator object completely covers the area in which the skill-release control halo object is located.
  • Step 25 When detecting a drag operation on the virtual joystick object, control a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
  • Step 26 Determine whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, according to a detected release operation of the drag operation, select a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object, and perform a skill-release operation on the target character object.
  • Step 27 Synchronizing an execution result obtained by performing step 24 to step 26 to the server, or instantly transferring the execution result to the terminal 1 by using the server, or directly forwarding the execution result to the terminal 1 , so that the user 1 that logs in to the game system by using the terminal 1 can respond to the virtual operation of the user 2 , so as to implement interaction between multiple terminals.
  • interaction between two terminals is used as an example, and during an actual operation, interaction between multiple terminals may not be limited to interaction between the two terminals in this example.
  • Step 30 Optionally, after receiving a first man-machine interaction execution result obtained by step 14 to step 17 and/or a second interaction execution result obtained by step 24 to step 27 , the server synchronizes or transfers the first man-machine interaction execution result and/or the second interaction execution result to the corresponding terminals.
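The relay role the server plays in steps 17, 27, and 30 can be sketched as a small message-forwarding loop. The class names, the in-memory inbox, and the string payload are assumptions for illustration; a real game system would use networked sessions:

```python
class Server:
    """Sketch of the server that stores execution results and forwards
    them to the other terminals so each user can respond (Step 30)."""
    def __init__(self):
        self.results = []
        self.terminals = []

    def register(self, terminal):
        self.terminals.append(terminal)

    def synchronize(self, sender, execution_result):
        # Store the result, then forward it to every other terminal.
        self.results.append(execution_result)
        for t in self.terminals:
            if t is not sender:
                t.inbox.append(execution_result)

class Terminal:
    def __init__(self, server):
        self.inbox = []
        server.register(self)

# Terminal 1's execution result reaches terminal 2 through the server.
srv = Server()
t1, t2 = Terminal(srv), Terminal(srv)
srv.synchronize(t1, "skill released on target")
print(t2.inbox)  # ['skill released on target']
```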
  • the disclosed devices and methods may be implemented in other manners.
  • the described device embodiments are merely examples.
  • the unit division is merely logical function division and may be other division during actual implementation.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual couplings or direct couplings or communication connections between constituent parts may be implemented through some interfaces.
  • the indirect couplings or communication connections between the devices or units may be implemented in electronic, mechanical, or other forms.
  • the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one location, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the embodiments of the present invention may all be integrated into one processing unit, or each of the units may exist separately, or two or more units are integrated into one unit, and the integrated unit may be implemented in a form of hardware, or may be implemented in a form of hardware in addition to a software functional unit.
  • the program may be stored in a computer-readable storage medium. When the program runs, the steps of the method embodiments are performed.
  • the foregoing storage medium includes: any medium that can store program code, such as a portable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • the integrated unit when the integrated unit is implemented in a form of a software functional module and sold or used as an independent product, the integrated unit may be stored in a computer-readable storage medium.
  • the computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in the embodiments of the present invention.
  • the foregoing storage medium includes: any medium that can store program code, such as a portable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
  • a target character object is selected and located by determining whether a virtual joystick object touched by a user finger is out of a threshold range.
  • a finger release gesture of the user is obtained; and a character object satisfying a first preset policy is selected from at least one character object within a skill releasable range as the target character object having the highest priority.
  • a skill-release operation is then performed on the target character object. In such a way of releasing the specific skill, the target object for the skill-release can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed due to the improved locating accuracy.

Abstract

An information processing method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object, performing rendering to obtain a skill-release supplementary control object, having a skill-release control halo object and a virtual joystick object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object.

Description

RELATED APPLICATION
This patent application is a continuation application of U.S. patent application Ser. No. 15/720,432, filed on Sep. 29, 2017. U.S. patent application Ser. No. 15/720,432 is a continuation application of PCT Patent Application No. PCT/CN2016/083208, filed on May 24, 2016, which claims priority to Chinese Patent Application No. 201510654167.4, filed by Tencent Technology (Shenzhen) Company Limited on Oct. 10, 2015, and entitled “INFORMATION PROCESSING METHOD, TERMINAL, AND COMPUTER STORAGE MEDIUM”, the entire contents of all the above applications being incorporated herein by reference.
FIELD OF THE TECHNOLOGY
The present disclosure relates to information exchange technologies and, more particularly, to an information processing method, terminal, and computer storage medium.
BACKGROUND OF THE DISCLOSURE
With large-screen and super-screen intelligent terminals gradually gaining popularity, processors of the intelligent terminals have increasingly high processing capability, so that many applications have been developed to realize operation and control on the large-screen or super-screen based on man-machine interaction. When performing the operation and control based on man-machine interaction, multiple users may run different interaction modes by creating groups in one-to-one, one-to-many, and many-to-many formats, so as to obtain different interaction results. For example, in a graphical user interface obtained through rendering on a large screen or a super screen, after multiple users are grouped into two different groups, through the operation and control processing of man-machine interaction, information exchange may be performed between the different groups, and different interaction results are obtained according to responses to the information exchange. Through the operation and control processing of man-machine interaction, information exchange may also be performed among group members in a same group, and different interaction results are obtained according to responses to the information exchange.
In the existing technology, in an information exchange process, release of a specific skill may be triggered to enrich a presentation form and content of information, and different presentation forms and content of information may finally lead to different interaction results. However, currently, the way to release a specific skill cannot accurately and rapidly locate a target object to which the specific skill is directed, easily causing mis-operation, and impacting the interaction processing speed due to locating inaccuracy.
SUMMARY
In view of this, embodiments of the present invention provide an information processing method, terminal, and computer storage medium, so as to resolve at least one problem in the existing technology, so that a target object of a skill-release operation can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed due to the improved locating accuracy.
The technical solutions in the embodiments of the present invention are implemented as follows.
An embodiment of the present invention provides an information processing method implemented by a computer system. The method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object.
An embodiment of the present invention further provides a terminal, the terminal including: a display, a memory storing instructions, and a processor coupled to the memory. When executing the instructions, the processor is configured for: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object.
An embodiment of the present invention further provides a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium contains computer-executable instructions for, when executed by a processor, performing an information processing method. The method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of various hardware devices for performing information exchange according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an information processing method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a user interface (UI) according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a system setting interface for generating the UI effect in FIG. 3 according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another UI according to an embodiment of the present invention;
FIG. 6 is a schematic flowchart of another information processing method according to an embodiment of the present invention;
FIG. 7 is a schematic flowchart of another information processing method according to an embodiment of the present invention;
FIG. 8 to FIG. 10 are schematic diagrams of multiple UIs according to embodiments of the present invention;
FIG. 11 is a schematic diagram of an information processing terminal according to an embodiment of the present invention;
FIG. 12 is a schematic hardware structural diagram of an information processing terminal according to an embodiment of the present invention;
FIG. 13 is a schematic flowchart of implementation of a specific application scenario according to an embodiment of the present invention; and
FIG. 14 is a schematic flowchart of implementation of another specific application scenario according to an embodiment of the present invention.
DESCRIPTION OF EMBODIMENTS
Implementation of the technical solutions is further described in detail below with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of various hardware devices in an operating environment for performing information exchange according to an embodiment of the present invention. As shown in FIG. 1, the operating environment includes: one or more servers, where a server 11 is only an example, terminal devices 21 to 25, and a network 31. The network 31 includes network entities such as routers and gateways (not shown).
The terminal devices 21 to 25 perform information exchange with the server by using a wired network or a wireless network, so as to download an application, an application update data packet, and/or application related data information or service information from the server 11. Various types of the terminal devices are shown in FIG. 1, including a mobile phone (terminal 23), a tablet computer or a PDA (terminal 25), a desktop computer (terminal 22), a PC (terminal 24), an all-in-one PC (terminal 21), and other types. Various applications required by a user are installed in the terminal device, for example, an application having an entertainment function (such as a video application, an audio play application, a game application, or reading software) or an application having a serving function (such as a map navigation application, or a group purchasing application).
Using the electronic game scene as an example, by using the network 31, the terminal devices 21 to 25 download a game application, a game application update data packet, and/or game application related data information or service information from the server 11 according to a requirement. According to embodiments of the present invention, after the game application is started on the terminal device and a game interface obtained through rendering is entered, when a skill-release trigger gesture is detected on at least one skill object located in at least one skill operation area in the game interface, a skill-release supplementary control object is obtained through rendering on the graphical user interface.
The skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object. When a drag operation on the virtual joystick object is detected, a skill-release location of the skill object is correspondingly adjusted on the graphical user interface, and it is determined whether the virtual joystick object is out of a threshold range. When the virtual joystick object is not out of the threshold range, based on a detected release operation of the drag operation, a target character object satisfying a first preset policy is selected from at least one character object within a skill releasable range of the skill object. Further, a skill-release operation is performed on the selected target character object.
Because the target character object can be selected and located by determining whether the virtual joystick object touched by a user finger is out of a threshold range, the user can respond rapidly in the information exchange process without wasting response time on searching the graphical user interface. In such a manner of releasing the specific skill, a target object for skill-release can be located accurately and rapidly; the improved locating accuracy avoids mis-operation and improves interaction processing speed.
The example in FIG. 1 is only an example of a system architecture for implementing the embodiments of the present invention, and the embodiments of the present invention are not limited to the system architecture in FIG. 1. Based on this system architecture, various embodiments of the present invention are provided in the following.
In an embodiment of the present invention, an information processing method is provided. A software application is executed on a processor of a terminal and rendering is performed on a display of the terminal to obtain a graphical user interface. The processor, the graphical user interface, and the software application are implemented in, for example, a game system. As shown in FIG. 2, the method includes the following steps.
Step 101: Performing rendering on the graphical user interface to obtain at least one virtual resource object.
The virtual resource object includes various types of objects on the graphical user interface. For example, a user avatar icon for representing a user, an object for representing a building, a tree, tower defense, or the like in a background, an object for representing a status (such as a blood value or a vitality value) of the user, an object for representing a skill, equipment, or the like of the user, a direction button object for controlling a change of a location of the user, a rendering object used during skill-release by the user, and the like, shall all fall within the protection scope of the “virtual resource object” of the embodiments of the present invention.
Step 102: When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
Specifically, two methods may be used. In a first method, rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location. Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, it facilitates the user to respond rapidly in an information exchange process, thereby avoiding wasting the user's response time for searching the graphical user interface. The fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
In a second method, a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location following the touch or slide of the finger. Because the skill-release supplementary control object can appear at a changeable location following the touch or slide of the finger, this method well meets the requirement of a user accustomed to rapidly performing skill-release with a slide of the finger, which differs from the requirement of a user accustomed to performing skill-release at a fixed location and aiming within a stable control area. It also facilitates a rapid response by the user in an information exchange process, thereby avoiding wasting the user's response time on searching the graphical user interface.
Using the first method as an example, as shown in FIG. 3, in a skill operation area 40 of a graphical user interface, a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42. A skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
Specifically, as shown in FIG. 3, the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator. It should be noted that, the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object. A skill-release controller object and a joystick object that are obtained by performing rendering in a graphical user interface include, but are not limited to, a shape of a wheel, an annular shape, a ring shape, and other shapes, as long as objects that may be configured to implement skill control can be implemented.
FIG. 4 is a schematic diagram of a system configuration interface. The UI effect shown in FIG. 3 may be obtained through the configuration interface shown in FIG. 4. As shown in FIG. 4, when a user selects the option to set the wheel appearance location to the fixed location, the UI shown in FIG. 3 is subsequently obtained by rendering to match this setting. In the UI, the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located. The skill indicator is an object on which rendering is performed to assist the user in aiming or to achieve other purposes. The skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
Step 103: When detecting a drag operation on the virtual joystick object, controlling the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
Step 104: Determining whether the virtual joystick object is out of a threshold range. When it is determined that the virtual joystick object is not out of the threshold range, based on a detected release operation of the drag operation and from at least one character object within a skill releasable range of the skill object, selecting a target character object having the highest priority and satisfying a first preset policy, and performing a skill-release operation on the target character object.
FIG. 5 shows an example of the threshold range. As shown in FIG. 5, the virtual joystick object 42 partially overlaps a threshold range 44, and is not out of the threshold range. In such a case, a user finger release gesture is obtained to perform a subsequent action, and a character object satisfying the first preset policy is selected, from the at least one character object within the skill releasable range, as the target character object having the highest priority.
Further, performing a skill-release operation on the target character object may include: based on a release location and/or direction of the skill object obtained by the movement of the virtual joystick object along the dragging of the skill-release operation gesture, performing the skill-release operation on the target character object within the skill releasable range.
According to the embodiments of the present invention, a target character object is selected and located by determining whether a virtual joystick object touched by a user finger is out of a threshold range. A finger release gesture of the user is obtained when it is determined that the virtual joystick object touched by the user finger is not out of the threshold range; and a character object satisfying a first preset policy is selected from at least one character object within a skill releasable range as the target character object having the highest priority. A skill-release operation is then performed on the target character object within the skill releasable range according to a release location and/or direction of the skill object obtained through movement of the virtual joystick object along with the skill-release operation gesture. In such a way of releasing the specific skill, the target object for the skill-release can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed due to the improved locating accuracy.
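By way of a non-limiting illustration, the threshold determination and the selection of a target character object under the first preset policy may be sketched as follows. The planar coordinate representation, the threshold radius, and the use of nearest-distance as the highest-priority policy are illustrative assumptions, not part of the embodiments:

```python
import math

THRESHOLD_RADIUS = 50.0  # assumed threshold range radius, in screen units

def joystick_out_of_threshold(joystick_pos, halo_center, threshold=THRESHOLD_RADIUS):
    """Return True when the virtual joystick object lies outside the threshold range."""
    dx = joystick_pos[0] - halo_center[0]
    dy = joystick_pos[1] - halo_center[1]
    return math.hypot(dx, dy) > threshold

def select_target(player_pos, characters, releasable_range):
    """Select the target character object within the skill releasable range.
    The first preset policy is assumed here to be nearest-distance-first;
    returns None when no character object is within range."""
    def dist(c):
        return math.hypot(c["x"] - player_pos[0], c["y"] - player_pos[1])
    candidates = [c for c in characters if dist(c) <= releasable_range]
    return min(candidates, key=dist) if candidates else None
```

On release of the drag operation, the skill-release operation would be performed on the object returned by `select_target` when it is not `None`.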
In an embodiment of the present invention, another information processing method is provided. The method is applied to an electronic device, the electronic device includes a display unit, and the display unit includes a display area. As shown in FIG. 6, the method includes the followings.
Step 201: Performing rendering in a graphical user interface to obtain at least one virtual resource object.
The virtual resource object includes various types of objects on the graphical user interface. For example, a user avatar icon for representing a user, an object for representing a building, a tree, tower defense, or the like in a background, an object for representing a status (such as a blood value or a vitality value) of the user, an object for representing a skill, equipment, or the like of the user, a direction button object for controlling a change of a location of the user, a rendering object used during skill-release by the user, and the like, shall all fall within the protection scope of the “virtual resource object” of the embodiments of the present invention.
Step 202: When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
Specifically, two methods may be used. In a first method, rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location. Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, it facilitates the user to respond rapidly in an information exchange process, thereby avoiding wasting the user's response time for searching the graphical user interface. The fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
In a second method, a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location following the touch or slide of the finger. Because the skill-release supplementary control object can appear at a changeable location following the touch or slide of the finger, this method well meets the requirement of a user accustomed to rapidly performing skill-release with a slide of the finger, which differs from the requirement of a user accustomed to performing skill-release at a fixed location and aiming within a stable control area. It also facilitates a rapid response by the user in an information exchange process, thereby avoiding wasting the user's response time on searching the graphical user interface.
Using the first method as an example, as shown in FIG. 3, in a skill operation area 40 of a graphical user interface, a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42. A skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
Specifically, as shown in FIG. 3, the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator. It should be noted that, the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object. A skill-release controller object and a joystick object that are obtained by performing rendering in a graphical user interface include, but are not limited to, a shape of a wheel, an annular shape, a ring shape, and other shapes, as long as objects that may be configured to implement skill control can be implemented.
FIG. 4 is a schematic diagram of a system configuration interface. The UI effect shown in FIG. 3 may be obtained through the configuration interface shown in FIG. 4. As shown in FIG. 4, when a user selects the option to set the wheel appearance location to the fixed location, the UI shown in FIG. 3 is subsequently obtained by rendering to match this setting. In the UI, the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located. The skill indicator is an object on which rendering is performed to assist the user in aiming or to achieve other purposes. The skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
Step 203: When detecting a drag operation on the virtual joystick object, controlling the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
Step 204: When detecting that the virtual joystick object moves along with a skill-release operation gesture, determining whether the virtual joystick object is out of a threshold range. When it is determined that the virtual joystick object is not out of the threshold range, based on a detected release operation of the drag operation and from at least one character object within a skill releasable range of the skill object, selecting a target character object having the highest priority and satisfying a first preset policy. When no such target character object can be selected within the skill releasable range, discarding the skill-release operation and displaying a first prompt message.
The first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
FIG. 5 shows an example of the threshold range. As shown in FIG. 5, the virtual joystick object 42 partially overlaps a threshold range 44, and is not out of the threshold range. In such a case, a user finger release gesture is obtained to perform a subsequent action, and a character object satisfying the first preset policy is selected, from the at least one character object within the skill releasable range, as the target character object having the highest priority.
According to the embodiments of the present invention, a target character object is selected and located by determining whether the virtual joystick object touched by a user finger is out of a threshold range. When it is determined that the virtual joystick object is not out of the threshold range, a finger release gesture of the user is obtained; if no character object satisfying the first preset policy can be selected as the target character object from the at least one character object within the skill releasable range, the skill-release operation is discarded and a first prompt message is displayed. The first prompt message indicates to the user that there is no selectable target character object within the skill releasable range. In such a manner of releasing the specific skill, a target object for skill-release can be located accurately and rapidly, and the skill-release operation is discarded when no target object is found, which avoids mis-operation and allows subsequent actions to re-determine the target object, improving interaction processing speed due to the improved locating accuracy.
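The branch of Step 204 that either performs the release or discards it and shows the first prompt message can be sketched, under illustrative assumptions (planar coordinates, nearest-target priority policy, assumed prompt wording), as:

```python
import math

def try_release_skill(player_pos, characters, releasable_range):
    """Attempt the skill release when the drag operation is released.
    Returns ("release", target) when a target satisfying the assumed
    nearest-target policy exists within the releasable range, or
    ("prompt", message) when the skill-release operation is discarded
    because no selectable target exists."""
    def dist(c):
        return math.hypot(c["x"] - player_pos[0], c["y"] - player_pos[1])
    candidates = [c for c in characters if dist(c) <= releasable_range]
    if not candidates:
        # First prompt message: no selectable target within the releasable range.
        return ("prompt", "No selectable target within the skill releasable range")
    return ("release", min(candidates, key=dist))
```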
In an embodiment of the present invention, another information processing method is provided. The method is applied to an electronic device, the electronic device includes a display unit, and the display unit includes a display area. As shown in FIG. 7, the method includes the following steps:
Step 301: Performing rendering on the graphical user interface to obtain at least one virtual resource object.
The virtual resource object includes various types of objects on the graphical user interface. For example, a user avatar icon for representing a user, an object for representing a building, a tree, tower defense, or the like in a background, an object for representing a status (such as a blood value or a vitality value) of the user, an object for representing a skill, equipment, or the like of the user, a direction button object for controlling a change of a location of the user, a rendering object used during skill-release by the user, and the like, shall all fall within the protection scope of the “virtual resource object” of the embodiments of the present invention.
Step 302: When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
Specifically, two methods may be used. In a first method, rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location. Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, it facilitates the user to respond rapidly in an information exchange process, thereby avoiding wasting the user's response time for searching the graphical user interface. The fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
In a second method, a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location following the touch or slide of the finger. Because the skill-release supplementary control object can appear at a changeable location following the touch or slide of the finger, this method well meets the requirement of a user accustomed to rapidly performing skill-release with a slide of the finger, which differs from the requirement of a user accustomed to performing skill-release at a fixed location and aiming within a stable control area. It also facilitates a rapid response by the user in an information exchange process, thereby avoiding wasting the user's response time on searching the graphical user interface.
Using the first method as an example, as shown in FIG. 3, in a skill operation area 40 of a graphical user interface, a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42. A skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
Specifically, as shown in FIG. 3, the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator. It should be noted that, the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object. A skill-release controller object and a joystick object that are obtained by performing rendering in a graphical user interface include, but are not limited to, a shape of a wheel, an annular shape, a ring shape, and other shapes, as long as objects that may be configured to implement skill control can be implemented.
FIG. 4 is a schematic diagram of a system configuration interface. The UI effect shown in FIG. 3 may be obtained through the configuration interface shown in FIG. 4. As shown in FIG. 4, when a user selects the option to set the wheel appearance location to the fixed location, the UI shown in FIG. 3 is subsequently obtained by rendering to match this setting. In the UI, the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located. The skill indicator is an object on which rendering is performed to assist the user in aiming or to achieve other purposes. The skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
Step 303: When detecting a drag operation on the virtual joystick object, controlling the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
Step 304: Determining whether the virtual joystick object is out of a threshold range. When it is determined that the virtual joystick object is out of the threshold range, a supplementary skill indicator object is obtained by rendering on the graphical user interface, and a skill releasable range specified by the supplementary skill indicator object is used as the searching range for a first character object.
FIG. 8 shows an example of the threshold range. As shown in FIG. 8, the virtual joystick object 42 is located at an edge of the skill-release control halo object 41, and is out of a threshold range 44. Further, the supplementary skill indicator object includes a fan-shaped indicator object (other shapes, such as an annular shape or a square shape, can also be used). The fan-shaped indicator object is different from the skill indicator object 43, and has a smaller attack range, so as to specify a target character object having the highest priority within a skill releasable range specified by the fan-shaped indicator object (also referred to as a searching range or an enemy-searching range).
Step 305: Obtaining a user finger release gesture; selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object having the highest priority; and performing a skill-release operation on the selected target character object.
Specifically, within the searching range, based on the release location and/or direction of the skill object obtained by the virtual joystick object following the dragging motion of the skill-release operation gesture, the skill-release operation is performed on the target character object.
According to the embodiment of the present invention, a target character object is selected and located by determining whether a virtual joystick object touched by a user finger is out of a threshold range and, when it is determined that the virtual joystick object touched by the user finger is out of the threshold range, rendering is performed on the graphical user interface to obtain a supplementary skill indicator object. The skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object. Further, a user finger release gesture is obtained, and a first character object satisfying a second preset policy as the target character object having the highest priority is selected from at least one first character object within the searching range; and the skill-release operation is performed on the target character object within the searching range. In such a way of releasing the specific skill, a target object for skill-release can be located accurately and rapidly, avoiding mis-operation and improving interaction processing speed due to the improved locating accuracy.
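A fan-shaped searching range such as the one specified by the supplementary skill indicator object can be tested for membership as follows; the planar coordinates, degree-based angle convention, and the particular origin, direction, half-angle, and radius values are illustrative assumptions:

```python
import math

def in_fan_sector(origin, direction_deg, half_angle_deg, radius, point):
    """Return True when point lies inside a fan-shaped searching range
    centered on direction_deg, with the given half angle and radius."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    if math.hypot(dx, dy) > radius:
        return False  # outside the attack range of the fan-shaped indicator
    angle = math.degrees(math.atan2(dy, dx))
    diff = (angle - direction_deg + 180.0) % 360.0 - 180.0  # signed angular difference
    return abs(diff) <= half_angle_deg
```

Character objects for which `in_fan_sector` returns `True` would form the candidate set from which the second preset policy selects the target character object.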
According to the above embodiments, when selecting and locating a target character object by determining whether the virtual joystick object touched by the user finger is out of a threshold range, the skill-release control halo object (such as a wheel) may have two threshold ranges, and the virtual joystick object (such as a joystick) may move within the two ranges to implement skill-release. If the joystick moves within the innermost range, a release target is automatically searched within the skill releasable range centered on a player character object (or a release target is searched in a direction along a connection line from the center of the wheel to the joystick). When there is no target, the skill-release operation is not performed and a prompt is displayed.
On the other hand, when a user drags the joystick to an area within the outermost threshold range, a skill indicator having a fan shape or another shape (such as a fan-shaped indicator) is displayed using the connection line from the center of the wheel to the joystick as a centerline and the player character target as the center point.
In an implementation of the embodiment of the present invention, the performing rendering on the graphical user interface to obtain a supplementary skill indicator object includes: obtaining a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, and generating, in the first relative direction, a first target selection direction line by using a connection line to a center point of the skill indicator object; and forming the supplementary skill indicator object based on positive and negative offsets of a preset angle, centered on the first target selection direction line.
For example, as shown in FIG. 9, in a skill operation area of a graphical user interface, a skill-release operation gesture applied on a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes the skill-release control halo object 41 and the virtual joystick object 42. A skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers an area in which the skill-release control halo object 41 is located.
Specifically, as shown in FIG. 9, the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator.
Rendering is performed on the graphical user interface to obtain a supplementary skill indicator object when the virtual joystick object touched by a user finger is out of the threshold range, such as a fan-shaped indicator object 45 in FIG. 9. The fan-shaped indicator object 45 has a first target selection direction line 451, which lies along the connection line between the virtual joystick object and the skill-release control halo object. Using the player character f1 as an origin point and centered on the first target selection direction line 451, the fan-shaped indicator object 45 is formed based on positive and negative offsets of the preset angle ‘a’. In FIG. 9, the person f1 is a member of the user's side.
In an implementation of the embodiment of the present invention, the method further includes: each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, refreshing the first relative direction to a second relative direction, and generating, in the second relative direction, a second target selection direction line by using a connection line to the center point of the skill indicator object; and selecting a first character object with the shortest vertical distance to the second target selection direction line as the target character object having the highest priority and highlighting the target character object, so as to re-determine and refresh the target character object each time the virtual joystick object moves, and to highlight a new target character object obtained through refreshing.
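The refresh logic above amounts to recomputing the direction line each time the joystick moves and re-selecting the character with the shortest perpendicular ("vertical") distance to it. A minimal sketch, with assumed names and a plain 2-D point representation:

```python
import math

def perpendicular_distance(point, origin, direction_rad):
    """Distance from `point` to the line through `origin` at angle
    `direction_rad` (the target selection direction line)."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    # Magnitude of the component perpendicular to the unit direction vector.
    return abs(dx * math.sin(direction_rad) - dy * math.cos(direction_rad))

def refresh_target(characters, origin, direction_rad):
    """Re-determine the target character object: the candidate with the
    shortest vertical distance to the (refreshed) direction line wins;
    returns None when there is no candidate to highlight."""
    if not characters:
        return None
    return min(characters,
               key=lambda c: perpendicular_distance(c, origin, direction_rad))
```

Calling `refresh_target` on every joystick movement, with the angle refreshed from the second relative direction, mirrors the re-determine-and-highlight behavior described above.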
For example, as shown in FIG. 10, in a skill operation area of a graphical user interface, a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42. A skill-release control operation is subsequently triggered, so that a location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers an area in which the skill-release control halo object 41 is located.
Compared with FIG. 9, in FIG. 10, after the virtual joystick object 42 moves relative to the skill-release control halo object 41, the first target selection direction line 451 needs to be refreshed to the current second target selection direction line 452. In this case, the fan-shaped indicator object 45 has the second target selection direction line 452 obtained through refreshing. Centered on the second target selection direction line 452, the fan-shaped indicator object 45 is formed based on positive and negative offsets of the preset angle a. In FIG. 10, the person f1 is a member of the user's side, and the persons g1 and g2 are enemies (members of the opposing side).
In a process of implementing control based on man-machine interaction on the graphical user interface, multiple users can create different groups in one-to-one, one-to-many, and many-to-many formats to run different interaction modes, and different interaction results can therefore be obtained. The different interaction modes include a versus mode between multiple online users, and further include an offline versus mode without an Internet connection.
The versus mode between multiple users and the offline versus mode without an Internet connection are both applicable to the UIs shown in FIG. 3, FIG. 5, FIG. 8, FIG. 9, and FIG. 10. It should be noted that, the UIs shown in FIG. 3, FIG. 5, FIG. 8, FIG. 9, and FIG. 10 are obtained through rendering by a processor of a terminal device, specifically an image processor. This is only a specific example. The specific example is applicable to different personnel deployment scenarios in a game, for example, scenarios in which the numbers of deployed persons of the two parties are 1 to 1, 3 to 3, or 5 to 5, and is also applicable to scenarios in which the numbers of deployed persons of the two parties are different, for example, 40 to 20 or 30 to 60, to run a mode in a game in which a skill-release location and direction are determined by using a wheel at a fixed location. For example, in the versus mode between multiple online users, the multiple users are grouped into different groups, each group includes at least one group member, and the different groups are marked as at least a first group (for example, an own group) and a second group (for example, an opponent group). If there is only one person in the own group and only one person in the opponent group, it is the “1 to 1” mode mentioned above. If there are three persons in the own group and three persons in the opponent group, it is the “3 to 3” mode mentioned above. If there are five persons in the own group and five persons in the opponent group, it is the “5 to 5” mode mentioned above. Certainly, the opponent group and the own group do not necessarily deploy the same number of persons; different numbers of persons may be deployed.
In an implementation of the embodiment of the present invention, the selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object having the highest priority includes: when the searching range includes multiple first character objects, selecting a first character object with the shortest vertical distance to the first target selection direction line as the target character object having the highest priority, and highlighting the target character object.
In an implementation of the embodiment of the present invention, the method further includes: when it is determined that the virtual joystick object touched by a user finger is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object, where a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object; and obtaining a user finger release gesture and, when no first character object can be found within the searching range, discarding the skill-release operation on the target character object within the searching range, and displaying a second prompt message, where the second prompt message is used to represent that there is no selectable target character object within the searching range.
In an embodiment of the present invention, a terminal is provided. A software application is executed on a processor of the terminal and rendering is performed on a display of the terminal to obtain a graphical user interface. The processor, the graphical user interface, and the software application are implemented in a game system. As shown in FIG. 11, the terminal further includes a first rendering unit 51, a first detection unit 52, a second detection unit 53, and a skill-release unit 54.
The first rendering unit 51 is configured to perform rendering on the graphical user interface to obtain at least one virtual resource object.
The first detection unit 52 is configured to: when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, perform rendering on the graphical user interface to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object.
The second detection unit 53 is configured to: when detecting a drag operation on the virtual joystick object, control the skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
The skill-release unit 54 is configured to: determine whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, based on a detected release operation of the drag operation and from at least one character object within a skill releasable range of the skill object, select a target character object having the highest priority and satisfying a first preset policy, and perform a skill-release operation on the target character object. The skill-release operation is performed on the target character object within the skill releasable range according to a release location and/or direction of the skill object, which are obtained as the virtual joystick object moves along with the dragging of the skill-release operation gesture.
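The threshold determination performed by the skill-release unit can be illustrated as a simple distance comparison. This is a sketch under the assumption that the threshold range is a circle around the halo (wheel) center; the names are illustrative:

```python
import math

def joystick_out_of_threshold(joystick_pos, halo_center, threshold_radius):
    """Sketch: the virtual joystick object is 'out of the threshold range'
    when its distance from the skill-release control halo center exceeds
    the threshold radius (assumed circular)."""
    return math.dist(joystick_pos, halo_center) > threshold_radius
```

When this check returns False, the automatic target selection within the skill releasable range proceeds; when it returns True, the supplementary (fan-shaped) skill indicator object is rendered instead.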
In an application of the embodiments of the present invention, the virtual resource object includes various types of objects on the graphical user interface. For example, a user avatar icon for representing a user, an object for representing a building, a tree, tower defense, or the like in a background, an object for representing a status (such as a blood value or a vitality value) of the user, an object for representing a skill, equipment, or the like of the user, a direction button object for controlling a change of a location of the user, a rendering object used during skill-release by the user, and the like, shall all fall within the protection scope of the “virtual resource object” of the embodiments of the present invention.
Specifically, two methods may be used for rendering on the graphical user interface. In a first method, rendering may be performed at a preset location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a default fixed location. Because the skill-release supplementary control object can appear at the preset location, that is, the default fixed location, it facilitates the user to respond rapidly in an information exchange process, thereby avoiding wasting the user's response time for searching the graphical user interface. The fixed location may be fixed with respect to the graphical user interface, with respect to one or more objects on the graphical user interface, or with respect to other types of references.
In a second method, a first location may be obtained with a touch or a slide of a finger, and rendering is performed at the first location on the graphical user interface to obtain the skill-release supplementary control object, so as to display the skill-release supplementary control object at a changeable location with a touch or a slide of the finger. Because the skill-release supplementary control object can appear at a changeable location with a touch or a slide of the finger, it can well meet the requirement of a user with a habit of rapidly performing skill-release by a slide of the finger, which is different from the requirement of a user with a habit of performing skill-release at a fixed location and aiming in a stable control area. It also facilitates the user to respond rapidly in an information exchange process, thereby avoiding wasting the user's response time for searching the graphical user interface.
Using the first method as an example, as shown in FIG. 3, in a skill operation area 40 of a graphical user interface, a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes a skill-release control halo object 41 and a virtual joystick object 42. A skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers the area in which the skill-release control halo object 41 is located.
Specifically, as shown in FIG. 3, the skill-release control halo object 41 may have a shape of a wheel and may be referred to as a wheel; the virtual joystick object 42 may have an annular shape or a ring shape and may be referred to as a joystick; and the skill indicator object 43 may have a shape of a wheel, and may be referred to as a skill indicator. It should be noted that, the skill-release control halo object and the virtual joystick object used in this specification are only examples of a skill-release controller object and a joystick object. A skill-release controller object and a joystick object that are obtained by performing rendering in a graphical user interface include, but are not limited to, a shape of a wheel, an annular shape, a ring shape, and other shapes, as long as the objects can be configured to implement skill control.
FIG. 4 is a schematic diagram of a system configuration interface. The UI effect interface shown in FIG. 3 may be obtained by the configuration interface shown in FIG. 4. As shown in FIG. 4, when a user selects an option to set the wheel appearance location to the fixed location, matching this setting of the user, the UI shown in FIG. 3 is subsequently obtained by performing rendering. In the UI, the center of the joystick can shift from the center of the wheel, and the skill-release control operation is triggered, so that the location of the wheel remains unchanged, and the skill releasable range specified by the skill indicator completely covers the area in which the wheel is located. The skill indicator is an indicator rendered to assist the user in aiming or to achieve other purposes. The skill releasable range specified by the skill indicator also completely covers the skill-release control halo object.
FIG. 5 shows an example of the threshold range. As shown in FIG. 5, the virtual joystick object 42 partially overlaps a threshold range 44, and is not out of the threshold range. In such a case, a user finger release gesture is obtained to perform a subsequent action, and a character object satisfying the first preset policy is selected, from the at least one character object within the skill releasable range, as the target character object having the highest priority.
FIG. 8 shows an example of the threshold range. As shown in FIG. 8, the virtual joystick object 42 is located at an edge of the skill-release control halo object 41, and is out of a threshold range 44. Further, the supplementary skill indicator object includes a fan-shaped indicator object (other shapes, such as an annular shape or a square shape, can also be used). The fan-shaped indicator object is different from the skill indicator object 43, and has a smaller attack range, so as to specify a target character object having the highest priority within a skill releasable range specified by the fan-shaped indicator object (also referred to as a searching range or an enemy-searching range).
For example, as shown in FIG. 9, in a skill operation area of a graphical user interface, a skill-release operation gesture applied to a skill object 1 is obtained, and rendering is performed to obtain a skill-release supplementary control object. The skill-release supplementary control object includes the skill-release control halo object 41 and the virtual joystick object 42. A skill-release control operation is subsequently triggered, so that the location of the skill-release control halo object 41 remains unchanged, and a skill releasable range specified by a skill indicator object 43 completely covers an area in which the skill-release control halo object 41 is located.
Compared with FIG. 9, in FIG. 10, after the virtual joystick object 42 moves relative to the skill-release control halo object 41, the first target selection direction line 451 needs to be refreshed to the current second target selection direction line 452. In such a case, the fan-shaped indicator object 45 has the second target selection direction line 452 obtained through refreshing. Centered on the second target selection direction line 452, the fan-shaped indicator object 45 is formed based on positive and negative offsets of the preset angle a.
In an implementation of the embodiment of the present invention, the terminal further includes a skill-release cancellation unit, which is configured to: when a determining result is that the virtual joystick object touched by a user finger is not out of the threshold range, obtain a user finger release gesture and, if the target character object is not found within the skill releasable range, discard the skill-release operation on the target character object within the skill releasable range, and display a first prompt message, where the first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
In an implementation of this embodiment of the present invention, the terminal further includes a second rendering unit and a skill-release unit.
The second rendering unit is configured to, when it is determined that the virtual joystick object touched by a user finger is out of the threshold range, perform rendering on the graphical user interface to obtain a supplementary skill indicator object, where a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object.
The skill-release unit is configured to: obtain a user finger release gesture, and select, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object having the highest priority; and perform the skill-release operation on the target character object within the searching range.
In an implementation of this embodiment of the present invention, the second rendering unit is further configured to: obtain a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, generate a first target selection direction line in the first relative direction by using a connection line to a center point of the skill indicator object; and, centered on the first target selection direction line, form the supplementary skill indicator object based on positive and negative offsets of a preset angle.
In an implementation of this embodiment of the present invention, the skill-release unit is further configured to, when the searching range includes multiple first character objects, select a first character object with the shortest vertical distance to the first target selection direction line as the target character object having the highest priority, and to highlight the target character object.
In an implementation of this embodiment of the present invention, the terminal further includes a refreshing unit, which is configured to: each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, refresh the first relative direction to a second relative direction and generate, in the second relative direction, a second target selection direction line by using a connection line to the center point of the skill indicator object; and select a first character object with the shortest vertical distance to the second target selection direction line as the target character object having the highest priority, and highlight the target character object, to re-determine and refresh the target character object each time the virtual joystick object moves, and highlight a new target character object obtained through refreshing.
In an implementation of this embodiment of the present invention, the terminal further includes a skill-release unit, which is configured to: when it is determined that the virtual joystick object touched by a user finger is out of the threshold range, perform rendering on the graphical user interface to obtain a supplementary skill indicator object, where a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object; and obtain a user finger release gesture and, if the first character object used as the target character object is not found within the searching range, discard the skill-release operation on the target character object within the searching range, and display a second prompt message, where the second prompt message is used to represent that there is no selectable target character object within the searching range.
In an embodiment of the present invention, a terminal is provided. As shown in FIG. 12, the terminal includes: a display 61 and a processor 62. A software application is executed on the processor 62 of the terminal, and rendering is performed on the display 61 to obtain a graphical user interface. The graphical user interface is configured to facilitate control processing in man-machine interaction. The processor 62 is configured to perform the information processing method in the embodiments of the present invention. The processor, the graphical user interface, and the software application are implemented in a game system.
Further, the terminal further includes: a memory 63, an input device 64 (for example, a peripheral device such as a collection device including a camera, a microphone, and a headset; a mouse, a joystick, or a desktop computer keyboard; or a physical keyboard or a touchscreen on a notebook computer or a tablet computer), an output device 65 (for example, an audio output device or a video output device including a speaker, a headset, and the like), a bus 66, and a networking device 67. The processor 62, the memory 63, the input device 64, the display 61, and the networking device 67 are connected by using the bus 66, and the bus 66 is used for data transmission and communication between the processor 62, the memory 63, the display 61, and the networking device 67.
The input device 64 is mainly configured to obtain an input operation of a user, and the input device 64 may vary with the terminal. For example, when the terminal is a PC, the input device 64 may be an input device such as a mouse or a keyboard; when the terminal is a portable device such as a smartphone or a tablet computer, the input device 64 may be a touchscreen. The networking device 67 is used by multiple terminals and a server to connect and to upload and download data by using a network, and is used by multiple terminals to connect and perform data transmission by using a network.
The server may be formed by a cluster system, and to implement functions of various units, the functions may be combined or functions of the units are separately provided in an electronic device. Either the terminal or the server at least includes a database for storing data and a processor for data processing, or includes a storage medium disposed in the server or a storage medium that is disposed separately. For the processor for data processing, during processing, a microprocessor, a central processing unit (CPU), a digital signal processor (DSP), or a field programmable gate array (FPGA) may be used for implementation. The storage medium includes an operation instruction, the operation instruction may be computer executable code, and steps in the procedure of the information processing method in the embodiments of the present invention are implemented by using the operation instruction.
In an embodiment of the present invention, a computer storage medium is provided. A computer executable instruction is stored in the computer storage medium, and the computer executable instruction is configured to perform the information processing method in the embodiments of the present invention.
For example, an application scenario of a game system may be implemented according to the embodiments of the present invention.
This application scenario is related to Multiplayer Online Battle Arena Games (MOBA). In MOBA, related terms are as follows: 1) the UI layer, that is, icons in the graphical user interface; 2) a skill indicator: used to supplement a skill-release special effect, halo, or operation; 3) lens, which may be understood as a camera in the game; 4) mini map: a scaled-down version of a large map, which may be understood as a radar map, where information and locations of two parties are displayed in the map; 5) wheel: a halo displayed above a skill key when the skill key is pressed; and 6) virtual joystick: a control for an operation and locating on the wheel.
As shown in FIG. 13, a flow chart of the application scenario includes the following, where the first circle is an indicator, the second circle is a secondary attack range, and the third circle is an enemy-searching range.
1. When a player presses a skill key, the system invokes a supplementary spellcasting wheel, and detects whether the virtual joystick touched by a finger of the player is out of a movement threshold set by the system. When the joystick is not out of the threshold range (as shown in FIG. 5), the player releases the finger, and the system searches, within the skill releasable range according to a determined priority, for the target currently having the highest priority, and performs skill-release on the target (also referred to as an automatic enemy-searching mechanism). If there is no target within the range, the system discards the current spellcasting, and prompts the player that “there is no selectable target within the range”. After the joystick is out of the threshold range (as shown in FIG. 8), the following in step 2 is performed.
2. When the joystick is out of the threshold range (as shown in FIG. 8), a fan-shaped indicator is invoked in the scene and projected to the scene according to the location of the joystick relative to the center of the wheel. A connection line to the center point is drawn in that relative direction as a target selection direction line, and a fan shape of 90 degrees is formed based on positive and negative 45-degree offsets centered on the direction line, as shown in FIG. 9. Within the 90-degree fan shape, a hero is selected as a skill target and is highlighted. If there are multiple heroes within the range, the hero with the shortest vertical distance to the direction line is selected as the skill target and highlighted.
3. Target determination is refreshed each time the virtual joystick moves, as shown in FIG. 10, and a new target is highlighted immediately. In this case, when the finger is released, the skill-release operation is performed on the highlighted target. If there is no target within the range, the player is prompted that “there is no selectable target within the range”, and skill-release is canceled.
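The automatic enemy-searching mechanism in step 1 can be sketched as a range filter followed by a priority pick. The priority encoding (lower number wins) and the tie-breaking by distance are assumptions for illustration, not specified by the patent:

```python
import math

def auto_search_target(player_pos, candidates, releasable_radius):
    """Sketch of automatic enemy searching: among character objects within
    the skill releasable range, return the one with the highest priority
    (assumed: lowest priority number wins, ties broken by distance).
    Returning None means 'there is no selectable target within the range'."""
    in_range = [c for c in candidates
                if math.dist(player_pos, c["pos"]) <= releasable_radius]
    if not in_range:
        return None  # the system discards spellcasting and prompts the player
    return min(in_range, key=lambda c: (c["priority"],
                                        math.dist(player_pos, c["pos"])))
```

On the finger-release gesture the caller either performs skill-release on the returned target or, on None, cancels the cast and shows the "no selectable target" prompt.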
FIG. 14 is a schematic flowchart of specific interaction in an information processing method in this application scenario. As shown in FIG. 14, in this application scenario, a terminal 1, a terminal 2, and a server are included. The user 1 performs triggering and control by using the terminal 1, and the user 2 performs triggering and control by using the terminal 2; and the method includes the followings.
For the user 1, step 11 to step 16 are included.
Step 11: The user 1 triggers the game system by using the terminal 1, and registers identity authentication information, where the identity authentication information may be a user name and a password.
Step 12: The terminal 1 transmits the obtained identity authentication information to the server 3, and the server 3 performs identity authentication, and returns a first graphical user interface to the terminal 1 after the identity authentication succeeds, where the first graphical user interface includes a virtual resource object.
Step 13: A specified virtual resource object (such as an SMS message object in FIG. 3) can respond based on a touch operation of the user 1, and performs a series of virtual operations in step 14 to step 17.
Step 14: Terminal 1 performs rendering on the graphical user interface to obtain a skill-release supplementary control object when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, the skill-release supplementary control object including a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object. The skill releasable range specified by the skill indicator object completely covers the area in which the skill-release control halo object is located.
Step 15: Controlling, when detecting a drag operation on the virtual joystick object, a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
Step 16: Determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, according to a detected release operation of the drag operation, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object, and performing a skill-release operation on the target character object.
Step 17: Synchronizing an execution result obtained by performing step 14 to step 16 to the server, instantly transferring the execution result to the terminal 2 by using the server, or directly forwarding the execution result to the terminal 2, so that the user 2 that logs in to the game system by using the terminal 2 can respond to the virtual operation of the user 1, so as to implement interaction between multiple terminals. In this application scenario, only interaction between two terminals is used as an example; during an actual operation, interaction between multiple terminals is not limited to interaction between the two terminals in this example.
For the user 2, step 21 to step 26 are included.
Step 21: The user 2 triggers the game system by using the terminal 2, and registers identity authentication information, where the identity authentication information may be a user name and a password.
Step 22: The terminal 2 transmits the obtained identity authentication information to the server 3, and the server 3 performs identity authentication, and returns a second graphical user interface to the terminal 2 after the identity authentication succeeds, where the second graphical user interface includes a virtual resource object.
Step 23: A specified virtual resource object (such as an SMS message object in FIG. 3) can respond based on a touch operation of the user 2, and performs a series of virtual operations in step 24 to step 27.
Step 24: When detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, terminal 2 performs rendering on the graphical user interface to obtain a skill-release supplementary control object, where the skill-release supplementary control object includes a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object. The skill releasable range specified by the skill indicator object completely covers the area in which the skill-release control halo object is located.
Step 25: When detecting a drag operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface.
Step 26: Determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, according to a detected release operation of the drag operation, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object, and performing a skill-release operation on the target character object.
Step 27: The terminal 2 synchronizes an execution result obtained by performing step 24 to step 26 to the server, or instantly transfers the execution result to the terminal 1 by using the server, or directly forwards the execution result to the terminal 1, so that the user 1 who logs in to the game system by using the terminal 1 can respond to the virtual operation of the user 2, thereby implementing interaction between multiple terminals. In this application scenario, interaction between only two terminals is used as an example; during an actual operation, interaction between multiple terminals is not limited to the two terminals in this example.
Step 30: Optionally, after receiving a first man-machine interaction execution result obtained in step 14 to step 17 and/or a second interaction execution result obtained in step 24 to step 27, the server synchronizes or transfers the first man-machine interaction execution result and/or the second interaction execution result to the corresponding terminals.
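The in-threshold branch of step 26 can be illustrated with a short sketch. The following Python is a hypothetical model only: names such as `threshold_radius`, and the nearest-target rule standing in for the "first preset policy", are assumptions for illustration, not details prescribed by the embodiment.

```python
import math

def is_out_of_threshold(joystick_pos, halo_center, threshold_radius):
    """True when the virtual joystick object has been dragged beyond the threshold range."""
    dx = joystick_pos[0] - halo_center[0]
    dy = joystick_pos[1] - halo_center[1]
    return math.hypot(dx, dy) > threshold_radius

def select_target(player_pos, characters, releasable_range):
    """Illustrative 'first preset policy': the nearest character object
    within the skill releasable range; None triggers the first prompt message."""
    in_range = [c for c in characters
                if math.dist(player_pos, c["pos"]) <= releasable_range]
    if not in_range:
        return None  # no selectable target character object within the range
    return min(in_range, key=lambda c: math.dist(player_pos, c["pos"]))
```

On release of the drag operation, the terminal would first call `is_out_of_threshold`; only when it returns `False` does this in-threshold selection apply.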
In the embodiments provided in this application, it should be understood that the disclosed devices and methods may be implemented in other manners. The described device embodiments are merely examples. For example, the unit division is merely logical function division and may be other division during actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings, direct couplings, or communication connections between constituent parts may be implemented through some interfaces. The indirect couplings or communication connections between the devices or units may be implemented in electronic, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one location, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, or each of the units may exist separately, or two or more units may be integrated into one unit. The integrated unit may be implemented in a form of hardware, or in a form of hardware combined with a software functional unit.
A person of ordinary skill in the art may understand that all or some of the steps of the method embodiments may be implemented by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium. When the program runs, the steps of the method embodiments are performed. The foregoing storage medium includes: any medium that can store program code, such as a portable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Alternatively, when the integrated unit is implemented in a form of a software functional module and sold or used as an independent product, the integrated unit may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present invention essentially, or the part contributing to the existing technology may be implemented in a form of a software product. The computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes: any medium that can store program code, such as a portable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific implementations of the present disclosure, but are not intended to limit the protection scope of the present disclosure. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present disclosure shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
INDUSTRIAL APPLICABILITY
According to the embodiments of the present invention,
a target character object is selected and located by determining whether a virtual joystick object touched by a user's finger is out of a threshold range. When it is determined that the virtual joystick object is not out of the threshold range, a finger release gesture of the user is obtained, and a character object satisfying a first preset policy is selected, from at least one character object within a skill releasable range, as the target character object having the highest priority. A skill-release operation is then performed on the target character object. With this way of releasing a specific skill, the target object of the skill release can be located accurately and rapidly; the improved locating accuracy avoids mis-operations and increases interaction processing speed.
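The dispatch described above reduces to a single decision on finger release: the branch taken depends only on whether the joystick stayed within the threshold range. A minimal sketch, assuming hypothetical `select_first_policy` and `select_second_policy` callbacks standing in for the two selection policies:

```python
import math

def on_touch_release(joystick_pos, halo_center, threshold_radius,
                     select_first_policy, select_second_policy):
    """On release of the drag gesture, choose the selection policy from
    whether the virtual joystick object is out of the threshold range."""
    dx = joystick_pos[0] - halo_center[0]
    dy = joystick_pos[1] - halo_center[1]
    out_of_range = math.hypot(dx, dy) > threshold_radius
    # in range: first preset policy; out of range: second preset policy
    return select_second_policy() if out_of_range else select_first_policy()
```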

Claims (20)

What is claimed is:
1. An information processing method implemented by a computer system, comprising:
performing rendering on a graphical user interface to obtain at least one virtual resource object;
when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object;
when detecting a dragging operation on the virtual joystick object touched by a user, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface;
determining whether the virtual joystick object touched is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object;
when the virtual joystick object touched is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object, wherein a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object;
after obtaining a touch release gesture, selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object; and
performing the skill-release operation on the target character object within the searching range.
2. The method according to claim 1, further comprising:
when the virtual joystick object touched is not out of the threshold range and when no target character object is found within the skill releasable range, displaying a first prompt message, wherein the first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
3. The method according to claim 2, further comprising:
when it is determined that the virtual joystick object touched is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object, wherein the supplementary skill indicator object specifies a skill releasable range as a searching range of a first character object; and
after obtaining a touch release gesture, when no first character object is found within the searching range, discarding the skill-release operation on the target character object within the searching range, and displaying a second prompt message.
4. The method according to claim 1, further comprising:
in a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, generating a first target selection direction line by using a connection line to a center point of the skill indicator object.
5. The method according to claim 4, further comprising:
centering around the first target selection direction line, forming the supplementary skill indicator object based on a positive offset and a negative offset on a preset angle.
6. The method according to claim 5, wherein the selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object comprises:
when there are multiple first character objects within the searching range, selecting a first character object with a shortest vertical distance to the first target selection direction line as the target character object, and highlighting the target character object.
7. The method according to claim 6, further comprising:
each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, changing the first relative direction to a second relative direction;
in the second relative direction, generating a second target selection direction line by using a connection line to the center point of the skill indicator object; and
selecting a second character object with a shortest vertical distance to the second target selection direction line as the target character object, and highlighting the target character object.
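The selection rule of claims 4 to 7 — form a target selection direction line from the joystick's direction relative to the halo center, restrict the search to a fan offset by a preset angle on either side of that line, and pick the character object with the shortest vertical (perpendicular) distance to the line — can be sketched as follows. This is an illustrative model only; the 2-D coordinate handling and the `half_angle_deg` parameter are assumptions, not claim language.

```python
import math

def direction_line(halo_center, joystick_pos):
    """Unit vector of the first relative direction (joystick relative to halo center)."""
    dx = joystick_pos[0] - halo_center[0]
    dy = joystick_pos[1] - halo_center[1]
    norm = math.hypot(dx, dy) or 1.0
    return (dx / norm, dy / norm)

def perpendicular_distance(origin, unit_dir, point):
    """Shortest (vertical) distance from point to the line through origin along unit_dir."""
    px = point[0] - origin[0]
    py = point[1] - origin[1]
    # the magnitude of the 2-D cross product is the perpendicular distance
    return abs(px * unit_dir[1] - py * unit_dir[0])

def pick_target(origin, unit_dir, candidates, half_angle_deg):
    """Among candidates inside the +/- half-angle fan around the direction line,
    pick the one with the shortest perpendicular distance to the line."""
    def in_fan(c):
        vx, vy = c["pos"][0] - origin[0], c["pos"][1] - origin[1]
        ang = math.degrees(math.atan2(vy, vx) - math.atan2(unit_dir[1], unit_dir[0]))
        ang = (ang + 180) % 360 - 180  # normalize to [-180, 180)
        return abs(ang) <= half_angle_deg
    fan = [c for c in candidates if in_fan(c)]
    if not fan:
        return None
    return min(fan, key=lambda c: perpendicular_distance(origin, unit_dir, c["pos"]))
```

Each time the joystick moves relative to the halo center (claim 7), recomputing `direction_line` and rerunning `pick_target` yields the updated highlighted target.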
8. A terminal, comprising:
a display;
a memory storing instructions; and
a processor coupled to the memory and the display and, when executing the instructions, configured for:
performing rendering on a graphical user interface of the display to obtain at least one virtual resource object;
when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object;
when detecting a dragging operation on the virtual joystick object touched by a user, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface;
determining whether the virtual joystick object touched is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object;
when it is determined that the virtual joystick object touched is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object, wherein a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object;
after obtaining a touch release gesture, selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object; and
performing the skill-release operation on the target character object within the searching range.
9. The terminal according to claim 8, wherein the processor is further configured for:
when it is determined that the virtual joystick object touched is not out of the threshold range and when no target character object is found within the skill releasable range, displaying a first prompt message on the display, wherein the first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
10. The terminal according to claim 9, wherein the processor is further configured for:
when it is determined that the virtual joystick object touched is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object; and
after obtaining a touch release gesture, when no first character object is found within the searching range, discarding the skill-release operation on the target character object within the searching range, and displaying a second prompt message.
11. The terminal according to claim 8, wherein the processor is further configured for:
in a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, generating a first target selection direction line by using a connection line to a center point of the skill indicator object.
12. The terminal according to claim 11, wherein the processor is further configured for:
centering around the first target selection direction line, forming the supplementary skill indicator object based on a positive offset and a negative offset on a preset angle.
13. The terminal according to claim 12, wherein the selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object comprises:
when there are multiple first character objects within the searching range, selecting a first character object with a shortest vertical distance to the first target selection direction line as the target character object, and highlighting the target character object.
14. The terminal according to claim 13, wherein the processor is further configured for:
each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, changing the first relative direction to a second relative direction;
in the second relative direction, generating a second target selection direction line by using a connection line to the center point of the skill indicator object; and
selecting a second character object with a shortest vertical distance to the second target selection direction line as the target character object, and highlighting the target character object.
15. A non-transitory computer-readable storage medium containing computer-executable instructions for, when executed by a processor, performing an information processing method, the method comprising:
performing rendering on a graphical user interface to obtain at least one virtual resource object;
when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area on the graphical user interface, performing rendering on the graphical user interface to obtain a skill-release supplementary control object, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object;
when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted on the graphical user interface;
determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object;
when it is determined that the virtual joystick object touched is out of the threshold range, performing rendering on the graphical user interface to obtain a supplementary skill indicator object, wherein a skill releasable range specified by the supplementary skill indicator object is a searching range of a first character object;
after obtaining a touch release gesture, selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object; and
performing the skill-release operation on the target character object within the searching range.
16. The non-transitory computer-readable storage medium according to claim 15, the method further comprising:
when it is determined that the virtual joystick object touched is not out of the threshold range and when no target character object is found within the skill releasable range, displaying a first prompt message, wherein the first prompt message is used to represent that there is no selectable target character object within the skill releasable range.
17. The non-transitory computer-readable storage medium according to claim 15, the method further comprising:
in a first relative direction formed by the virtual joystick object relative to a center of the skill-release control halo object, generating a first target selection direction line by using a connection line to a center point of the skill indicator object.
18. The non-transitory computer-readable storage medium according to claim 17, the method further comprising:
centering around the first target selection direction line, forming the supplementary skill indicator object based on a positive offset and a negative offset on a preset angle.
19. The non-transitory computer-readable storage medium according to claim 15, wherein the selecting, from at least one first character object within the searching range, a first character object satisfying a second preset policy as the target character object comprises:
when there are multiple first character objects within the searching range, selecting a first character object with a shortest vertical distance to the first target selection direction line as the target character object, and highlighting the target character object.
20. The non-transitory computer-readable storage medium according to claim 15, the method further comprising:
each time it is detected that a location of the virtual joystick object changes relative to the center of the skill-release control halo object, changing the first relative direction to a second relative direction;
in the second relative direction, generating a second target selection direction line by using a connection line to the center point of the skill indicator object; and
selecting a second character object with a shortest vertical distance to the second target selection direction line as the target character object, and highlighting the target character object.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/560,772 US11003261B2 (en) 2015-10-10 2019-09-04 Information processing method, terminal, and computer storage medium

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201510654167.4 2015-10-10
CN201510654167.4A CN105194873B (en) 2015-10-10 2015-10-10 A kind of information processing method, terminal and computer storage medium
PCT/CN2016/083208 WO2017059684A1 (en) 2015-10-10 2016-05-24 Information processing method, terminal, and computer storage medium
US15/720,432 US10444871B2 (en) 2015-10-10 2017-09-29 Information processing method, terminal, and computer storage medium
US16/560,772 US11003261B2 (en) 2015-10-10 2019-09-04 Information processing method, terminal, and computer storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/720,432 Continuation US10444871B2 (en) 2015-10-10 2017-09-29 Information processing method, terminal, and computer storage medium

Publications (2)

Publication Number Publication Date
US20190391676A1 US20190391676A1 (en) 2019-12-26
US11003261B2 true US11003261B2 (en) 2021-05-11

Family

ID=54943034

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/720,432 Active 2036-08-14 US10444871B2 (en) 2015-10-10 2017-09-29 Information processing method, terminal, and computer storage medium
US16/560,772 Active US11003261B2 (en) 2015-10-10 2019-09-04 Information processing method, terminal, and computer storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/720,432 Active 2036-08-14 US10444871B2 (en) 2015-10-10 2017-09-29 Information processing method, terminal, and computer storage medium

Country Status (9)

Country Link
US (2) US10444871B2 (en)
EP (1) EP3264248B1 (en)
JP (1) JP6620169B2 (en)
KR (1) KR102041170B1 (en)
CN (2) CN105194873B (en)
AU (1) AU2016336603B2 (en)
CA (1) CA2981553C (en)
MY (1) MY187968A (en)
WO (1) WO2017059684A1 (en)

Families Citing this family (90)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105194873B (en) 2015-10-10 2019-01-04 腾讯科技(成都)有限公司 A kind of information processing method, terminal and computer storage medium
CN105335065A (en) * 2015-10-10 2016-02-17 腾讯科技(深圳)有限公司 Information processing method and terminal, and computer storage medium
CN106959812A (en) * 2016-01-11 2017-07-18 北京英雄互娱科技股份有限公司 Method and apparatus for man-machine interaction
US10203860B2 (en) * 2016-03-18 2019-02-12 Ebay Inc. Graphical user interface element adjustment
CN106237615A (en) * 2016-07-22 2016-12-21 广州云火信息科技有限公司 Many unity elements skill operation mode
CN106492457A (en) * 2016-10-20 2017-03-15 北京乐动卓越科技有限公司 A kind of implementation method of full 3D actions mobile phone games fight interactive system and device
CN106422329A (en) * 2016-11-01 2017-02-22 网易(杭州)网络有限公司 Game control method and device
JP6143934B1 (en) * 2016-11-10 2017-06-07 株式会社Cygames Information processing program, information processing method, and information processing apparatus
CN106354418B (en) * 2016-11-16 2019-07-09 腾讯科技(深圳)有限公司 A kind of control method and device based on touch screen
CN106843722B (en) * 2016-12-26 2019-12-31 上海莉莉丝网络科技有限公司 Touch control method and touch control device for touch terminal
CN106657127B (en) * 2017-01-05 2020-09-08 腾讯科技(深圳)有限公司 Information processing method and device and server
KR102319206B1 (en) 2017-01-05 2021-10-28 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Information processing method and device and server
CN107132979A (en) * 2017-03-14 2017-09-05 网易(杭州)网络有限公司 Exchange method, device and the computer-readable recording medium of accurate selection target in mobile device game
CN107050862B (en) * 2017-05-19 2018-06-15 网易(杭州)网络有限公司 Display control method and system, the storage medium of scene of game
US10888781B2 (en) 2017-05-19 2021-01-12 Netease (Hangzhou) Network Co., Ltd. Game scene display control method and system and storage medium
CN107174824B (en) * 2017-05-23 2021-01-15 网易(杭州)网络有限公司 Special effect information processing method and device, electronic equipment and storage medium
CN107193479B (en) 2017-05-26 2018-07-10 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
US10413814B2 (en) * 2017-06-09 2019-09-17 Supercell Oy Apparatus and method for controlling user interface of computing apparatus
CN107463319B (en) * 2017-06-09 2022-07-12 网易(杭州)网络有限公司 Information processing method and device, storage medium and electronic equipment
CN107168611B (en) * 2017-06-16 2018-12-28 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107301012B (en) * 2017-06-19 2020-11-06 上海逗屋网络科技有限公司 Method and equipment for displaying description information of operation instruction in application
CN107450812A (en) * 2017-06-26 2017-12-08 网易(杭州)网络有限公司 Virtual object control method and device, storage medium, electronic equipment
CN107376339B (en) * 2017-07-18 2018-12-28 网易(杭州)网络有限公司 The exchange method and device of lock onto target in gaming
CN107362535B (en) * 2017-07-19 2019-04-26 腾讯科技(深圳)有限公司 Target object locking means, device and electronic equipment in scene of game
CN107398071B (en) * 2017-07-19 2021-01-26 网易(杭州)网络有限公司 Game target selection method and device
CN107551537B (en) * 2017-08-04 2020-12-01 网易(杭州)网络有限公司 Method and device for controlling virtual character in game, storage medium and electronic equipment
CN107583271B (en) * 2017-08-22 2020-05-22 网易(杭州)网络有限公司 Interactive method and device for selecting target in game
CN107694089B (en) * 2017-09-01 2019-02-12 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107596688B (en) * 2017-09-25 2020-11-10 网易(杭州)网络有限公司 Skill release control method and device, storage medium, processor and terminal
CN107754305A (en) 2017-10-13 2018-03-06 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107930105A (en) * 2017-10-23 2018-04-20 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN112698781B (en) * 2017-11-03 2022-06-07 腾讯科技(深圳)有限公司 Target positioning method, device, medium and electronic equipment in virtual environment
CN107899246B (en) * 2017-11-10 2020-11-20 网易(杭州)网络有限公司 Information processing method, information processing device, electronic equipment and storage medium
CN107837529B (en) * 2017-11-15 2019-08-27 腾讯科技(上海)有限公司 A kind of object selection method, device, terminal and storage medium
CN107899242A (en) * 2017-11-17 2018-04-13 杭州电魂网络科技股份有限公司 Technical ability hits determination methods and device
CN108196765A (en) * 2017-12-13 2018-06-22 网易(杭州)网络有限公司 Display control method, electronic equipment and storage medium
CN108144293B (en) * 2017-12-15 2022-05-13 网易(杭州)网络有限公司 Information processing method, information processing device, electronic equipment and storage medium
CN108245888A (en) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 Virtual object control method, device and computer equipment
CN108543308B (en) * 2018-02-27 2020-08-04 腾讯科技(深圳)有限公司 Method and device for selecting virtual object in virtual scene
CN108553892B (en) * 2018-04-28 2021-09-24 网易(杭州)网络有限公司 Virtual object control method and device, storage medium and electronic equipment
CN108635854A (en) * 2018-05-08 2018-10-12 网易(杭州)网络有限公司 Technical ability releasing control method and device, storage medium, electronic equipment
CN109550239B (en) * 2018-09-20 2023-02-10 厦门吉比特网络技术股份有限公司 Method and device for controlling sighting device of game
CN109513209B (en) * 2018-11-22 2020-04-17 网易(杭州)网络有限公司 Virtual object processing method and device, electronic device and storage medium
CN109568954B (en) * 2018-11-30 2020-08-28 广州要玩娱乐网络技术股份有限公司 Weapon type switching display method and device, storage medium and terminal
CN109758764B (en) * 2018-12-11 2023-02-03 网易(杭州)网络有限公司 Game skill control method and device, electronic equipment and storage medium
CN109568957B (en) * 2019-01-10 2020-02-07 网易(杭州)网络有限公司 In-game display control method, device, storage medium, processor and terminal
CN109800047B (en) * 2019-01-18 2022-03-18 网易(杭州)网络有限公司 Game skill switching method and device, storage medium and electronic equipment
US10786734B2 (en) * 2019-02-20 2020-09-29 Supercell Oy Method for facilitating user interactions in gaming environment
CN109771941B (en) * 2019-03-13 2022-08-05 网易(杭州)网络有限公司 Method, device, equipment and medium for selecting virtual object in game
CN114681911A (en) * 2019-04-03 2022-07-01 网易(杭州)网络有限公司 Information processing method and device in game, mobile terminal and readable storage medium
CN110064193A (en) * 2019-04-29 2019-07-30 网易(杭州)网络有限公司 Manipulation control method, device and the mobile terminal of virtual objects in game
KR102106273B1 (en) * 2019-05-02 2020-05-04 엔엑스엔 주식회사 The method for manipulating characters in games
CN110119481A (en) * 2019-05-10 2019-08-13 福建工程学院 Plant test tube search method and system
CN110124323B (en) * 2019-05-17 2022-09-27 罗圣博 Method and equipment for executing football skill in football application
CN117531196A (en) * 2019-07-19 2024-02-09 腾讯科技(深圳)有限公司 Reminding information sending method, device and terminal in multi-user online fight program
CN112306592B (en) * 2019-08-01 2023-07-18 腾讯科技(深圳)有限公司 Message processing method and device, storage medium and electronic device
CN110523085A (en) * 2019-08-30 2019-12-03 腾讯科技(深圳)有限公司 Control method, device, terminal and the storage medium of virtual objects
CN110559658B (en) 2019-09-04 2020-07-28 腾讯科技(深圳)有限公司 Information interaction method, device, terminal and storage medium
CN111135561B (en) * 2019-12-26 2023-08-25 北京像素软件科技股份有限公司 Game character display method, game character display device, storage medium and electronic equipment
CN111324253B (en) * 2020-02-12 2021-08-03 腾讯科技(深圳)有限公司 Virtual article interaction method and device, computer equipment and storage medium
CN111282266B (en) * 2020-02-14 2021-08-03 腾讯科技(深圳)有限公司 Skill aiming method, device, terminal and storage medium in three-dimensional virtual environment
CN111481932B (en) * 2020-04-15 2022-05-17 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and storage medium
CN111494937B (en) * 2020-04-17 2022-04-01 腾讯科技(深圳)有限公司 Virtual object control method, virtual object information synchronization device, virtual object information synchronization equipment and virtual object information synchronization medium
CN111530075B (en) * 2020-04-20 2022-04-05 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying picture of virtual environment
CN111589112B (en) * 2020-04-24 2021-10-22 腾讯科技(深圳)有限公司 Interface display method, device, terminal and storage medium
CN111589129B (en) * 2020-04-24 2023-08-15 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and medium
CN111589133B (en) * 2020-04-28 2022-02-22 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and storage medium
CN111589114B (en) * 2020-05-12 2023-03-10 腾讯科技(深圳)有限公司 Virtual object selection method, device, terminal and storage medium
CN111672101B (en) * 2020-05-29 2023-04-07 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for acquiring virtual prop in virtual scene
CN111672119B (en) * 2020-06-05 2023-03-10 腾讯科技(深圳)有限公司 Method, apparatus, device and medium for aiming virtual object
CN111672118B (en) * 2020-06-05 2022-02-18 腾讯科技(深圳)有限公司 Virtual object aiming method, device, equipment and medium
CN111672115B (en) * 2020-06-05 2022-09-23 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN111672114B (en) * 2020-06-05 2022-03-18 腾讯科技(深圳)有限公司 Target virtual object determination method, device, terminal and storage medium
CN111672113B (en) * 2020-06-05 2022-03-08 腾讯科技(深圳)有限公司 Virtual object selection method, device, equipment and storage medium
CN115779433A (en) * 2020-06-05 2023-03-14 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for controlling virtual object release technology
CN111672103B (en) * 2020-06-05 2021-10-29 腾讯科技(深圳)有限公司 Virtual object control method in virtual scene, computer device and storage medium
CN111803932A (en) * 2020-07-22 2020-10-23 网易(杭州)网络有限公司 Skill release method for virtual character in game, terminal and storage medium
US11731037B2 (en) * 2020-09-11 2023-08-22 Riot Games, Inc. Rapid target selection with priority zones
AU2021307015B2 (en) * 2020-11-13 2023-06-08 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, storage medium, and electronic device
CN112494955B (en) * 2020-12-22 2023-10-03 腾讯科技(深圳)有限公司 Skill releasing method, device, terminal and storage medium for virtual object
CN112698780A (en) * 2020-12-29 2021-04-23 贵阳动视云科技有限公司 Virtual joystick control method and device
CN112870717A (en) * 2021-03-12 2021-06-01 北京格瑞拉科技有限公司 Control method and device of virtual operation object and storage medium
CN113244621A (en) * 2021-05-07 2021-08-13 网易(杭州)网络有限公司 Information processing method and device in game, electronic equipment and storage medium
EP4154956A4 (en) 2021-05-14 2024-01-17 Tencent Tech Shenzhen Co Ltd Method and apparatus for controlling avatar, and device and computer-readable storage medium
CN113440852B (en) * 2021-07-08 2022-07-29 腾讯科技(深圳)有限公司 Control method, device, equipment and storage medium for virtual skill in virtual scene
CN113633995A (en) * 2021-08-10 2021-11-12 网易(杭州)网络有限公司 Interactive control method, device and equipment of game and storage medium
CN113694511A (en) * 2021-08-10 2021-11-26 网易(杭州)网络有限公司 Virtual role control method and device and electronic equipment
CN114115528B (en) * 2021-11-02 2024-01-19 深圳市雷鸟网络传媒有限公司 Virtual object control method, device, computer equipment and storage medium
CN115193064A (en) * 2022-07-12 2022-10-18 网易(杭州)网络有限公司 Virtual object control method and device, storage medium and computer equipment
CN115501600B (en) * 2022-09-28 2023-07-14 广州三七极耀网络科技有限公司 Method, system, device and medium for controlling man-machine roles in game

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
CN101630224A (en) * 2009-08-14 2010-01-20 宇龙计算机通信科技(深圳)有限公司 Method and system for processing control icons on interface and touch terminal
WO2012044713A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Drag/flick gestures in user interface
WO2012114696A1 (en) * 2011-02-24 2012-08-30 パナソニック株式会社 Diffracted sound reduction device, diffracted sound reduction method, and filter coefficient determination method
WO2012135843A1 (en) * 2011-03-31 2012-10-04 Tetris Holding, Llc Systems and methods for manipulation of objects

Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6587131B1 (en) 1999-06-04 2003-07-01 International Business Machines Corporation Method for assisting user to operate pointer
JP2004073682A (en) 2002-08-21 2004-03-11 Namco Ltd Game system, program, and information storage medium
US20070192749A1 (en) 2003-02-03 2007-08-16 Microsoft Corporation Accessing remote screen content
US20070064004A1 (en) 2005-09-21 2007-03-22 Hewlett-Packard Development Company, L.P. Moving a graphic element
US20100325235A1 (en) 2009-06-19 2010-12-23 Nintendo Co., Ltd. Information processing system, information processing apparatus and information processing system control method, capable of providing, regardless of execution/non-execution of an application, data usable by the application to other information processing apparatus
US20110285636A1 (en) 2010-05-20 2011-11-24 Howard John W Touch screen with virtual joystick and methods for use therewith
US20110295709A1 (en) 2010-05-31 2011-12-01 Nintendo Co., Ltd. Computer-readable storage medium, information processing apparatus, information processing system, and information processing method
US9588748B2 (en) * 2010-06-11 2017-03-07 Nintendo Co., Ltd. Information processing terminal, information processing system, computer-readable storage medium having stored thereon information processing program, and information processing method
US20130342460A1 (en) 2012-03-13 2013-12-26 Joshuah Vincent System, method, and graphical user interface for controlling an application on a tablet
US20130241829A1 (en) 2012-03-16 2013-09-19 Samsung Electronics Co., Ltd. User interface method of touch screen terminal and apparatus therefor
US20140066200A1 (en) 2012-08-31 2014-03-06 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Video game processing apparatus and video game processing program product
CN103809888A (en) 2012-11-12 2014-05-21 北京三星通信技术研究有限公司 Mobile terminal and manipulation method thereof
US10318124B2 (en) * 2013-02-19 2019-06-11 Sony Interactive Entertainment Inc. Information processing apparatus and information processing method
US20150182856A1 (en) 2013-12-31 2015-07-02 Microsoft Corporation Touch screen game controller
WO2015151640A1 (en) 2014-04-04 2015-10-08 株式会社コロプラ User interface program and game program
US20150049058A1 (en) 2014-06-25 2015-02-19 Shanghai Douwu Network Technology Co., Ltd Method and Apparatus of Touch control for Multi-Point Touch Terminal
CN104076986A (en) 2014-07-25 2014-10-01 上海逗屋网络科技有限公司 Touch control method and equipment used for multi-touch screen terminal
CN104267904A (en) 2014-09-26 2015-01-07 深圳市睿德网络科技有限公司 Touch screen virtual unit control method and mobile terminal
CN104750419A (en) 2015-04-07 2015-07-01 上海雪宝信息科技有限公司 Method and device for operating objects on touch terminal
CN104915117A (en) 2015-06-16 2015-09-16 深圳市腾讯计算机系统有限公司 Method and device for controlling interaction with virtual target
CN104898953A (en) 2015-06-16 2015-09-09 深圳市腾讯计算机系统有限公司 Touch screen based control method and device
US20170361230A1 (en) 2015-06-16 2017-12-21 Tencent Technology (Shenzhen) Company Limited Method for controlling interaction with virtual target, terminal, and storage medium
EP3312710A1 (en) 2015-06-16 2018-04-25 Tencent Technology Shenzhen Company Limited Operation and control method based on touch screen, and terminal
CN104922906A (en) 2015-07-15 2015-09-23 网易(杭州)网络有限公司 Action executing method and device
US20180028918A1 (en) 2015-09-29 2018-02-01 Tencent Technology (Shenzhen) Company Limited Information processing method, terminal, and computer storage medium
CN105194873A (en) 2015-10-10 2015-12-30 腾讯科技(深圳)有限公司 Information-processing method, terminal and computer storage medium
US10638420B2 (en) * 2017-09-29 2020-04-28 Kyocera Document Solutions Inc. Information processing apparatus selectively executable a normal mode or a sleep mode, and non-transitory computer readable recording medium that records an information processing program executable by an information processing apparatus selectively executable a normal mode or a sleep mode

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Canadian Intellectual Property Office, Application No. 2981553, dated Aug. 29, 2018, 6 pages.
IP Australia, Examination Report No. 2 for Application No. 2016336603, dated Jan. 22, 2019, 7 pages.
The European Patent Office (EPO), Extended European Search Report for Application No. 16852992.3, dated Oct. 25, 2018, 12 pages.
The Japan Patent Office (JPO), Notification of Reasons for Refusal for Application No. 2017-554809, dated Sep. 11, 2018, 7 pages (including translation).
The State Intellectual Property Office of the People's Republic of China (SIPO), Office Action 1 for Application No. 201510654167.4, dated Jan. 22, 2018, 10 pages (including translation).
The World Intellectual Property Organization (WIPO), International Search Report for PCT/CN2016/083208, dated Jul. 28, 2016, 5 pages (including translation).

Also Published As

Publication number Publication date
CN105194873A (en) 2015-12-30
CN108355348B (en) 2021-01-26
CA2981553A1 (en) 2017-04-13
US10444871B2 (en) 2019-10-15
CA2981553C (en) 2019-11-19
EP3264248A4 (en) 2018-12-05
EP3264248B1 (en) 2020-09-23
KR102041170B1 (en) 2019-11-06
CN105194873B (en) 2019-01-04
KR20180005222A (en) 2018-01-15
JP6620169B2 (en) 2019-12-11
CN108355348A (en) 2018-08-03
AU2016336603B2 (en) 2019-06-13
EP3264248A1 (en) 2018-01-03
AU2016336603A1 (en) 2017-11-09
US20190391676A1 (en) 2019-12-26
JP2018517449A (en) 2018-07-05
MY187968A (en) 2021-11-03
US20180024660A1 (en) 2018-01-25
WO2017059684A1 (en) 2017-04-13

Similar Documents

Publication Publication Date Title
US11003261B2 (en) Information processing method, terminal, and computer storage medium
US10768812B2 (en) Method, terminal, and storage medium for operating objects displayed on a graphical user interface
US10434403B2 (en) Information processing method, terminal, and computer storage medium
US10864441B2 (en) Information processing method, terminal, and computer storage medium
EP3273334B1 (en) Information processing method, terminal and computer storage medium
US10398977B2 (en) Information processing method, terminal, and computer storage medium
WO2017054453A1 (en) Information processing method, terminal and computer storage medium

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE