WO2017054450A1 - Information processing method, terminal, and computer storage medium - Google Patents
Information processing method, terminal, and computer storage medium
- Publication number
- WO2017054450A1 WO2017054450A1 PCT/CN2016/081041 CN2016081041W WO2017054450A1 WO 2017054450 A1 WO2017054450 A1 WO 2017054450A1 CN 2016081041 W CN2016081041 W CN 2016081041W WO 2017054450 A1 WO2017054450 A1 WO 2017054450A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- character
- role
- user interface
- graphical user
- character object
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/58—Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
- A63F13/5378—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/65—Methods for processing data by generating or executing the game program for computing the condition of a game character
Definitions
- the present invention relates to information processing technologies, and in particular, to an information processing method, a terminal, and a computer storage medium.
- Embodiments of the present invention provide an information processing method, a terminal, and a computer storage medium that allow a target object to be selected quickly and accurately during information interaction, thereby enhancing the user experience.
- An embodiment of the present invention provides an information processing method. A graphical user interface is obtained by executing a software application on a processor of a terminal and performing rendering on a display of the terminal; the processor, the graphical user interface, and the software application are implemented on a game system. The method includes:
- rendering at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being a first character object configured to perform a first virtual operation according to an input first user command;
- deploying at least one character container object in at least one character selection area of the graphical user interface, the character container object including at least one window slot;
- detecting a second character object in the graphical user interface whose distance from the first character object meets a first preset condition, and rendering, according to a first display parameter, the character operation object associated with the detected second character object in at least one of the window slots;
- when a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, the first character object performing at least one of the first virtual operations on the corresponding second character object.
- An embodiment of the present invention further provides a terminal, where the terminal includes: a rendering processing unit, a deployment unit, a detecting unit, and an operation executing unit;
- the rendering processing unit is configured to execute a software application and perform rendering to obtain a graphical user interface, and to render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being a first character object configured to perform a first virtual operation according to an input first user command; the rendering processing unit is further configured to render, according to a first display parameter, the character operation object associated with the second character object detected by the detecting unit in at least one window slot;
- the deployment unit is configured to deploy at least one character container object in at least one character selection area of the graphical user interface, the character container object including at least one window slot;
- the detecting unit is configured to detect a second character object in the graphical user interface whose distance from the first character object meets a first preset condition;
- the operation executing unit is configured to: when a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, cause the first character object to perform at least one of the first virtual operations on the corresponding second character object.
- An embodiment of the present invention further provides a terminal, where the terminal includes a processor and a display. The processor is configured to execute a software application and perform rendering on the display to obtain a graphical user interface; the processor, the graphical user interface, and the software application are implemented on a game system;
- the processor is further configured to render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being a first character object configured to perform a first virtual operation according to an input first user command;
- to deploy at least one character container object in at least one character selection area of the graphical user interface, the character container object including at least one window slot;
- to detect a second character object whose distance from the first character object meets a first preset condition, and to render the associated character operation object in at least one of the window slots according to a first display parameter;
- and, when a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, to cause the first character object to perform at least one of the first virtual operations on the corresponding second character object.
- An embodiment of the invention further provides a computer storage medium, where the computer storage medium stores computer-executable instructions, and the computer-executable instructions are used to execute the information processing method according to the embodiments of the invention.
- With the information processing method, the terminal, and the computer storage medium of the embodiments of the present invention, the character operation object associated with each second character object that interacts with the first character object is rendered in a corresponding window slot of the character container object deployed in the character selection area of the graphical user interface.
- The character operation object associated with a second character object whose distance from the first character object meets the first preset condition is rendered according to the first display parameter.
- In this way, the UI avatar associated with such a second character object obtains a display effect different from that of the other UI avatars.
- Based on this distinguishing display effect, the user can quickly and accurately select the target character object through a selection operation gesture on the character operation object, thereby greatly improving the user's operation experience during interaction.
- FIG. 1 is a schematic diagram of an application architecture of an information processing method according to an embodiment of the present invention.
- FIG. 2 is a schematic flowchart of an information processing method according to Embodiment 1 of the present invention.
- FIG. 3 is a first schematic diagram of a graphical user interface of an information processing method according to an embodiment of the present invention.
- FIG. 4 is a schematic diagram of the detection principle for a second character object whose distance from the first character object meets the first preset condition in the information processing method according to an embodiment of the present invention.
- FIG. 5 is a second schematic diagram of a graphical user interface of an information processing method according to an embodiment of the present invention.
- FIG. 6 is a schematic flowchart of an information processing method according to Embodiment 2 of the present invention.
- FIG. 7 is a schematic diagram of the detection principle for a second character object whose distance from the first character object meets the second preset condition in the information processing method according to an embodiment of the present invention.
- FIG. 8 is a schematic flowchart of an information processing method according to Embodiment 3 of the present invention.
- FIG. 9 is a third schematic diagram of a graphical user interface of an information processing method according to an embodiment of the present invention.
- FIG. 10 is a schematic diagram of an interaction application of an information processing method according to an embodiment of the present invention.
- FIG. 11 is a fourth schematic diagram of a graphical user interface in an information processing method according to an embodiment of the present invention.
- FIG. 12 is a schematic structural diagram of a terminal of a fourth embodiment of the present invention.
- FIG. 13 is a schematic structural diagram of a terminal of a fifth embodiment of the present invention.
- FIG. 14 is a schematic structural diagram of a terminal of a sixth embodiment of the present invention.
- FIG. 1 is a schematic diagram of an application architecture of an information processing method according to an embodiment of the present invention.
- The application architecture includes a server 101 and at least one terminal, where the terminals include: the terminal 102, the terminal 103, the terminal 104, the terminal 105, and the terminal 106. The at least one terminal can establish a connection with the server 101 through a network 100, such as a wired network or a wireless network.
- The terminals include a mobile phone, a desktop computer, a PC, an all-in-one computer, and the like.
- the processor of the terminal is capable of executing a software application and rendering on a display of the terminal to obtain a graphical user interface, the processor, the graphical user interface and the software application being implemented on the game system .
- the at least one terminal may perform information interaction with the server 101 through a wired network or a wireless network.
- The game system supports a one-to-one or many-to-many (e.g., three-to-three, five-to-five) application mode scenario.
- In the one-to-one application scenario, a virtual resource object in the graphical user interface rendered by a terminal may interact with a virtual resource object preset in the game system (which can be understood as a human-machine battle), that is, information interaction between the terminal and the server; alternatively, a virtual resource object in the graphical user interface rendered by one terminal may interact with a virtual resource object in the graphical user interface rendered by another terminal.
- For example, the virtual resource object in the graphical user interface rendered by the terminal 102 interacts with the virtual resource object in the graphical user interface rendered by the terminal 103.
- Taking a three-to-three application mode scenario as an example of the many-to-many mode, the virtual resource objects in the graphical user interfaces respectively rendered by the terminal 1, the terminal 2, and the terminal 3 form a first group;
- the virtual resource objects in the graphical user interfaces respectively rendered by the other terminals, including the terminal 4, form a second group.
- FIG. 1 is only one example of an application architecture for implementing an embodiment of the present invention.
- The embodiments of the present invention are not limited to the application architecture described in FIG. 1 above; the various embodiments of the present invention are proposed based on this application architecture.
- FIG. 2 is a schematic flowchart of an information processing method according to Embodiment 1 of the present invention.
- The information processing method is applied to a terminal. A graphical user interface is obtained by executing a software application on a processor of the terminal and performing rendering on a display of the terminal; the processor, the graphical user interface, and the software application are implemented on a game system. As shown in FIG. 2, the method includes:
- Step 201: Render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being a first character object configured to perform a first virtual operation according to an input first user command.
- Step 202: Deploy at least one character container object in at least one character selection area of the graphical user interface, the character container object including at least one window slot.
- Step 203: Detect a second character object in the graphical user interface whose distance from the first character object meets a first preset condition, and render, according to a first display parameter, the character operation object associated with the detected second character object in at least one of the window slots.
- Step 204: When a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, the first character object performs at least one of the first virtual operations on the corresponding second character object.
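The four steps above can be sketched as a small update loop. This is a minimal illustration only; all class, field, and function names below are assumptions for the sketch and are not part of the patent.

```python
import math
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Character:
    """A character object with real-time coordinates in the virtual space."""
    name: str
    x: float
    y: float


@dataclass
class WindowSlot:
    """One window slot of the character selection area (step 202)."""
    target: Optional[Character] = None
    highlighted: bool = False  # True when rendered with the first display parameter


def update_selection_area(first: Character, enemies: List[Character],
                          slots: List[WindowSlot], threshold: float) -> None:
    """Steps 202-203: place each enemy's character operation object in a window
    slot and highlight those whose distance from the first character object
    meets the first preset condition (distance < threshold)."""
    for slot, enemy in zip(slots, enemies):
        slot.target = enemy
        slot.highlighted = math.hypot(enemy.x - first.x,
                                      enemy.y - first.y) < threshold


def on_selection_gesture(first: Character, slot: WindowSlot) -> str:
    """Step 204: a selection gesture on a character operation object selects its
    second character object; the first character object then performs the first
    virtual operation on it."""
    if slot.target is None:
        return "no target in this window slot"
    return f"{first.name} performs the first virtual operation on {slot.target.name}"


a10 = Character("a10", 0.0, 0.0)
slots = [WindowSlot(), WindowSlot()]
update_selection_area(a10, [Character("b12", 3.0, 4.0),    # distance 5, in range
                            Character("b13", 60.0, 80.0)], # distance 100, out of range
                      slots, threshold=10.0)
print(slots[0].highlighted, slots[1].highlighted)  # → True False
print(on_selection_gesture(a10, slots[0]))
# → a10 performs the first virtual operation on b12
```

The same loop would be re-run whenever the real-time coordinates of the character objects change, which matches the real-time detection described later in the embodiment.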
- In this embodiment, the graphical user interface includes at least one character selection area, the character selection area includes at least one character container object, and the character container object includes at least one window slot, at least part of the window slots carrying a corresponding character operation object. The character operation object is represented in the graphical user interface by an identifier of the character object associated with it (the identifier may be an avatar).
- Here, the second character object associated with the character operation object belongs to a group different from that of the first character object.
- The rendering manner of the character container object in the character selection area includes, but is not limited to, a strip shape and a ring shape; that is, the character container object can be characterized by a character selection bar object or a character selection disk object.
- FIG. 3 is a first schematic diagram of a graphical user interface of an information processing method according to an embodiment of the present invention.
- At least one virtual resource object is included in the graphical user interface rendered on the display of the terminal.
- The virtual resource object includes at least one first character object a10; the user of the terminal can perform information interaction through the graphical user interface, that is, input a user command. The first character object a10 can perform the first virtual operation based on the first user command detected by the terminal; the first virtual operation includes, but is not limited to, a moving operation, a physical attack operation, a skill attack operation, and the like.
- The first character object a10 is a character object manipulated by the user of the terminal; in the game system, the first character object a10 can perform a corresponding action in the graphical user interface based on the operation of the user.
- The graphical user interface further includes a mini-map 801 of the virtual area where the user character object is located; an enlarged schematic of the mini-map 801 is shown at 801a, in which the location of each character object in the virtual area can be seen, including the friends belonging to the same first group as the first character object a10 and the enemies belonging to the second group.
- The graphical user interface further includes at least one skill object 803; the user can control the user character object to perform a corresponding skill release operation through a skill release operation gesture.
- The graphical user interface has a character selection area 802, and a character container object is deployed in the character selection area 802.
- In this embodiment, the character container object is characterized by a character selection bar object, so the character container object presents a strip-shaped display effect.
- The character container object includes at least one window slot, and the character operation object associated with each second character object interacting with the first character object is rendered in the corresponding window slot.
- Taking the character operation object represented by an avatar as an example, the character selection area 802 includes at least one avatar, and the at least one avatar is in one-to-one correspondence with the at least one second character object interacting with the first character object.
- As shown in FIG. 3, there are five second character objects belonging to a group different from that of the first character object a10, and correspondingly the character selection area 802 includes five character operation objects: the character operation object b11, the character operation object b12, the character operation object b13, the character operation object b14, and the character operation object b15 shown in FIG. 3.
- It can be understood that the five character operation objects in the character selection area 802 are in one-to-one correspondence with the five second character objects belonging to groups different from that of the first character object.
- The position of the first character object changes in real time while the user of the terminal manipulates it, and correspondingly the position of each second character object in the graphical user interface also changes in real time. Therefore, while the first character object performs a virtual operation on a second character object, it is not easy for the user of the terminal to select the character object on which the virtual operation is to be performed. Based on this, in this embodiment, a second character object whose distance from the first character object in the graphical user interface meets the first preset condition is detected.
- Here, detecting a second character object whose distance from the first character object in the graphical user interface meets the first preset condition includes: detecting a second character object in the graphical user interface whose distance from the first character object is less than a first preset threshold.
- Specifically, taking the first character object as the center of a circle, a circular area with the first preset threshold (R) as its radius is determined, and the range of the area included in the circular area is obtained; the range of the area may be represented by a coordinate range.
- An XY coordinate system is established in the virtual space in which the first character object and the second character objects are located, and the coordinate range of the circular area in the XY coordinate system is acquired.
- Further, the real-time coordinates of each second character object in the graphical user interface are detected, and it is determined whether the detected coordinates fall within the coordinate range characterizing the circular area. When the coordinates are determined to fall within that range (for example, the second character object 2, the second character object 3, and the second character object 4 are in the circular area in FIG. 4), the corresponding second character object satisfies the first preset condition.
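The detection described above amounts to a point-in-circle test in the XY coordinate system of the virtual space. The sketch below is an illustrative assumption of how the coordinate range and the test could be computed; the function names are not from the patent.

```python
import math


def circle_coordinate_range(cx: float, cy: float, r: float):
    """Bounding coordinate range of the circular area centred on the first
    character object, with the first preset threshold R as radius."""
    return (cx - r, cx + r), (cy - r, cy + r)


def meets_first_preset_condition(cx: float, cy: float, r: float,
                                 x: float, y: float) -> bool:
    """True when the real-time coordinates (x, y) of a second character object
    fall within the circular area, i.e. its distance from the first character
    object is less than the first preset threshold R."""
    return math.hypot(x - cx, y - cy) < r


# First character object at the origin, R = 10 (cf. FIG. 4):
print(circle_coordinate_range(0.0, 0.0, 10.0))   # → ((-10.0, 10.0), (-10.0, 10.0))
print(meets_first_preset_condition(0.0, 0.0, 10.0, 3.0, 4.0))    # → True
print(meets_first_preset_condition(0.0, 0.0, 10.0, 30.0, 40.0))  # → False
```

Comparing squared distances (avoiding the square root) would give the same result and is a common optimization when this test runs every frame.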
- FIG. 5 is a second schematic diagram of a graphical user interface of an information processing method according to an embodiment of the present invention. As shown in FIG. 5, the character operation object associated with a second character object that satisfies the first preset condition is rendered with the first display parameter (see the character operation object b12 shown in FIG. 5:
- the outer ring edge of the character operation object b12 has a rendering effect different from that of the other character operation objects, giving it a bright display effect).
- Compared with the other character operation objects, a character operation object rendered with the first display parameter (such as the character operation object b12) has a distinct distinguishing feature, so that the terminal user can immediately recognize it; this makes it easy for the terminal user to quickly select such a character operation object in subsequent operations.
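One way to realize the "first display parameter" distinction is to swap the style applied to the avatar's outer ring. The concrete parameter values below (colors, widths, glow) are hypothetical; the patent only requires that the rendering effect differ visibly.

```python
# Hypothetical display parameters; the patent does not specify concrete values.
DEFAULT_DISPLAY_PARAM = {"outline_color": "grey", "outline_width": 1, "glow": False}
FIRST_DISPLAY_PARAM = {"outline_color": "gold", "outline_width": 3, "glow": True}


def render_operation_object(avatar_id: str, satisfies_condition: bool) -> dict:
    """Render the character operation object in its window slot, applying the
    first display parameter when its second character object satisfies the
    first preset condition (e.g. b12 in FIG. 5)."""
    style = FIRST_DISPLAY_PARAM if satisfies_condition else DEFAULT_DISPLAY_PARAM
    return {"avatar": avatar_id, **style}


print(render_operation_object("b12", True)["outline_color"])   # → gold
print(render_operation_object("b11", False)["glow"])           # → False
```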
- For at least one character operation object rendered according to the first display parameter in the character selection bar object of the graphical user interface, when the terminal user triggers an operation on the at least one character operation object, that is, when the terminal detects a selection operation gesture on the at least one character operation object, the second character object associated with that character operation object is selected; the first character object then performs the first virtual operation on the second character object.
- Here, the first virtual operation may be a physical attack operation or a skill release operation.
- If the first virtual operation is a physical attack operation, after the character operation object associated with the second character object is selected, the first character object directly performs a physical attack operation on the second character object.
- If a skill release operation is to be performed, the skill object is first selected by a skill selection operation gesture; after the character operation object associated with the second character object is selected, the first character object performs the skill release operation of the selected skill object on the second character object.
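The choice between a physical attack and a skill release described above can be sketched as a small dispatcher: if a skill object was chosen beforehand via the skill selection gesture, the selection of the operation object releases that skill; otherwise a physical attack is performed. The names and return strings are illustrative assumptions.

```python
from typing import Optional


def first_virtual_operation(first: str, second: str,
                            selected_skill: Optional[str] = None) -> str:
    """Dispatch the first virtual operation on the selected second character
    object: skill release when a skill object was pre-selected, otherwise a
    direct physical attack."""
    if selected_skill is not None:
        return f"{first} releases skill {selected_skill} on {second}"
    return f"{first} performs a physical attack on {second}"


print(first_virtual_operation("a10", "b12"))
# → a10 performs a physical attack on b12
print(first_virtual_operation("a10", "b12", selected_skill="803"))
# → a10 releases skill 803 on b12
```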
- With the technical solution of this embodiment, the character operation object associated with each second character object that interacts with the first character object is rendered in the corresponding window slot of the character container object deployed in the character selection area of the graphical user interface, and the character operation object associated with a second character object whose distance from the first character object meets the first preset condition is rendered according to the first display parameter. Thus, the UI avatar associated with the second character object that satisfies the first preset condition has a display effect different from that of the other UI avatars.
- Based on this distinguishing display effect, the user can quickly and accurately select the target character object through a selection operation gesture on the character operation object, thereby greatly improving the user's operation experience during interaction.
- FIG. 6 is a schematic flowchart of an information processing method according to Embodiment 2 of the present invention.
- The information processing method is applied to a terminal: a graphical user interface is obtained by executing a software application on a processor of the terminal and rendering on a display of the terminal; the processor, the graphical user interface, and the software application are implemented in a game system. As shown in FIG. 6, the method includes:
- Step 301: Render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command.
- Step 302: Deploy, in at least one character selection area of the graphical user interface, at least one character container object including at least one window position.
- Here, the graphical user interface includes at least one character selection area, the character selection area includes at least one character container object, and the character container object includes at least one window position, at least some of the window positions carrying corresponding character operation objects. A character operation object is represented in the graphical user interface by an identifier (for example, an avatar) of the character object associated with it; here, the second character object associated with the character operation object belongs to a different group than the first character object.
- The rendering manner of the character container object in the character selection area includes, but is not limited to, a bar shape and a ring shape; that is, the character container object may be embodied as a character selection bar object or a character selection disk object.
- The graphical user interface rendered on the display of the terminal includes at least one virtual resource object, and the virtual resource object includes at least one first character object a10.
- The user of the terminal may perform information interaction through the graphical user interface, that is, input a user command; the first character object a10 can perform a first virtual operation based on the first user command detected by the terminal. The first virtual operation includes, but is not limited to, a movement operation, a physical attack operation, a skill attack operation, and so on.
- The first character object a10 is a character object manipulated by the user of the terminal; in the game system, the first character object a10 performs a corresponding action in the graphical user interface based on the user's operation.
- The graphical user interface further includes a mini-map 801 of the virtual area where the user's character object is located; an enlarged schematic of the mini-map 801 is shown at 801a, in which the location of each character object in the virtual area can be seen.
- The locations of the friends belonging to the same first group as the first character object a10 and of the enemies belonging to the second group are identified in the mini-map 801.
- The graphical user interface further includes at least one skill object 803, and the user can control the user's character object to perform a corresponding skill release operation through a skill release operation gesture.
- the graphical user interface has a character selection area 802;
- A character container object is deployed in the character selection area 802.
- In this embodiment, the character container object is embodied as a character selection bar object (that is, the character container object presents a bar-shaped display effect).
- The character container object includes at least one window position, and the character operation object associated with each second character object interacting with the first character object is rendered in a corresponding window position.
- Since a character operation object is represented by an avatar, the character selection area 802 includes at least one avatar, and the at least one avatar respectively corresponds to at least one second character object that interacts with the first character object.
- In this example, there are five second character objects belonging to a different group than the first character object a10, and the character selection area 802 correspondingly includes five character operation objects, such as the character operation object b11, the character operation object b12, the character operation object b13, the character operation object b14, and the character operation object b15 shown in FIG. 3. It can be understood that the five character operation objects in the character selection area 802 are in one-to-one correspondence with the five second character objects belonging to a different group than the first character object.
- The position of the first character object changes in real time during manipulation by the user of the terminal, and correspondingly the position of each second character object in the graphical user interface also changes in real time. Because of this, while the first character object performs a virtual operation on a second character object, it is not easy for the user of the terminal to select the character object on which the virtual operation is to be performed.
- To address this, a second character object whose distance from the first character object in the graphical user interface meets the first preset condition is detected.
- Step 303: Detect a second character object in the graphical user interface whose distance from the first character object meets a first preset condition, and render the character operation object associated with the detected second character object in at least one of the window positions according to the first display parameter.
- Here, detecting the second character object whose distance from the first character object in the graphical user interface meets the first preset condition includes: detecting a second character object in the graphical user interface whose distance from the first character object is less than a first preset threshold.
- Specifically, a circular area centered on the first character object 1 and having the first preset threshold (R) as its radius is detected, and the range of the area included in the circular area is obtained; this range may be represented by a coordinate range. That is, an XY coordinate system is established in the virtual space in which the first character object and the second character objects are located, and the coordinate range of the circular area in the XY coordinate system is acquired. The coordinates of each second character object in the graphical user interface are then detected in real time, and it is determined whether the detected coordinates fall within the coordinate range characterizing the circular area. When the coordinates fall within that coordinate range (for example, the second character object 2, the second character object 3, and the second character object 4 shown in FIG. 4 are all in the circular area), a second character object whose distance from the first character object is less than the first preset threshold is detected in the graphical user interface.
- Here, the first preset threshold satisfies an attack distance or a skill release distance of the first character object, so that in a subsequent operation the second character object can be quickly selected and a virtual operation can be performed on it by the first character object.
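The coordinate-range check described above amounts to a point-in-circle test. A minimal sketch, assuming a 2D XY coordinate system and with all names illustrative:

```python
import math

# Sketch of the detection in step 303: is the second character object inside
# the circular area of radius R (the first preset threshold) centred on the
# first character object? Positions are (x, y) tuples in the XY system.

def within_first_threshold(first_pos, second_pos, first_preset_threshold):
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    # Euclidean distance compared against the radius R
    return math.hypot(dx, dy) <= first_preset_threshold
```

A real implementation would run this test in real time for each second character object, as the step describes, and switch the associated avatar to the first display parameter when it returns true.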
- Step 304: Among the second character objects whose distance from the first character object meets the first preset condition, detect at least some second character objects whose distance from the first character object meets a second preset condition, and render the character operation objects associated with the detected at least some second character objects in at least one of the window positions according to a second display parameter.
- Here, the at least some second character objects include: second character objects whose distance from the first character object reaches a second preset threshold;
- wherein the second preset threshold is greater than or equal to the first preset threshold.
- FIG. 7 is a schematic diagram of the principle for detecting a second character object whose distance from the first character object meets the second preset condition in the information processing method according to this embodiment of the present invention. With reference to FIG. 4 and FIG. 7, the second character objects whose distance from the first character object satisfies the first preset condition (the second character object 2, the second character object 3, and the second character object 4 shown in FIG. 4) are those whose previous coordinate values lie in the circular area whose radius is the first preset threshold (R). Because the position of a second character object in the graphical user interface changes in real time, before step 305, that is, before a selection operation gesture for a character operation object rendered according to the first display parameter is detected, the coordinate value of each second character object whose previous coordinate value lay in the circular area of radius R is detected in real time, and it is determined whether that coordinate value has reached a circular area whose radius is the second preset threshold (r shown in FIG. 7) centered on the first character object. In the illustration shown in FIG. 7, the second preset threshold (r) is greater than the first preset threshold (R). When a second character object that was in the circular area of radius R moves beyond the first preset threshold (R) and reaches the second preset threshold (r), such as the second character object 4 shown in FIG. 7, the selectable operation state of the character operation object associated with that second character object is released, and the character operation object is rendered in the corresponding window position according to the second display parameter.
- Here, the second display parameter may be a normal display parameter; that is, in the graphical user interface, except for the character operation objects displayed according to the first display parameter, the remaining virtual resource objects are rendered according to the second display parameter.
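Steps 303 and 304 together suggest a hysteresis-like rule, under one possible reading of FIG. 7: an avatar is highlighted (first display parameter) when its character enters the circle of radius R, keeps the highlight while the character stays inside the larger radius r, and reverts to normal rendering with its selectable state released once the character reaches r. A minimal sketch of that reading, with all names and the tuple return format being illustrative assumptions:

```python
# Hypothetical sketch only. R is the first preset threshold, r the second
# preset threshold (r >= R); was_highlighted is the avatar's previous state.
# Returns (display_parameter, selectable).

def update_render_state(was_highlighted, distance, R, r):
    if distance <= R:
        # inside the attack/skill range: highlighted and selectable
        return ("first_display_parameter", True)
    if was_highlighted and distance < r:
        # left the inner circle but has not yet reached r: keep the highlight
        return ("first_display_parameter", True)
    # reached the second preset threshold: selectable state released,
    # rendered with the normal (second) display parameter
    return ("second_display_parameter", False)
```

The gap between R and r prevents an avatar from flickering between the two display parameters when the target hovers near the range boundary.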
- Step 305: When a selection operation gesture for at least one character operation object rendered according to the first display parameter is detected, the first character object performs at least one of the first virtual operations on the corresponding second character object.
- Here, a character operation object rendered in its corresponding window position according to the first display parameter is a character operation object associated with a second character object that satisfies the first preset condition; the outer ring edge of such a character operation object (for example, the character operation object b12) has a rendering effect different from that of the other character operation objects, giving it a highlighted display effect. Compared with the other character operation objects, a character operation object rendered with the first display parameter thus has obvious distinguishing features, so that the end user can immediately recognize it and quickly select it in subsequent operations.
- At least one character operation object in the character selection bar object of the graphical user interface is rendered according to the first display parameter. When the end user triggers a selection operation on that character operation object, that is, when the terminal detects a selection operation gesture for the at least one character operation object, the second character object associated with that character operation object is selected; the first character object then performs the first virtual operation on the second character object.
- the first virtual operation may be a physical attack operation or a skill release operation.
- When the first virtual operation is a physical attack operation, after the character operation object associated with the second character object is selected, the first character object directly performs the physical attack operation on the second character object.
- When a skill release operation is to be performed, a skill object is first selected by a skill selection operation gesture; after the character operation object associated with the second character object is selected, the first character object performs the release operation of the selected skill object on the second character object.
- In this way, the character operation objects associated with the second character objects that interact with the first character object are rendered in the corresponding window positions of the character container object deployed in the character selection area of the graphical user interface, and the character operation object associated with any second character object whose distance from the first character object meets the first preset condition is rendered according to the first display parameter. That is, the UI avatar associated with a second character object that satisfies the first preset condition is given a display effect different from that of the other UI avatars, so that, based on this distinguishing display effect, the user can quickly and accurately select the target character object with a selection operation gesture on the character operation object, greatly improving the user's operation experience during the interaction.
- FIG. 8 is a schematic flowchart of an information processing method according to Embodiment 3 of the present invention.
- The information processing method is applied to a terminal: a graphical user interface is obtained by executing a software application on a processor of the terminal and rendering on a display of the terminal; the processor, the graphical user interface, and the software application are implemented in a game system. As shown in FIG. 8, the method includes:
- Step 401: Render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command.
- Step 402: Deploy, in at least one character selection area of the graphical user interface, at least one character container object including at least one window position.
- Here, the graphical user interface includes at least one character selection area, the character selection area includes at least one character container object, and the character container object includes at least one window position, at least some of the window positions carrying corresponding character operation objects. A character operation object is represented in the graphical user interface by an identifier (for example, an avatar) of the character object associated with it; here, the second character object associated with the character operation object belongs to a different group than the first character object.
- The rendering manner of the character container object in the character selection area includes, but is not limited to, a bar shape and a ring shape; that is, the character container object may be embodied as a character selection bar object or a character selection disk object.
- The graphical user interface rendered on the display of the terminal includes at least one virtual resource object, and the virtual resource object includes at least one first character object a10.
- The user of the terminal may perform information interaction through the graphical user interface, that is, input a user command; the first character object a10 can perform a first virtual operation based on the first user command detected by the terminal. The first virtual operation includes, but is not limited to, a movement operation, a physical attack operation, a skill attack operation, and so on.
- The first character object a10 is a character object manipulated by the user of the terminal; in the game system, the first character object a10 performs a corresponding action in the graphical user interface based on the user's operation.
- The graphical user interface further includes a mini-map 801 of the virtual area where the user's character object is located; an enlarged schematic of the mini-map 801 is shown at 801a, in which the location of each character object in the virtual area can be seen.
- The locations of the friends belonging to the same first group as the first character object a10 and of the enemies belonging to the second group are identified in the mini-map 801.
- The graphical user interface further includes at least one skill object 803, and the user can control the user's character object to perform a corresponding skill release operation through a skill release operation gesture.
- The graphical user interface has a character selection area 802, in which a character container object is deployed.
- In this embodiment, the character container object is embodied as a character selection bar object, that is, it presents a bar-shaped display effect.
- The character container object includes at least one window position, and the character operation object associated with each second character object interacting with the first character object is rendered in a corresponding window position. Since a character operation object is represented by an avatar, the character selection area 802 includes at least one avatar, and the at least one avatar respectively corresponds to at least one second character object that interacts with the first character object.
- As shown in FIG. 1, the character selection area 802 includes at least one avatar, and the at least one avatar respectively corresponds to at least one second character object that interacts with the first character object.
- In this example, there are five second character objects belonging to a different group than the first character object a10, and the character selection area 802 correspondingly includes five character operation objects, such as the character operation object b11, the character operation object b12, the character operation object b13, the character operation object b14, and the character operation object b15 shown in FIG. 3. It can be understood that the five character operation objects in the character selection area 802 are in one-to-one correspondence with the second character objects belonging to a different group than the first character object.
- The position of the first character object changes in real time during manipulation by the user of the terminal, and correspondingly the position of each second character object in the graphical user interface also changes in real time. Because of this, while the first character object performs a virtual operation on a second character object, it is not easy for the user of the terminal to select the character object on which the virtual operation is to be performed.
- To address this, a second character object whose distance from the first character object in the graphical user interface meets the first preset condition is detected.
- Step 403: Detect a second character object in the graphical user interface whose distance from the first character object meets a first preset condition, and render the character operation object associated with the detected second character object in at least one of the window positions according to the first display parameter.
- Here, detecting the second character object whose distance from the first character object in the graphical user interface meets the first preset condition includes: detecting a second character object in the graphical user interface whose distance from the first character object is less than a first preset threshold.
- Specifically, a circular area centered on the first character object 1 and having the first preset threshold (R) as its radius is detected, and the range of the area included in the circular area is acquired.
- This range may be represented by a coordinate range; that is, an XY coordinate system is established in the virtual space in which the first character object and the second character objects are located, and the coordinate range of the circular area in the XY coordinate system is acquired.
- The coordinates of each second character object in the graphical user interface are then detected in real time, and it is determined whether the detected coordinates fall within the coordinate range characterizing the circular area. When the coordinates fall within that coordinate range (as shown in FIG. 4, the second character object 2, the second character object 3, and the second character object 4 are all in the circular area), a second character object whose distance from the first character object is less than the first preset threshold is detected in the graphical user interface.
- Here, the first preset threshold satisfies an attack distance or a skill release distance of the first character object, so that in a subsequent operation the second character object can be quickly selected and a virtual operation can be performed on it by the first character object.
- Step 404: When a selection operation gesture for at least one character operation object rendered according to the first display parameter is detected, the first character object performs at least one of the first virtual operations on the corresponding second character object.
- Here, a character operation object rendered in its corresponding window position according to the first display parameter is a character operation object associated with a second character object that satisfies the first preset condition (see the character operation object b12 shown in the figure: its outer ring edge has a rendering effect different from that of the other character operation objects, giving it a highlighted display effect). Compared with the other character operation objects, a character operation object rendered with the first display parameter (such as the character operation object b12) has obvious distinguishing features, so that the end user can immediately recognize it and quickly select it in subsequent operations.
- At least one character operation object in the character selection bar object of the graphical user interface is rendered according to the first display parameter. When the end user triggers a selection operation on that character operation object, that is, when the terminal detects a selection operation gesture for the at least one character operation object, the second character object associated with that character operation object is selected; the first character object then performs the first virtual operation on the second character object.
- the first virtual operation may be a physical attack operation or a skill release operation.
- When the first virtual operation is a physical attack operation, after the character operation object associated with the second character object is selected, the first character object directly performs the physical attack operation on the second character object.
- When a skill release operation is to be performed, a skill object is first selected by a skill selection operation gesture; after the character operation object associated with the second character object is selected, the first character object performs the release operation of the selected skill object on the second character object.
- Step 405: Acquire state attribute information of the second character objects in the graphical user interface and synchronize the state attribute information to a server; and acquire, from the server, state attribute information of the character objects associated with the character operation objects in the character container object.
- Here, the terminal acquires state attribute information of the second character objects in the graphical user interface; based on the settings of the software application, the virtual space in which the first character object and the second character objects are located may be relatively large, so the graphical user interface of the terminal presents only part of it.
- The terminal acquires the state attribute information of the second character objects included in its graphical user interface and synchronizes the state attribute information, together with the corresponding second character objects, to the server.
- The state attribute information includes, but is not limited to, a blood volume value, a hit point value, or skill attribute information of the second character object.
- Further, the terminal acquires, from the server according to a preset rule, state attribute information of the character objects associated with the character operation objects in the character container object, so that when the terminal's own graphical user interface does not include a given second character object, the state attribute information associated with that second character object can be synchronized to the server by other terminals, and the terminal can thereby obtain the state attribute information of the second character objects associated with all character operation objects included in the character container object.
- Here, the terminal and the other terminals belong to the same group; it can be understood that, in the game system, the first character object controlled by the terminal and the first character objects controlled by the other terminals belong to the same group, and the group members collectively perform virtual operations on second character objects that belong to another group.
- Based on this, the graphical user interfaces of the other terminals may include at least some of the second character objects, so the state attribute information of the second character objects included in the graphical user interface of at least one terminal belonging to the same group can be acquired, and synchronization of the state attribute information of the second character objects can be realized.
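The synchronisation described above can be sketched as follows; the `Server` class, its method names, the identifiers, and the hp values are assumptions made for illustration, not part of the disclosure. Each teammate terminal uploads the state attribute information of the enemy character objects it can currently see, and the server merges the reports so every group member can fetch the union:

```python
# Hypothetical sketch of step 405's state-attribute synchronisation.

class Server:
    def __init__(self):
        # second character object id -> state attribute information
        self.state = {}

    def sync(self, reported):
        # a terminal uploads what its own graphical user interface shows
        self.state.update(reported)

    def fetch(self, object_ids):
        # a terminal pulls state for every character operation object in
        # its character container object; unseen objects yield None
        return {oid: self.state.get(oid) for oid in object_ids}

server = Server()
server.sync({"enemy_6": {"hp": 80}})   # report from the terminal itself
server.sync({"enemy_7": {"hp": 45}})   # report from a teammate's terminal
view = server.fetch(["enemy_6", "enemy_7", "enemy_10"])
```

Here `enemy_10` is in no teammate's field of view, so no state attribute information is available for it, matching the grey-avatar case described below.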
- Step 406: Render, in a preset manner according to the acquired state attribute information, the character operation object associated with the second character object in the corresponding window position.
- FIG. 9 is a third schematic diagram of a graphical user interface of an information processing method according to an embodiment of the present invention. As shown in FIG. 9, taking a blood volume value as an example of the state attribute information, the blood volume value of a second character object is rendered in the outer ring area of the associated character operation object (see the character operation object b22 in FIG. 9).
- It should be noted that the second character objects included in the graphical user interfaces of the terminal and the other terminals may not cover all the second character objects belonging to the other group that interact with them.
- For example, the group members belonging to the first group include: group member 1, group member 2, group member 3, group member 4, and group member 5; the group members belonging to the second group include: group member 6, group member 7, group member 8, group member 9, and group member 10.
- When the terminal manipulates group member 1, only group member 6 is included in the field-of-view image of the terminal's graphical user interface; the field-of-view images of the graphical user interfaces of the other terminals belonging to the first group include group member 7, group member 8, and group member 9, while group member 10 does not appear in the field-of-view image of the graphical user interface of any terminal controlled by a member of the first group.
- In this case, the character operation object b21 presents a display effect different from that of the other character operation objects, specifically a gray display effect, indicating that the second character object corresponding to the character operation object b21 is neither in the field-of-view image of the first character object a10 nor in the field-of-view image of any other character object belonging to the same group as the first character object a10; correspondingly, the outer ring area of the character operation object b21 does not display state attribute information of the second character object associated with the character operation object b21.
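One possible sketch of the rendering rule illustrated by the character operation object b21; the dictionary keys, the grayscale flag, and the `hp` field are illustrative assumptions. An avatar whose character appears in no teammate's field of view has no synchronized state attribute information, so it is drawn grey and its outer-ring indicator is hidden:

```python
# Hypothetical sketch of step 406's per-avatar rendering decision.

def avatar_style(state_attribute_info):
    if state_attribute_info is None:
        # not visible to any group member: grey avatar, no health ring
        return {"grayscale": True, "show_health_ring": False}
    # visible to at least one teammate: normal avatar with the blood
    # volume value rendered in the outer ring area
    return {"grayscale": False,
            "show_health_ring": True,
            "health": state_attribute_info["hp"]}
```

Passing the per-object result of the server fetch (see step 405) through this function would produce the contrast between b21 and b22 described in the text.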
- With the technical solution of this embodiment of the present invention, the character operation objects associated with the second character objects that interact with the first character object are rendered in the corresponding window positions of the character container object in the character selection area of the graphical user interface, and the character operation object associated with any second character object whose distance from the first character object meets the first preset condition is rendered according to the first display parameter. That is, the UI avatar associated with a second character object that satisfies the first preset condition is rendered with a display effect different from that of the other UI avatars, so that, based on this distinguishing display effect, the user can quickly and accurately select the target character object with a selection operation gesture on the character operation object, greatly improving the user's operation experience during the interaction.
- Further, by synchronizing the state attribute information of the second character objects in the field-of-view images of the character objects belonging to the same group (that is, teammates), the state attribute information of the second character objects associated with the character operation objects in the character container object is acquired, and that state attribute information is rendered in the corresponding window positions in a specific manner; that is, the state attribute information of the second character objects (that is, the enemies) is reflected on the corresponding character operation objects (UI avatars), so that the user can quickly learn the state attribute information of the second character objects (that is, the enemies), improving the user's operation experience during the information interaction.
- In the following, a one-versus-one application scenario is taken as an example for detailed description.
- The one-versus-one application scenario is an application scenario in which a first character object controlled by terminal 1 and a second character object controlled by terminal 2 perform information interaction; the remaining application scenarios may refer to the description of this application scenario and are not repeated in this embodiment.
- FIG. 10 is a schematic diagram of an interaction application of an information processing method according to an embodiment of the present invention. As shown in FIG. 10, this application scenario includes a terminal 1, a terminal 2, and a server; the terminal 1 is trigger-controlled by user 1, and the terminal 2 is trigger-controlled by user 2. The method includes:
- Step 11: User 1 triggers the game system and logs in with authentication information, which may be a username and password.
- Step 12: The terminal 1 transmits the obtained authentication information to the server 3; the server 3 performs identity verification and, after the identity verification passes, returns a first graphical user interface to the terminal 1;
- wherein the first graphical user interface includes a first character object capable of performing a virtual operation based on a triggering operation of user 1, the virtual operation including a movement operation of the first character object, an attack operation or a skill release operation of the first character object on another character object, and so on.
- Step 21: User 2 triggers the game system and logs in with authentication information, which may be a username and password.
- Step 22: The terminal 2 transmits the obtained authentication information to the server 3; the server 3 performs identity verification and, after the identity verification passes, returns a second graphical user interface to the terminal 2;
- wherein the second graphical user interface includes a second character object capable of performing a virtual operation based on a triggering operation of user 2, the virtual operation including a movement operation of the second character object, an attack operation or a skill release operation of the second character object on another character object, and so on.
- When, based on triggering operations, the user 1 and the user 2 make the first character object and the second character object the objects of information interaction (that is, the first character object takes the second character object as its target interaction object, and the second character object takes the first character object as its target interaction object), the user 1 and the user 2 become opponents in a game.
- A window bit in the character selection area of the first graphical user interface renders the character operation object associated with the second character object; correspondingly, a window bit in the character selection area of the second graphical user interface renders the character operation object associated with the first character object.
- The terminal 1 detects the distance between the second character object and the first character object in real time, and when the distance falls within the first preset threshold range, the character operation object associated with the second character object is rendered in its window bit according to the first display parameter, that is, the character operation object is highlighted;
- likewise, the terminal 2 detects the distance between the first character object and the second character object in real time, and when the distance falls within the first preset threshold range, the character operation object associated with the first character object is rendered in its window bit according to the first display parameter, that is, highlighted.
- Step 13: The user 1 performs a triggering operation on the first graphical user interface presented by the terminal 1; the triggering operation may be directed at any virtual resource object in the first graphical user interface, including a skill release operation on any skill object, an information interaction operation on any character object (which can be understood as a physical attack operation), a movement operation of the first character object, and so on.
- In this embodiment, the triggering operation is a selection gesture operation on a character operation object rendered according to the first display parameter in the character container object of the first graphical user interface.
- Step 14: When the terminal 1 acquires the triggering operation, it recognizes the instruction corresponding to the triggering gesture and executes the instruction, for example, executing a skill release instruction on the corresponding operation object, executing an information interaction instruction (such as a physical attack instruction) on the corresponding character object, executing a movement instruction, and so on; in the process of executing the instruction, the change of the corresponding data is recorded.
- In this embodiment, when the terminal 1 acquires a selection gesture operation on the character operation object rendered according to the first display parameter, a corresponding first instruction is generated and executed to control the first character object to perform a virtual operation (such as a physical attack operation or a skill release operation) on the corresponding second character object.
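The gesture-to-instruction recognition in steps 13 and 14 can be sketched as a simple dispatch table; the gesture and instruction names below are hypothetical illustrations, not terms defined by this disclosure:

```python
# Hypothetical mapping from a recognized trigger gesture to the
# instruction the terminal generates and executes in step 14.
INSTRUCTIONS = {
    "skill_gesture": "skill_release_instruction",
    "attack_gesture": "physical_attack_instruction",
    "move_gesture": "move_instruction",
    # Selecting a highlighted character operation object yields the
    # first instruction that drives a virtual operation on the target.
    "select_highlighted_gesture": "first_instruction",
}

def handle_trigger(gesture):
    """Recognize the instruction corresponding to a trigger gesture;
    unrecognized gestures are ignored (None is returned)."""
    return INSTRUCTIONS.get(gesture)
```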
- Step 15: The changed data is synchronized to the server 3 as the first data corresponding to the terminal 1.
- Step 23: The user 2 performs a triggering operation on the second graphical user interface presented by the terminal 2; the triggering operation may be directed at any virtual resource object in the second graphical user interface, including a skill release operation on any skill object, an information interaction operation on any character object (which can be understood as a physical attack operation), a movement operation of the second character object, and so on.
- In this embodiment, the triggering operation is a selection gesture operation on a character operation object rendered according to the first display parameter in the character container object of the second graphical user interface.
- Step 24: When the terminal 2 acquires the triggering operation, it recognizes the instruction corresponding to the triggering gesture and executes the instruction, for example, executing a skill release instruction on the corresponding operation object, executing an information interaction instruction (such as a physical attack instruction) on the corresponding character object, executing a movement instruction, and so on; in the process of executing the instruction, the change of the corresponding data is recorded.
- In this embodiment, when the terminal 2 acquires a selection gesture operation on the character operation object rendered according to the first display parameter, a second instruction is generated and executed to control the second character object to perform a virtual operation (such as a physical attack operation or a skill release operation) on the corresponding first character object.
- Step 25: The changed data is synchronized to the server 3 as the second data corresponding to the terminal 2.
- Step 30: The server 3 updates data based on the first data synchronized by the terminal 1 and the second data synchronized by the terminal 2, and synchronizes the updated data to the terminal 1 and the terminal 2, respectively.
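The synchronization in steps 15, 25, and 30 can be sketched as follows; this is a minimal in-memory illustration with hypothetical class and method names, not the patented implementation:

```python
class Server:
    """Sketch of the server in steps 15/25/30: it merges the changed
    data reported by each terminal and pushes the merged state back
    to all connected terminals. All names here are illustrative."""

    def __init__(self):
        self.state = {}      # merged game state, keyed by object id
        self.terminals = []  # connected terminals

    def register(self, terminal):
        self.terminals.append(terminal)

    def synchronize(self, changed_data):
        # Step 30: update server-side state from one terminal's report...
        self.state.update(changed_data)
        # ...then push the updated data to every terminal.
        for t in self.terminals:
            t.on_update(dict(self.state))


class Terminal:
    def __init__(self, name, server):
        self.name = name
        self.state = {}
        server.register(self)
        self.server = server

    def execute_instruction(self, changes):
        # Steps 14/24: execute the instruction and record changed data;
        # steps 15/25: synchronize the changes to the server.
        self.server.synchronize(changes)

    def on_update(self, state):
        self.state = state
```

For example, if terminal 1 reports `{"hero1_hp": 80}` and terminal 2 reports `{"hero2_hp": 60}`, both terminals end up holding the same merged state.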
- This application scenario relates to a Multiplayer Online Battle Arena (MOBA) game.
- The technical terms involved in MOBA are: 1) UI layer: the icons in the graphical user interface; 2) skill indicator: special effects, apertures, and operations that assist skill release; 3) virtual lens: can be understood as the in-game camera; 4) mini map: a reduced version of the large map, which can be understood as a radar map on which enemy information and locations are displayed.
- FIG. 11 is a fourth schematic diagram of a graphical user interface in an information processing method according to an embodiment of the present invention; this example is based on an actual interaction process.
- In this embodiment, a first character object 93 and at least one skill object 92 are rendered in the graphical user interface 90; the first character object 93 can perform a corresponding virtual operation based on a triggering operation of the user.
- The graphical user interface 90 further includes a character selection area 91, and the character selection area 91 includes a character container object;
- in this illustration, the character container object includes five window bits, and each window bit renders one character operation object: a character operation object 911, a character operation object 912, a character operation object 913, a character operation object 914, and a character operation object 915;
- each character operation object is associated with a character object;
- each of the five associated character objects belongs to a different group from the first character object 93, that is, the five character objects are enemies with which the first character object 93 interacts.
- The terminal detects a second character object in the graphical user interface 90 whose distance from the first character object 93 meets the first preset threshold, and renders the character operation object associated with that second character object according to the first display parameter.
- In practice, the first preset threshold may be set as the skill release distance of a skill object according to actual needs, and is not limited to this setting manner.
- As shown in FIG. 11, the character operation object 913 has a highlighted display effect compared with the other character operation objects. Based on this differentiated display effect, the user can quickly and accurately select the target character object through a selection gesture operation on the character operation object, and then perform a virtual operation on the target character object, greatly improving the user's operation experience during interaction.
- the embodiment of the invention further provides a terminal.
- FIG. 12 is a schematic structural diagram of a terminal according to Embodiment 4 of the present invention; as shown in FIG. 12, the terminal includes: a rendering processing unit 61, a deployment unit 62, a detecting unit 63, and an operation executing unit 64; wherein,
- the rendering processing unit 61 is configured to execute a software application and perform rendering to obtain a graphical user interface, and to render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command; the rendering processing unit 61 is further configured to render, according to a first display parameter and in at least one window bit, the character operation object associated with the second character object detected by the detecting unit 63;
- the deployment unit 62 is configured to deploy, in at least one character selection area of the graphical user interface, at least one character container object that includes at least one window bit;
- the detecting unit 63 is configured to detect a second character object in the graphical user interface whose distance from the first character object meets a first preset condition;
- the operation executing unit 64 is configured to, when a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, control the first character object to perform at least one first virtual operation on the corresponding second character object.
- Specifically, the graphical user interface includes at least one character selection area, the character selection area includes at least one character container object, and the character container object includes at least one window bit, wherein at least part of the window bits carry corresponding character operation objects.
- A character operation object is represented in the graphical user interface by an identifier of the character object associated with it (the identifier may be an avatar); here, the second character object associated with the character operation object belongs to a different group than the first character object.
- The rendering manner of the character container object in the character selection area includes, but is not limited to, a strip shape and a ring shape; that is, the character container object may be characterized by a character selection bar object or a character selection disk object.
- The graphical user interface rendered by the rendering processing unit 61 includes at least one virtual resource object, and the virtual resource objects include at least one first character object a10. The user of the terminal can perform information interaction through the graphical user interface, that is, input a user command; the first character object a10 can perform a first virtual operation based on the first user command detected by the terminal, the first virtual operation including, but not limited to, a movement operation, a physical attack operation, a skill attack operation, and so on.
- The first character object a10 is a character object manipulated by the user of the terminal; in the game system, the first character object a10 can perform a corresponding action in the graphical user interface based on the operation of the user.
- The graphical user interface further includes a mini map 801 of the virtual area where the user character object is located; an enlarged schematic of the mini map 801 is shown as 801a, in which the location in the virtual area of each character object (including allies belonging to the same first group as the first character object a10 and enemies belonging to the second group) is identified.
- The graphical user interface further includes at least one skill object 803, and the user can control the user character object to perform a corresponding skill release operation through a triggering operation on the skill object.
- The deployment unit 62 deploys a character selection area 802 in the graphical user interface, and a character container object is deployed in the character selection area 802; in this example the character container object is characterized by a character selection bar object (that is, the character container object presents a strip-shaped display effect).
- The character container object includes at least one window bit, and the character operation object associated with each second character object interacting with the first character object is rendered in a corresponding window bit; a character operation object is represented by an avatar.
- Accordingly, the character selection area 802 includes at least one avatar, the at least one avatar respectively corresponding to at least one second character object that interacts with the first character object.
- As shown in FIG. 3, there are five second character objects belonging to a different group from the first character object a10, and correspondingly the character selection area 802 includes five character operation objects: the character operation object b11, the character operation object b12, the character operation object b13, the character operation object b14, and the character operation object b15; it can be understood that the five character operation objects in the character selection area 802 are in one-to-one correspondence with the five second character objects belonging to a different group from the first character object.
- The position of the first character object changes in real time under the manipulation of the terminal user, and correspondingly the positions of the second character objects in the graphical user interface also change in real time; as a result, during the process in which the first character object performs a virtual operation on a second character object, the terminal user cannot easily select the character object on which the virtual operation is to be performed.
- The detecting unit 63 therefore detects a second character object in the graphical user interface whose distance from the first character object meets the first preset condition.
- Specifically, the detecting unit 63 is configured to detect a second character object in the graphical user interface whose distance from the first character object is smaller than a first preset threshold.
- In an implementation, the detecting unit 63 obtains a circular area centered on the first character object with the first preset threshold (R) as its radius, and the range of the area covered by the circular area, where the range may be represented by a coordinate range; that is, an XY coordinate system is established in the virtual space where the first character object and the second character object are located, and the coordinate range of the circular area in the XY coordinate system is acquired.
- Further, the detecting unit 63 detects, in real time, the coordinates of each second character object in the graphical user interface and determines whether the detected coordinates fall within the coordinate range characterizing the circular area; when the coordinates fall within that range (as shown in FIG. 4, the second character object 2, the second character object 3, and the second character object 4 are all in the circular area), a second character object whose distance from the first character object is less than the first preset threshold is detected in the graphical user interface.
- Here, the first preset threshold satisfies the attack distance or the skill release distance of the first character object, so that in a subsequent operation the second character object can be quickly selected and a virtual operation can be performed on it by the first character object.
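The circular-area test described above reduces to a squared-distance comparison; a minimal sketch follows, with illustrative names and 2D (x, y) coordinates as in the XY coordinate system of the text:

```python
def within_threshold(first_pos, second_pos, r):
    """True if the second character object lies inside the circular
    area of radius r centered on the first character object; positions
    are (x, y) tuples in the virtual space's XY coordinate system.
    Comparing squared distances avoids a square root."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return dx * dx + dy * dy <= r * r


def detect_in_range(first_pos, second_objects, r):
    """Filter the second character objects whose distance from the
    first character object is within the first preset threshold r;
    second_objects maps an object id to its current coordinates."""
    return [obj_id for obj_id, pos in second_objects.items()
            if within_threshold(first_pos, pos, r)]
```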
- The detecting unit 63 is further configured to detect, among the second character objects whose distance from the first character object meets the first preset condition, at least part of the second character objects whose distance from the first character object meets a second preset condition;
- the rendering processing unit 61 is configured to render, in at least one window bit and according to a second display parameter, the character operation objects associated with the at least part of the second character objects detected by the detecting unit 63; rendering according to the second display parameter has a display effect different from rendering according to the first display parameter.
- Specifically, the detecting unit 63 is configured to detect, among the second character objects whose distance from the first character object meets the first preset condition, a second character object whose distance from the first character object reaches a second preset threshold, the second preset threshold being greater than or equal to the first preset threshold.
- Because the positions of the character objects in the graphical user interface change in real time, before a selection operation gesture on a character operation object rendered according to the first display parameter is detected, the detecting unit 63 detects in real time the coordinate values of the second character objects whose previous coordinate values were within the circular area with the first preset threshold (R) as its radius, and determines whether those coordinate values reach the circular area with the second preset threshold (r) as its radius; here, the second preset threshold (r) is greater than the first preset threshold (R).
- That is, with the real-time movement of the first character object and the second character objects, at least part of the second character objects whose previous coordinate values were within the R-radius circular area may move so that the distance to the first character object becomes greater than the first preset threshold (R) and reaches the second preset threshold (r), such as the second character object 4 in the illustrated example; in that case the corresponding character operation object is no longer in a selectable operation state, and is rendered in the corresponding window bit according to the second display parameter.
- The second display parameter may be a normal display parameter; that is, in the graphical user interface, except for the character operation object displayed according to the first display parameter, the remaining virtual resource objects are rendered according to the second display parameter.
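Combining the two thresholds, the choice of display parameter for a character operation object behaves like a hysteresis between R and r; the following sketch uses hypothetical names and is only an illustration of that rule:

```python
def display_parameter(distance, was_highlighted, R, r):
    """Hysteresis between the two thresholds: a character operation
    object becomes highlighted (first display parameter, selectable
    for operation) when the distance drops below the first preset
    threshold R, and reverts to the normal second display parameter
    only once the distance reaches the second preset threshold r,
    where r >= R."""
    assert r >= R
    if distance < R:
        return "first"   # highlighted, selectable for operation
    if distance >= r:
        return "second"  # normal display parameter
    # Between R and r: keep whatever state the object was in.
    return "first" if was_highlighted else "second"
```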
- The rendering processing unit 61 renders the character operation object associated with the second character object that satisfies the first preset condition in the corresponding window bit according to the first display parameter.
- As shown in FIG. 5, the character operation object associated with a second character object satisfying the first preset condition is rendered with the first display parameter: the outer ring edge of the character operation object b12 has a rendering effect different from that of the other character operation objects, giving it a highlighted display effect.
- The character operation object rendered with the first display parameter (the character operation object b12) thus has an obvious distinguishing feature compared with the other character operation objects, so that the terminal user can immediately recognize it and quickly select such a distinctly marked character operation object in subsequent operations.
- The functions of the processing units in the terminal of this embodiment of the present invention can be understood with reference to the related description of the information processing method; each processing unit in the information processing terminal of the embodiments of the present invention can be implemented by an analog circuit that realizes the functions described in the embodiments, or by running, on a smart terminal, software that performs the functions described in the embodiments.
- an embodiment of the present invention further provides a terminal.
- FIG. 13 is a schematic structural diagram of a terminal according to Embodiment 5 of the present invention; as shown in FIG. 13, the terminal includes: a rendering processing unit 61, a deployment unit 62, a detecting unit 63, an operation executing unit 64, an obtaining unit 65, and a communication unit 66; wherein,
- the rendering processing unit 61 is configured to execute a software application and perform rendering to obtain a graphical user interface, and to render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command; the rendering processing unit 61 is further configured to render, according to a first display parameter and in at least one window bit, the character operation object associated with the second character object detected by the detecting unit 63;
- the deployment unit 62 is configured to deploy, in at least one character selection area of the graphical user interface, at least one character container object that includes at least one window bit;
- the detecting unit 63 is configured to detect a second character object in the graphical user interface whose distance from the first character object meets a first preset condition;
- the operation executing unit 64 is configured to, when a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, control the first character object to perform at least one first virtual operation on the corresponding second character object;
- the obtaining unit 65 is configured to acquire state attribute information of the second character object in the graphical user interface;
- the communication unit 66 is configured to synchronize the state attribute information acquired by the obtaining unit 65 to the server, and to obtain from the server the state attribute information of the character objects associated with the character operation objects in the character container object;
- the rendering processing unit 61 is configured to render, in a preset manner and according to the obtained state attribute information, the character operation object associated with the second character object in the corresponding window bit.
- Specifically, the obtaining unit 65 acquires the state attribute information of the second character object in the graphical user interface. Based on the setting of the software application, the virtual space in which the first character object and the second character object are located is typically large, so the scene image displayed in the graphical user interface rendered by the terminal may or may not include a given second character object.
- The terminal obtains the state attribute information of the second character objects included in the graphical user interface, and synchronizes the state attribute information, together with the corresponding second character objects, to the server.
- The state attribute information includes, but is not limited to, a blood volume value, a life value, or skill attribute information of the second character object.
- The communication unit 66 obtains, from the server according to a preset rule, the state attribute information of the character objects associated with the character operation objects in the character container object. Even when the graphical user interface of the terminal itself does not include a given second character object, that second character object and its associated state attribute information in the server may have been synchronized by other terminals; the terminal can thereby obtain the state attribute information of the second character objects associated with all the character operation objects included in the character container object.
- Here, the terminal and the other terminals belong to the same group; it can be understood that, in the game system, the first character object manipulated by this terminal and the first character objects manipulated by the other terminals belong to the same group, and the first character objects of the same group collectively perform virtual operations on the second character objects belonging to the other group.
- The graphical user interfaces of the other terminals may include at least part of the second character objects; the state attribute information of a second character object can therefore be acquired from the graphical user interface of at least one terminal belonging to the same group, realizing synchronization of the state attribute information of the second character objects.
- Further, the character operation object associated with the second character object is rendered accordingly in the corresponding window bit of the character container object.
- Taking a blood volume value as an example of the state attribute information, the outer ring area of the character operation object associated with the second character object (shown as the character operation object b22 in FIG. 9) is used as a blood bar display area b221; the proportion of the filled portion within the blood bar display area b221 represents the current blood volume value of the corresponding second character object.
- Of course, the manner in which the state attribute information renders the character operation object associated with the second character object in the corresponding window bit in the embodiment of the present invention is not limited to that shown in FIG. 9.
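The proportional blood bar described above amounts to a clamped fill-ratio computation; the following is a minimal sketch with hypothetical function and parameter names:

```python
def blood_bar_fill(current_hp, max_hp, bar_length=100.0):
    """Length of the filled portion of the blood bar display area
    (e.g. b221) on a character operation object's outer ring,
    proportional to the second character object's current blood
    volume value and clamped to [0, bar_length]."""
    if max_hp <= 0:
        return 0.0
    ratio = max(0.0, min(1.0, current_hp / max_hp))
    return ratio * bar_length
```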
- The functions of the processing units in the terminal of this embodiment of the present invention can be understood with reference to the related description of the information processing method; each processing unit in the information processing terminal of the embodiments of the present invention can be implemented by an analog circuit that realizes the functions described in the embodiments, or by running, on a smart terminal, software that performs the functions described in the embodiments.
- In practical applications, the rendering processing unit 61, the deployment unit 62, the detecting unit 63, the operation executing unit 64, and the obtaining unit 65 in the terminal may be implemented by a processor in the terminal, such as a microprocessor, a CPU, a DSP, or an FPGA; the communication unit 66 in the terminal may be implemented by a transceiver antenna or a communication interface in the terminal.
- the embodiment of the invention further provides a terminal.
- The terminal may be an electronic device such as a PC, or a portable electronic device such as a tablet computer, a laptop computer, or a smart phone; the game system is implemented on the terminal by installing a software application (such as a game application).
- The terminal includes at least a memory for storing data and a processor for data processing.
- Among them, the processor for data processing may be implemented by a microprocessor, a CPU, a DSP, or an FPGA; the memory contains operation instructions, which may be computer executable code, through which the steps in the flow of the information processing method of the embodiments of the present invention are implemented.
- The terminal includes: a processor 71 and a display 72; the processor 71 is configured to execute a software application and perform rendering on the display 72 to obtain a graphical user interface; the graphical user interface and the software application together implement a game system;
- the processor 71 is further configured to: render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command;
- deploy, in at least one character selection area of the graphical user interface, at least one character container object that includes at least one window bit;
- render, in at least one window bit and according to the first display parameter, the character operation object associated with the detected second character object;
- and, when a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, control the first character object to perform at least one first virtual operation on the corresponding second character object.
- The processor 71 detecting a second character object in the graphical user interface whose distance from the first character object meets a first preset condition includes: detecting a second character object in the graphical user interface whose distance from the first character object is smaller than a first preset threshold.
- The processor 71 is further configured to, before a selection operation gesture on at least one character operation object rendered according to the first display parameter is detected, detect, among the second character objects whose distance from the first character object meets the first preset condition, at least part of the second character objects whose distance from the first character object meets a second preset condition, and render, in at least one window bit and according to a second display parameter, the character operation objects associated with the detected at least part of the second character objects; rendering according to the second display parameter has a display effect different from rendering according to the first display parameter.
- The processor 71 detecting, among the second character objects whose distance from the first character object meets the first preset condition, at least part of the second character objects whose distance from the first character object meets the second preset condition includes:
- detecting a second character object whose distance from the first character object reaches a second preset threshold;
- wherein the second preset threshold is greater than or equal to the first preset threshold.
- the terminal further includes a communication device 74;
- the processor 71 is further configured to acquire state attribute information of the second character object in the graphical user interface, synchronize the state attribute information to the server by using the communication device 74, and obtain from the server the state attribute information of the character objects associated with the character operation objects in the character container object;
- the processor 71 is further configured to render, in a preset manner and according to the obtained state attribute information, the character operation object associated with the second character object in the corresponding window bit.
- The terminal in this embodiment includes: a processor 71, a display 72, a memory 73, an input device 76, a bus 75, and a communication device 74; the processor 71, the memory 73, the input device 76, the display 72, and the communication device 74 are all connected via the bus 75, which is used to transfer data between the processor 71, the memory 73, the display 72, and the communication device 74.
- The input device 76 is mainly configured to obtain a user's input operation, and may differ from terminal to terminal; when the terminal is a PC, the input device 76 may be a mouse or a keyboard; when the terminal is a portable device such as a smart phone or a tablet computer, the input device 76 may be a touch screen.
- A computer storage medium is stored in the memory 73; the computer storage medium stores computer executable instructions, and the computer executable instructions are used to perform the information processing method according to the embodiments of the present invention.
- the disclosed apparatus and method may be implemented in other manners.
- The device embodiments described above are merely illustrative; the division of the units is only a logical function division, and in actual implementation there may be other division manners, for example: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
- the units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may be separately used as one unit, or two or more units may be integrated into one unit;
- the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
- the foregoing program may be stored in a computer readable storage medium, and when executed, the program performs the steps of the foregoing method embodiments.
- the foregoing storage medium includes various media that can store program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
- if the above-described integrated unit of the present invention is implemented in the form of a software functional module and sold or used as a standalone product, it may be stored in a computer readable storage medium.
- the technical solutions of the embodiments of the present invention may, in essence, be embodied in the form of a software product stored in a storage medium and including a plurality of instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present invention.
- the foregoing storage medium includes various media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disk.
- the embodiment of the present invention associates, through the window positions in the character container object of the character selection area in the graphical user interface, the character operation objects of the second character objects that perform information interaction with the first character object, and renders each character operation object in its corresponding window position.
- the character operation object (UI avatar) associated with a second character object whose distance to the first character object meets the first preset condition is rendered according to the first display parameter, so that it has a display effect different from that of the other UI avatars; when selecting a target character operation object, the user can therefore select the target character object quickly and accurately through a selection gesture based on this distinct display effect, greatly improving the user's operating experience during interaction.
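The distance test that drives this distinct display effect can be sketched as below. The function name, the Euclidean-distance choice, and the parameter labels are illustrative assumptions, not the patented implementation.

```python
import math

def display_params(first_pos, second_characters, first_threshold):
    """Pick a display parameter for each second character's UI avatar from its
    distance to the first character object (first preset condition: d < threshold)."""
    params = {}
    for char in second_characters:
        d = math.dist(first_pos, char["pos"])  # Euclidean distance in scene coordinates
        # Avatars of in-range targets are rendered with the first display parameter
        # so they stand out from the rest.
        params[char["id"]] = "highlighted" if d < first_threshold else "default"
    return params
```

For example, with the first character at the origin and a threshold of 10, a target at distance 5 is highlighted while one at distance 50 keeps the default rendering.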
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Radar, Positioning & Navigation (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (19)
- An information processing method, wherein a software application is executed on a processor of a terminal and rendering is performed on a display of the terminal to obtain a graphical user interface, the processor, the graphical user interface, and the software application being implemented in a game system; the method comprising: rendering at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command; deploying, in at least one character selection area of the graphical user interface, at least one character container object comprising at least one window position; detecting a second character object in the graphical user interface whose distance to the first character object meets a first preset condition, and rendering, according to a first display parameter, a character operation object associated with the detected second character object in at least one of the window positions; and when a selection gesture on at least one character operation object in the character container object rendered according to the first display parameter is detected, performing, by the first character object, at least one of the first virtual operations on the corresponding second character object.
- The method according to claim 1, wherein the detecting a second character object in the graphical user interface whose distance to the first character object meets a first preset condition comprises: detecting a second character object in the graphical user interface whose distance to the first character object is less than a first preset threshold.
- The method according to claim 1, wherein before the detecting of a selection gesture on at least one character operation object in the character container object rendered according to the first display parameter, the method further comprises: detecting, among the second character objects whose distances to the first character object meet the first preset condition, at least some second character objects whose distances to the first character object meet a second preset condition, and rendering, according to a second display parameter, the character operation objects associated with the detected at least some second character objects in the at least one window position, wherein the display effect of rendering according to the second display parameter is different from that of rendering according to the first display parameter.
- The method according to claim 3, wherein the detecting, among the second character objects whose distances to the first character object meet the first preset condition, of at least some second character objects whose distances to the first character object meet the second preset condition comprises: detecting, among the second character objects whose distances to the first character object meet the first preset condition, a second character object whose distance to the first character object reaches a second preset threshold, wherein the second preset threshold is greater than or equal to the first preset threshold.
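Claims 3 and 4 layer a second display parameter on top of the first. One plausible reading is sketched below; the function names and the concrete distance conditions in the usage example are illustrative assumptions, since the claims leave the exact comparisons against the first and second preset thresholds abstract.

```python
import math

def classify(second_characters, first_condition, second_condition):
    """Each second character object that meets the first preset condition gets a
    display parameter; those that additionally meet the second preset condition
    get the second display parameter instead (claims 3-4)."""
    styles = {}
    for char in second_characters:
        if not first_condition(char):
            continue  # first preset condition not met: no special rendering
        styles[char["id"]] = "second_param" if second_condition(char) else "first_param"
    return styles

def within(origin, limit):
    # Illustrative condition builder only: a distance-to-origin test.
    return lambda c: math.dist(origin, c["pos"]) < limit
```

For example, with an outer condition of distance < 10 and an inner condition of distance < 3, a target at distance 1 gets the second display parameter, one at distance 5 gets the first, and one at distance 20 gets neither.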
- The method according to claim 1, further comprising: acquiring state attribute information of the second character objects in the graphical user interface and synchronizing the state attribute information to a server; and obtaining, from the server, state attribute information of the character objects associated with the character operation objects in the character container object.
- The method according to claim 5, further comprising: rendering, in a preset manner according to the obtained state attribute information, the character operation object associated with the second character object in the corresponding window position.
- A terminal, comprising: a rendering processing unit, a deployment unit, a detection unit, and an operation execution unit; wherein the rendering processing unit is configured to execute a software application and perform rendering to obtain a graphical user interface, and to render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command, the rendering processing unit being further configured to render, according to a first display parameter, a character operation object associated with the second character object detected by the detection unit in at least one window position; the deployment unit is configured to deploy, in at least one character selection area of the graphical user interface, at least one character container object comprising at least one window position; the detection unit is configured to detect a second character object in the graphical user interface whose distance to the first character object meets a first preset condition; and the operation execution unit is configured to, when a selection gesture on at least one character operation object in the character container object rendered according to the first display parameter is detected, cause the first character object to perform at least one of the first virtual operations on the corresponding second character object.
- The terminal according to claim 7, wherein the detection unit is configured to detect a second character object in the graphical user interface whose distance to the first character object is less than a first preset threshold.
- The terminal according to claim 7, wherein the detection unit is further configured to detect, among the second character objects whose distances to the first character object meet the first preset condition, at least some second character objects whose distances to the first character object meet a second preset condition; and correspondingly, the rendering processing unit is configured to render, according to a second display parameter, the character operation objects associated with the at least some second character objects detected by the detection unit in the at least one window position, wherein the display effect of rendering according to the second display parameter is different from that of rendering according to the first display parameter.
- The terminal according to claim 9, wherein the detection unit is configured to detect, among the second character objects whose distances to the first character object meet the first preset condition, a second character object whose distance to the first character object reaches a second preset threshold, wherein the second preset threshold is greater than or equal to the first preset threshold.
- The terminal according to claim 7, further comprising an acquisition unit and a communication unit; wherein the acquisition unit is configured to acquire state attribute information of the second character objects in the graphical user interface; and the communication unit is configured to synchronize the state attribute information acquired by the acquisition unit to a server, and to obtain, from the server, state attribute information of the character objects associated with the character operation objects in the character container object.
- The terminal according to claim 11, wherein the rendering processing unit is configured to render, in a preset manner according to the obtained state attribute information, the character operation object associated with the second character object in the corresponding window position.
- A terminal, comprising a processor and a display, wherein the processor is configured to execute a software application and perform rendering on the display to obtain a graphical user interface, the processor, the graphical user interface, and the software application being implemented in a game system; the processor is further configured to: render at least one virtual resource object on the graphical user interface, at least one of the virtual resource objects being configured as a first character object that performs a first virtual operation according to an input first user command; deploy, in at least one character selection area of the graphical user interface, at least one character container object comprising at least one window position; detect a second character object in the graphical user interface whose distance to the first character object meets a first preset condition, and render, according to a first display parameter, a character operation object associated with the detected second character object in at least one of the window positions; and when a selection gesture on at least one character operation object in the character container object rendered according to the first display parameter is detected, cause the first character object to perform at least one of the first virtual operations on the corresponding second character object.
- The terminal according to claim 13, wherein the detecting, by the processor, of a second character object in the graphical user interface whose distance to the first character object meets the first preset condition comprises: detecting a second character object in the graphical user interface whose distance to the first character object is less than a first preset threshold.
- The terminal according to claim 13, wherein the processor is further configured to: before a selection gesture on at least one character operation object in the character container object rendered according to the first display parameter is detected, detect, among the second character objects whose distances to the first character object meet the first preset condition, at least some second character objects whose distances to the first character object meet a second preset condition, and render, according to a second display parameter, the character operation objects associated with the detected at least some second character objects in the at least one window position, wherein the display effect of rendering according to the second display parameter is different from that of rendering according to the first display parameter.
- The terminal according to claim 15, wherein the detecting, by the processor, among the second character objects whose distances to the first character object meet the first preset condition, that the distances between at least some second character objects and the first character object meet the second preset condition comprises: detecting, among the second character objects whose distances to the first character object meet the first preset condition, a second character object whose distance to the first character object reaches a second preset threshold, wherein the second preset threshold is greater than or equal to the first preset threshold.
- The terminal according to claim 13, further comprising a communication device; wherein the processor is further configured to acquire state attribute information of the second character objects in the graphical user interface, synchronize the state attribute information to a server by using the communication device, and obtain, from the server by using the communication device, state attribute information of the character objects associated with the character operation objects in the character container object.
- The terminal according to claim 17, wherein the processor is further configured to render, in a preset manner according to the obtained state attribute information, the character operation object associated with the second character object in the corresponding window position.
- A computer storage medium, storing computer executable instructions, the computer executable instructions being configured to perform the information processing method according to any one of claims 1 to 6.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16850075.9A EP3273334B1 (en) | 2015-09-29 | 2016-05-04 | Information processing method, terminal and computer storage medium |
JP2018505518A JP6529659B2 (ja) | 2015-09-29 | 2016-05-04 | Information processing method, terminal, and computer storage medium |
CA2982868A CA2982868C (en) | 2015-09-29 | 2016-05-04 | Method for performing virtual operations on a character object, terminal, and computer storage medium |
KR1020177033331A KR102018212B1 (ko) | 2015-09-29 | 2016-05-04 | Information processing method, terminal, and computer storage medium |
MYPI2017703956A MY192140A (en) | 2015-09-29 | 2016-05-04 | Information processing method, terminal, and computer storage medium |
US15/725,140 US10639549B2 (en) | 2015-09-29 | 2017-10-04 | Information processing method, terminal, and computer storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510633275.3 | 2015-09-29 | ||
CN201510633275.3A CN105335064B (zh) | 2015-09-29 | 2015-09-29 | Information processing method and terminal |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/725,140 Continuation-In-Part US10639549B2 (en) | 2015-09-29 | 2017-10-04 | Information processing method, terminal, and computer storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017054450A1 true WO2017054450A1 (zh) | 2017-04-06 |
Family
ID=55285650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/081041 WO2017054450A1 (zh) | 2015-09-29 | 2016-05-04 | 一种信息处理方法、终端和计算机存储介质 |
Country Status (8)
Country | Link |
---|---|
US (1) | US10639549B2 (zh) |
EP (1) | EP3273334B1 (zh) |
JP (1) | JP6529659B2 (zh) |
KR (1) | KR102018212B1 (zh) |
CN (1) | CN105335064B (zh) |
CA (1) | CA2982868C (zh) |
MY (1) | MY192140A (zh) |
WO (1) | WO2017054450A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11811681B1 (en) | 2022-07-12 | 2023-11-07 | T-Mobile Usa, Inc. | Generating and deploying software architectures using telecommunication resources |
Families Citing this family (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US8676904B2 (en) | 2008-10-02 | 2014-03-18 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10417037B2 (en) | 2012-05-15 | 2019-09-17 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
CN113470641B (zh) | 2013-02-07 | 2023-12-15 | 苹果公司 | 数字助理的语音触发器 |
KR101772152B1 (ko) | 2013-06-09 | 2017-08-28 | 애플 인크. | 디지털 어시스턴트의 둘 이상의 인스턴스들에 걸친 대화 지속성을 가능하게 하기 위한 디바이스, 방법 및 그래픽 사용자 인터페이스 |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US10460227B2 (en) | 2015-05-15 | 2019-10-29 | Apple Inc. | Virtual assistant in a communication session |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
CN105335064B (zh) | 2015-09-29 | 2017-08-15 | 腾讯科技(深圳)有限公司 | 一种信息处理方法和终端 |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
CN105194873B (zh) * | 2015-10-10 | 2019-01-04 | 腾讯科技(成都)有限公司 | 一种信息处理方法、终端及计算机存储介质 |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10586535B2 (en) | 2016-06-10 | 2020-03-10 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
DK201670540A1 (en) | 2016-06-11 | 2018-01-08 | Apple Inc | Application integration with a digital assistant |
AU2017100670C4 (en) | 2016-06-12 | 2019-11-21 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
KR101866198B1 (ko) * | 2016-07-06 | 2018-06-11 | (주) 덱스인트게임즈 | 터치스크린 기반의 게임제공방법 및 프로그램 |
CN106774824B (zh) * | 2016-10-26 | 2020-02-04 | 网易(杭州)网络有限公司 | 虚拟现实交互方法及装置 |
CN106422329A (zh) * | 2016-11-01 | 2017-02-22 | 网易(杭州)网络有限公司 | 游戏操控方法和装置 |
CN106512406A (zh) * | 2016-11-01 | 2017-03-22 | 网易(杭州)网络有限公司 | 游戏操控方法和装置 |
JP6143934B1 (ja) * | 2016-11-10 | 2017-06-07 | 株式会社Cygames | 情報処理プログラム、情報処理方法、及び情報処理装置 |
CN106354418B (zh) * | 2016-11-16 | 2019-07-09 | 腾讯科技(深圳)有限公司 | 一种基于触摸屏的操控方法和装置 |
WO2018095366A1 (zh) | 2016-11-24 | 2018-05-31 | 腾讯科技(深圳)有限公司 | 视频推荐确定、信息显示、基于帧同步的数据处理方法 |
CN107132979A (zh) * | 2017-03-14 | 2017-09-05 | 网易(杭州)网络有限公司 | 在移动设备游戏中精确选择目标的交互方法、装置及计算机可读存储介质 |
US20180292952A1 (en) * | 2017-04-05 | 2018-10-11 | Riot Games, Inc. | Methods and systems for object selection |
DK180048B1 (en) | 2017-05-11 | 2020-02-04 | Apple Inc. | MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION |
DK201770429A1 (en) | 2017-05-12 | 2018-12-14 | Apple Inc. | LOW-LATENCY INTELLIGENT AUTOMATED ASSISTANT |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC Acoustic Models |
DK201770411A1 (en) | 2017-05-15 | 2018-12-20 | Apple Inc. | MULTI-MODAL INTERFACES |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
CN107441705B (zh) * | 2017-07-27 | 2018-07-20 | 腾讯科技(深圳)有限公司 | 对象显示方法和装置及存储介质 |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
CN108579089B (zh) * | 2018-05-09 | 2021-11-12 | 网易(杭州)网络有限公司 | 虚拟道具控制方法及装置、存储介质、电子设备 |
DK201870355A1 (en) | 2018-06-01 | 2019-12-16 | Apple Inc. | VIRTUAL ASSISTANT OPERATION IN MULTI-DEVICE ENVIRONMENTS |
DK180639B1 (en) | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT |
CN108970114A (zh) * | 2018-08-21 | 2018-12-11 | 苏州蜗牛数字科技股份有限公司 | 一种通过自定义映射按键实现视野调整的方法 |
US11446579B2 (en) * | 2018-09-11 | 2022-09-20 | Ncsoft Corporation | System, server and method for controlling game character |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
CN109582136B (zh) * | 2018-11-13 | 2022-05-03 | 深圳市创凯智能股份有限公司 | 三维窗口手势导航方法、装置、移动终端及存储介质 |
CN109675307B (zh) * | 2019-01-10 | 2020-02-21 | 网易(杭州)网络有限公司 | 游戏中的显示控制方法、装置、存储介质、处理器及终端 |
CN109568956B (zh) * | 2019-01-10 | 2020-03-10 | 网易(杭州)网络有限公司 | 游戏中的显示控制方法、装置、存储介质、处理器及终端 |
US10786734B2 (en) * | 2019-02-20 | 2020-09-29 | Supercell Oy | Method for facilitating user interactions in gaming environment |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
DK201970509A1 (en) | 2019-05-06 | 2021-01-15 | Apple Inc | Spoken notifications |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
CN110115838B (zh) * | 2019-05-30 | 2021-10-29 | 腾讯科技(深圳)有限公司 | 虚拟环境中生成标记信息的方法、装置、设备及存储介质 |
US11468890B2 (en) | 2019-06-01 | 2022-10-11 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
JP6818092B2 (ja) * | 2019-06-25 | 2021-01-20 | 株式会社コロプラ | ゲームプログラム、ゲーム方法、および情報端末装置 |
CN110368691B (zh) * | 2019-07-19 | 2023-09-19 | 腾讯科技(深圳)有限公司 | 多人在线对战程序中的提醒信息发送方法、装置及终端 |
CN110598035B (zh) * | 2019-09-08 | 2023-06-13 | 北京智明星通科技股份有限公司 | 一种手机游戏虚拟人物形象推荐方法、装置和移动终端 |
CN110807728B (zh) | 2019-10-14 | 2022-12-13 | 北京字节跳动网络技术有限公司 | 对象的显示方法、装置、电子设备及计算机可读存储介质 |
CN110882537B (zh) * | 2019-11-12 | 2023-07-25 | 北京字节跳动网络技术有限公司 | 一种交互方法、装置、介质和电子设备 |
CN111013135A (zh) * | 2019-11-12 | 2020-04-17 | 北京字节跳动网络技术有限公司 | 一种交互方法、装置、介质和电子设备 |
CN111013139B (zh) * | 2019-11-12 | 2023-07-25 | 北京字节跳动网络技术有限公司 | 角色交互方法、系统、介质和电子设备 |
CN111068311B (zh) * | 2019-11-29 | 2023-06-23 | 珠海豹趣科技有限公司 | 游戏场景的显示控制方法及装置 |
US11061543B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | Providing relevant data items based on context |
CN111589114B (zh) | 2020-05-12 | 2023-03-10 | 腾讯科技(深圳)有限公司 | 虚拟对象的选择方法、装置、终端及存储介质 |
CN111760267B (zh) * | 2020-07-06 | 2024-08-27 | 网易(杭州)网络有限公司 | 游戏中的信息发送方法及装置、存储介质、电子设备 |
US11490204B2 (en) | 2020-07-20 | 2022-11-01 | Apple Inc. | Multi-device audio adjustment coordination |
US11438683B2 (en) | 2020-07-21 | 2022-09-06 | Apple Inc. | User identification using headphones |
CN111821691A (zh) * | 2020-07-24 | 2020-10-27 | 腾讯科技(深圳)有限公司 | 界面显示方法、装置、终端及存储介质 |
CN112057856B (zh) * | 2020-09-17 | 2024-01-30 | 网易(杭州)网络有限公司 | 信息提示方法、装置和终端设备 |
CN112245920A (zh) * | 2020-11-13 | 2021-01-22 | 腾讯科技(深圳)有限公司 | 虚拟场景的显示方法、装置、终端及存储介质 |
KR102589889B1 (ko) * | 2021-02-23 | 2023-10-17 | (주)팀스노우볼 | 게임 ui 분석 방법 |
JP7416980B2 (ja) * | 2021-05-25 | 2024-01-17 | ネットイーズ (ハンチョウ) ネットワーク カンパニー リミテッド | ゲームシーンの処理方法、装置、記憶媒体及び電子デバイス |
CN113318444B (zh) * | 2021-06-08 | 2023-01-10 | 天津亚克互动科技有限公司 | 角色的渲染方法和装置、电子设备和存储介质 |
CN116059628A (zh) * | 2021-06-25 | 2023-05-05 | 网易(杭州)网络有限公司 | 游戏的交互方法、装置、电子设备及可读介质 |
CN113398566B (zh) * | 2021-07-16 | 2024-10-01 | 网易(杭州)网络有限公司 | 游戏的显示控制方法、装置、存储介质及计算机设备 |
CN113559505B (zh) * | 2021-07-28 | 2024-02-02 | 网易(杭州)网络有限公司 | 游戏中的信息处理方法、装置及移动终端 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1743043A (zh) * | 2005-06-19 | 2006-03-08 | 珠海市西山居软件有限公司 | A network game system and implementation method thereof |
CN103096134A (zh) * | 2013-02-08 | 2013-05-08 | 广州博冠信息科技有限公司 | Data processing method and device based on live video streaming and games |
CN103157281A (zh) * | 2013-04-03 | 2013-06-19 | 广州博冠信息科技有限公司 | Method and device for displaying a two-dimensional game scene |
US20140113718A1 (en) * | 2012-04-26 | 2014-04-24 | Riot Games, Inc. | Systems and methods that enable a spectator's experience for online active games |
CN105335064A (zh) * | 2015-09-29 | 2016-02-17 | 腾讯科技(深圳)有限公司 | Information processing method, terminal, and computer storage medium |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6273818B1 (en) * | 1999-10-25 | 2001-08-14 | Square Co., Ltd. | Video game apparatus and method and storage medium |
JP3888542B2 (ja) * | 2002-12-05 | 2007-03-07 | 任天堂株式会社 | Game device and game program |
JP4057945B2 (ja) * | 2003-04-25 | 2008-03-05 | 株式会社バンダイナムコゲームス | Program, information storage medium, and game device |
JP3880008B2 (ja) * | 2004-12-21 | 2007-02-14 | 株式会社光栄 | Character group movement control program, storage medium, and game device |
JP4291816B2 (ja) * | 2005-12-27 | 2009-07-08 | 株式会社コナミデジタルエンタテインメント | Game device, game device control method, and program |
US8210943B1 (en) * | 2006-05-06 | 2012-07-03 | Sony Computer Entertainment America Llc | Target interface |
US20100302138A1 (en) | 2009-05-29 | 2010-12-02 | Microsoft Corporation | Methods and systems for defining or modifying a visual representation |
JP2011212347A (ja) * | 2010-03-31 | 2011-10-27 | Namco Bandai Games Inc | Program, information storage medium, terminal, and network system |
JP5452429B2 (ja) * | 2010-09-14 | 2014-03-26 | 株式会社スクウェア・エニックス | Game device, game program, and game progression method |
US20120122561A1 (en) * | 2010-11-12 | 2012-05-17 | Bally Gaming, Inc. | System and method for tournament gaming using social network based team formation |
US20130005417A1 (en) * | 2011-06-30 | 2013-01-03 | Peter Schmidt | Mobile device action gaming |
US8814674B2 (en) * | 2012-05-24 | 2014-08-26 | Supercell Oy | Graphical user interface for a gaming system |
EP3517190B1 (en) * | 2013-02-01 | 2022-04-20 | Sony Group Corporation | Information processing device, terminal device, information processing method, and programme |
JP5581434B1 (ja) * | 2013-10-31 | 2014-08-27 | 株式会社 ディー・エヌ・エー | Game program and information processing device |
CN104618797B (zh) * | 2015-02-06 | 2018-02-13 | 腾讯科技(北京)有限公司 | Information processing method, device, and client |
CN205064362U (zh) | 2015-05-12 | 2016-03-02 | 锘威科技(深圳)有限公司 | A novel fan blade |
JP6632819B2 (ja) * | 2015-06-30 | 2020-01-22 | 株式会社バンダイナムコエンターテインメント | Program, game device, and server system |
- 2015
- 2015-09-29 CN CN201510633275.3A patent/CN105335064B/zh active Active
- 2016
- 2016-05-04 EP EP16850075.9A patent/EP3273334B1/en active Active
- 2016-05-04 KR KR1020177033331A patent/KR102018212B1/ko active IP Right Grant
- 2016-05-04 JP JP2018505518A patent/JP6529659B2/ja active Active
- 2016-05-04 CA CA2982868A patent/CA2982868C/en active Active
- 2016-05-04 MY MYPI2017703956A patent/MY192140A/en unknown
- 2016-05-04 WO PCT/CN2016/081041 patent/WO2017054450A1/zh active Application Filing
- 2017
- 2017-10-04 US US15/725,140 patent/US10639549B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1743043A (zh) * | 2005-06-19 | 2006-03-08 | 珠海市西山居软件有限公司 | A network game system and implementation method thereof |
US20140113718A1 (en) * | 2012-04-26 | 2014-04-24 | Riot Games, Inc. | Systems and methods that enable a spectator's experience for online active games |
CN103096134A (zh) * | 2013-02-08 | 2013-05-08 | 广州博冠信息科技有限公司 | Data processing method and device based on live video streaming and games |
CN103157281A (zh) * | 2013-04-03 | 2013-06-19 | 广州博冠信息科技有限公司 | Method and device for displaying a two-dimensional game scene |
CN105335064A (zh) * | 2015-09-29 | 2016-02-17 | 腾讯科技(深圳)有限公司 | Information processing method, terminal, and computer storage medium |
Non-Patent Citations (1)
Title |
---|
See also references of EP3273334A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20180028918A1 (en) | 2018-02-01 |
KR102018212B1 (ko) | 2019-09-04 |
KR20170137913A (ko) | 2017-12-13 |
CN105335064B (zh) | 2017-08-15 |
CN105335064A (zh) | 2016-02-17 |
JP6529659B2 (ja) | 2019-06-12 |
EP3273334B1 (en) | 2023-05-10 |
US10639549B2 (en) | 2020-05-05 |
CA2982868C (en) | 2023-07-18 |
EP3273334A4 (en) | 2018-05-30 |
CA2982868A1 (en) | 2017-04-06 |
JP2018512988A (ja) | 2018-05-24 |
MY192140A (en) | 2022-07-29 |
EP3273334A1 (en) | 2018-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017054450A1 (zh) | Information processing method, terminal, and computer storage medium | |
US10661171B2 (en) | Information processing method, terminal, and computer storage medium | |
US11003261B2 (en) | Information processing method, terminal, and computer storage medium | |
US10814221B2 (en) | Method for locking target in game scenario and terminal | |
JP7502012B2 (ja) | Information processing method, terminal, and computer storage medium | |
KR102034367B1 (ko) | Information processing method, terminal, and computer storage medium | |
WO2017054464A1 (zh) | Information processing method, terminal, and computer storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16850075 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2982868 Country of ref document: CA |
|
REEP | Request for entry into the european phase |
Ref document number: 2016850075 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2018505518 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20177033331 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |