CN113918067A - Interface logic execution method and device, electronic equipment and medium - Google Patents


Info

Publication number
CN113918067A
CN113918067A
Authority
CN
China
Prior art keywords
interface
operable
display window
display
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111165928.1A
Other languages
Chinese (zh)
Inventor
徐昶
陈军英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202111165928.1A priority Critical patent/CN113918067A/en
Publication of CN113918067A publication Critical patent/CN113918067A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed

Abstract

Embodiments of the invention provide an interface logic execution method and apparatus, an electronic device, and a medium. The method includes: receiving an interface operation issued to a display interface; acquiring a plurality of operable objects in the display interface that correspond to the interface operation, the operable objects being displayed in a first display window and a second display window loaded in the display interface, where the two windows correspond to different application programs; selecting, according to a prediction strategy, a target operation object displayed in the first display window or the second display window from the plurality of operable objects; and triggering the action logic corresponding to the target operation object so as to execute the interface operation. By screening out the target operation object and executing only its corresponding action logic, the method avoids interface logic execution errors caused by misjudging the operation object when the same interface operation corresponds to a plurality of interface logics, achieves the interface effect the user intends, and improves the user experience.

Description

Interface logic execution method and device, electronic equipment and medium
This application is a divisional of Chinese patent application No. 202011312410.1, entitled "Interface logic execution method and device, electronic equipment and medium", filed on November 20, 2020.
Technical Field
The invention belongs to the technical field of internet, and particularly relates to an interface logic execution method and device, electronic equipment and a medium.
Background
Live video streaming is currently a popular broadcast format: users enter a live room on a streaming platform through various terminal devices to watch live video.
For example, a user may open a game program (e.g., a game client) to play a game while watching a live stream of that game through a live program (e.g., a live client). That is, the user loads the display windows of the game program and the live program in the terminal interface simultaneously; an interface operation issued to the terminal interface may then trigger the interface logic of both programs at once, causing interface logic execution errors and degrading the user experience.
Disclosure of Invention
The invention provides an interface logic execution method and apparatus, an electronic device, and a medium, which aim to solve the technical problem of interface logic execution errors caused by the same interface operation triggering a plurality of interface logics simultaneously.
In a first aspect, the present invention provides a method for executing interface logic, the method comprising: receiving interface operation sent to a display interface; acquiring a plurality of operable objects corresponding to the interface operation in the display interface, wherein the operable objects are displayed in a first display window and a second display window loaded in the display interface, and the first display window and the second display window respectively correspond to different application programs; selecting a target operation object displayed in the first display window or the second display window from the plurality of operable objects according to a prediction strategy; and triggering the action logic corresponding to the target operation object so as to execute the interface operation.
In a second aspect, the present invention provides an interface logic execution apparatus, which includes a transceiver module and a processing module. The transceiver module is configured to receive an interface operation issued to a display interface. The processing module is configured to acquire a plurality of operable objects in the display interface that correspond to the interface operation, where the operable objects are displayed in a first display window and a second display window loaded in the display interface, and the two windows correspond to different application programs; select, according to a prediction strategy, a target operation object displayed in the first display window or the second display window from the plurality of operable objects; and trigger the action logic corresponding to the target operation object so as to execute the interface operation.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor and a memory, where the memory stores executable code, and when the executable code is executed by the processor, the processor is enabled to implement at least the interface logic execution method in the first aspect.
An embodiment of the present invention further provides a system, including a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, at least one program, code set, or instruction set is loaded and executed by the processor to implement an interface logic execution method described above.
Embodiments of the present invention further provide a computer-readable medium having stored thereon at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by a processor to implement one of the interface logic execution methods described above.
In the embodiment of the invention, for the case where a first display window and a second display window are loaded in the display interface, an interface operation issued to the display interface is received, and the operable objects in the display interface corresponding to the interface operation are acquired. Further, if there are a plurality of operable objects corresponding to the interface operation, it is necessary to determine which operable object is the one the user intends to operate. In this case, a target operation object displayed in the first display window or the second display window may be selected from the plurality of operable objects according to a prediction strategy, so that the action logic corresponding to the target operation object is triggered and the interface operation is executed. By screening out the target operation object the user intends to operate from the plurality of operable objects and executing only its corresponding action logic, the embodiment of the invention avoids interface logic execution errors caused by misjudging the operation object when the same interface operation corresponds to a plurality of interface logics, achieves the interface effect the user intends, and improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic flowchart illustrating an interface logic execution method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a display interface according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an interface logic execution apparatus according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device corresponding to the interface logic execution apparatus provided in the embodiment shown in fig. 3.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
The interface logic execution scheme provided by the embodiment of the invention can be executed by an electronic device, which may be a terminal device such as a smartphone, a tablet computer, a PC, or a notebook computer. In an alternative embodiment, a service program for executing the interface logic execution scheme may be installed on the electronic device; the service program is, for example, a game program. The interface logic execution scheme may also be implemented by the game program in cooperation with a server of the game program. Of course, the scheme can likewise be executed by the server and the terminal device working together.
In various terminal devices, a user can view a game interface provided by a game program and perform interactive operations in the game interface. Of course, the game interface may also be provided by a web page or medium. With the development of the game field, game programs can be loaded on various terminal devices. Taking a mobile phone as an example, more and more interactive contents are provided in a game interface of the mobile phone, and interface logic control modes corresponding to various interactive contents often conflict. In fact, the interface logic of a game program can also conflict with the interface logic in other applications.
The following takes a mobile phone as an example to illustrate the technical problems actually existing in the prior art:
the mobile phone mostly adopts a touch screen without keys, and various interface logics are controlled by touch operations such as frequently clicking, dragging and the like in a live interface of the mobile phone end in order to avoid too many keys being arranged in the live interface of the mobile phone end.
Taking a live game scene as an example, a user may open a game program to play a game while watching a live game through a live program. That is to say, the user loads the respective display windows of the game program and the live program in the terminal interface at the same time, and in this case, because the interface operation mode in the mobile phone is limited, the interface operation sent by the user to the terminal interface may trigger the respective interface logics of the game program and the live program at the same time, resulting in a problem of an error in execution of the interface logics.
Taking a click operation as an example, suppose a user clicks an overlapping region of the two display windows in order to pick up a prop in the game program. If a streamer-follow button of the live program also lies in that overlapping region, the click may instead be taken as operating the streamer-follow button, causing an interface logic execution error and a poor user experience.
For another example, suppose a player operation object in the game program is in a combat state. A horizontal sliding operation of a finger on the phone screen may trigger the player operation object to release a skill, while the same horizontal sliding operation is also used to control the playback progress of live content in the live program, such as replaying content that has already been broadcast. If the horizontal sliding operation falls within the display windows of both the game program and the live program, it must be determined which client's interface logic should respond to the operation.
In summary, for the situation that the same interface operation in the terminal interface corresponds to multiple interface logics, an interface logic execution error is easily caused, and the user experience is poor.
The core idea of the interface logic execution scheme provided by the embodiment of the invention is as follows:
and receiving interface operation sent to the display interface aiming at the condition that the first display window and the second display window are loaded in the display interface, and acquiring an operable object corresponding to the interface operation in the display interface. Furthermore, if there are a plurality of operable objects corresponding to the interface operation, it is necessary to determine which operable object is the target operable object that the user intends to operate, and in this case, the target operable object displayed in the first display window or the second display window may be selected from the plurality of operable objects according to the prediction policy, so as to trigger the operation logic corresponding to the target operable object, and the interface operation may be executed. According to the embodiment of the invention, the target operation object which is intended to be operated by the user is screened out from the plurality of operable objects, and the action logic corresponding to the target operation object is executed, so that the problem of interface logic execution error caused by misjudging the operation object under the condition that the same interface operation corresponds to a plurality of interface logics is avoided, the interface effect which is intended by the user is realized, and the user experience is improved.
Having described the core concepts of the interface logic execution scheme, various non-limiting embodiments of the present invention are described in detail below.
The following describes the implementation of the interface logic execution method with reference to the following embodiments.
Fig. 1 is a schematic flowchart of an interface logic execution method according to an embodiment of the present invention. The method is applied to a display interface loaded with a first display window and a second display window (the terms "first" and "second" merely distinguish the windows and do not limit the number or type of display windows). As shown in fig. 1, the interface logic execution method includes the following steps:
101. receiving interface operation sent to a display interface;
102. acquiring a plurality of operable objects corresponding to the interface operation in the display interface, wherein the operable objects are displayed in a first display window and a second display window loaded in the display interface, and the first display window and the second display window respectively correspond to different application programs;
103. selecting a target operation object displayed in the first display window or the second display window from the plurality of operable objects according to the prediction strategy;
104. and triggering the action logic corresponding to the target operation object so as to execute the interface operation.
In the interface logic execution method, the target operation object which is intended to be operated by the user is screened out from the plurality of operable objects aiming at the plurality of operable objects corresponding to the interface operation, and the action logic corresponding to the target operation object is executed, so that the problem of interface logic execution errors caused by misjudgment of the operation object under the condition that the same interface operation corresponds to the plurality of interface logics can be avoided, the interface effect which is intended by the user is realized, and the user experience is improved.
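As a non-authoritative illustration only, the four steps above can be sketched as a small dispatch routine. All names here, such as `OperableObject`, `matches`, and `predict`, are hypothetical and do not appear in the patent; the sketch merely mirrors the receive/collect/predict/trigger flow of steps 101 to 104.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class OperableObject:
    name: str                          # e.g. "attack_prop", "volume_control"
    window: str                        # "first" or "second" display window
    matches: Callable[[dict], bool]    # interface-logic trigger condition
    action: Callable[[dict], None]     # action logic to execute


def execute_interface_logic(operation: dict,
                            objects: List[OperableObject],
                            predict: Callable[[List[OperableObject]],
                                              OperableObject]) -> Optional[str]:
    # Steps 101/102: collect every operable object whose trigger mode
    # matches the received interface operation.
    candidates = [o for o in objects if o.matches(operation)]
    if not candidates:
        return None
    # Step 103: a single candidate is taken directly; otherwise the
    # prediction strategy selects the target operation object.
    target = candidates[0] if len(candidates) == 1 else predict(candidates)
    # Step 104: trigger the action logic corresponding to the target object.
    target.action(operation)
    return target.name
```

A prediction strategy here is just any callable that picks one candidate, for example one that prefers objects in the game program's window.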
The embodiment of the invention is applied to various display interfaces. The display interface mentioned in the above step may be a display interface loaded in a terminal device, such as a mobile phone, a smart television, and a computer. The display content in the display interface can be constructed by data transmitted from the service device to the terminal device, or constructed by data pre-loaded into the terminal device through an installation package.
In order to meet the actual requirements of users, different display windows are loaded in the display interface. The different display windows respectively correspond to different application programs. For example, in a live game scene, a user may play a game while live (or watch live), and in this case, a display window of each of the game program and the live program is loaded in the display interface.
In fact, the embodiment of the present invention is also applicable to an application scenario in which a display interface is loaded with multiple application programs, for example, the multiple application programs loaded in the display interface may be a combination of a game program, a video playing client and a browser, a combination of a game program and a drawing client, or a combination of a game program and a game editor.
In addition, it should be understood that the position relationship between different display windows may overlap each other or be arranged in parallel. Of course, in some implementations, the display interface dynamically adjusts the positional relationship between the different display windows in response to control instructions from the user.
For convenience of description, the following describes in detail a specific implementation procedure of the interface logic implementation method shown in fig. 1, taking a live game scene in various terminals as an example:
in a live game scene, a first display window and a second display window are loaded in a display interface.
First, in 101, an interface operation issued to a display interface is received.
In the above step, the interface operations issued to different types of display interfaces differ according to the actual situation. Taking a display interface loaded on a mobile phone as an example, after the user touches the phone screen, touch operations issued to the phone, such as clicks, double clicks, slides, and various control gestures bound to specific functions, are received through a sensor. A motion-sensing control instruction issued to the phone can also be obtained through a gravity sensor, and the corresponding interface operation derived from that instruction. As another example, in the display interface of a computer, interface operations issued to the computer, such as clicks, double clicks, drags, and operation instructions entered through a keyboard, may be received through input devices (a mouse, a keyboard, and the like).
In addition to the above acquisition modes, user image data may also be collected by a camera or other image acquisition device, and the interface operation issued by the user derived by analyzing that image data.
Further, after receiving the interface operation, a plurality of operable objects corresponding to the interface operation in the display interface are acquired in 102.
The display interface includes at least one operable object. An operable object is an interface element in the display interface that can interact with the user. An operable object can be presented through the display interface carried by the terminal device, or it can be in a hidden state in the display interface; for example, a prop or NPC in a game program may be configured not to be shown in the display interface until the interface logic corresponding to that prop or NPC is triggered.
Different display windows are loaded in the display interface, and the different display windows correspond to different application programs. Operable objects are set in the respective application programs and displayed through the display window corresponding to each program. Taking a game program as an example, an operable object set in the game program is displayed to the user through the display window corresponding to the game program; operable objects provided in the game program are, for example, a player operation object (i.e., a player-controlled game character), an NPC, or a prop. Taking the live program as an example, suppose the live program is a live website; the display window of the live program is then a live page of that website, i.e., a web page containing live content, and the operable objects shown in the live page are interface controls in the page, such as a live room window, a title bar, a live room list bar, interface buttons, and windows for jumping to other pages (e.g., pop-up windows).
There may be one or more operable objects corresponding to an interface operation. Optionally, the multiple operable objects corresponding to the same interface operation lie in different operable areas. Specifically, the plurality of operable objects correspond to different display windows that overlap each other in the display interface, and the different display windows correspond to different application programs. In practice, the different operable areas are, for example, a bottom layer, a column surface, and pop-up windows at different levels. Optionally, these operable areas at different levels are superimposed on each other; different operable areas may also be at the same level.
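One way to model operable objects sitting in different, possibly overlapping operable areas is to tag each object with its application, layer level, and screen region, and then hit-test a touch point against all areas. This is a minimal sketch under assumed names (`Rect`, `OperableArea`, `objects_at` are all hypothetical), not the patent's own data model.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


@dataclass
class OperableArea:
    object_name: str   # e.g. "attack_prop", "gift_control"
    app: str           # application the display window belongs to
    level: int         # bottom layer, column surface, pop-up window, ...
    region: Rect       # screen region occupied by the operable object


def objects_at(point: Tuple[float, float],
               areas: List[OperableArea]) -> List[str]:
    """All operable objects whose area contains the touch point,
    across overlapping windows and levels."""
    px, py = point
    return [a.object_name for a in areas if a.region.contains(px, py)]
```

When the touch point falls in the overlap of two windows, this hit test naturally returns multiple candidates, which is exactly the ambiguous case step 103 must resolve.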
It can be understood that if the interface logic trigger mode of a certain operable object includes the interface operation received in 101, that operable object is considered an operable object corresponding to the interface operation in the above step. Based on this, an alternative implementation of 102 is as follows:
and determining the track of the interface operation in the display interface, so that the operable object corresponding to the interface operation in the display interface is determined based on the track of the interface operation and the interface logic trigger mode of each operable object.
For example: assume that the display interface is the display interface shown in fig. 2. The display interface is loaded with display windows corresponding to the game program and the live program, and the display window of the live program is suspended at the lower right corner of the display window of the game program. In fig. 2, the display window of the game program includes a player operation object a, an object to be attacked b, and an attack prop c; the display window of the live program includes a title bar, a live room window located in the center of the display window, and other live room jump spaces, i.e., rooms 1 to 3, located on the right side of the display window. Assume that the user issues a drag operation in the display interface by a finger. It is assumed that the operable objects in the current display interface include a player operating object a, an object to be attacked b, an attack prop c, a live broadcast room window, a volume control and a brightness control. The volume control and the brightness control can be operable objects which are selected and then displayed on the display interface.
Based on the above assumptions, after the mobile phone receives, through its sensor, the drag operation issued to the phone screen, the track of the drag operation in the display interface is determined, such as the rightward track m shown in fig. 2. Based on track m (in fig. 2, track m covers the display windows of both the game program and the live program) and the interface logic trigger mode of each operable object, the operable objects corresponding to the drag operation are determined to be attack prop c in the game program (the trigger mode of attack prop c is a drag operation applied to the display window of the game program) and the volume control in the live program (the trigger mode of the volume control is a drag operation applied to the display window of the live program).
For another example: assume that the received interface operation is a single click operation. It is assumed that the operable objects provided to the game program include a player operating object, an attack item, and an NPC that the player needs to attack. It is assumed that the operable objects set in the live program include a gift control and a volume control. Based on the above assumptions, in 102, a trajectory of the interface operation in the display interface (for example, a contact point x in the display interface) is determined, and thus, based on the contact point x and an interface logic trigger manner of each operable object, it is determined that the operable object corresponding to the single click operation in the display interface is a player operation object, an attack prop, an NPC that the player needs to attack, and a gift control in a live program. In fact, the click operation also needs to take into account the location of the click operation. For example, it is assumed that the position of the single-click operation is in the superimposed portion of the attack prop and the gift control, and in this case, the operable object corresponding to the single-click operation should be the attack prop and the gift control.
In an actual situation, there are various operable objects corresponding to interface operations in the display interface according to different display interfaces and different interface operations, which is not limited in the present invention.
Further, if there is only one operable object corresponding to the interface operation acquired in 102, the operable object the user intends to operate can obviously be determined directly, and the preset interface logic corresponding to that object can be executed directly to interact with the user, thereby executing the interface operation.
Further, if there are a plurality of operable objects corresponding to the interface operation acquired in 102, the object the user intends to operate cannot be determined directly. Moreover, the interface logic of a game program has higher real-time requirements than the interface logic of a live program (or other application programs). Therefore, to guarantee the real-time performance of game operations, when a plurality of operable objects correspond to the interface operation, in 103 a target operation object displayed in the first display window or the second display window is selected from the plurality of operable objects according to a prediction strategy.
Depending on the interface operation, there are many cases in which a plurality of operable objects correspond to the same interface operation. For example, continue to assume that the operable objects set in the game program include a player operation object, an attack prop, and an NPC the player needs to attack, and that the operable objects set in the live program include a gift control and a volume control. Then, for a display interface loaded with the game program and the live program, the operable objects corresponding to a horizontal sliding operation are, for example, the attack prop and the volume control, while the operable objects corresponding to a single-click operation are, for example, the attack prop and the gift control.
Assuming that the first display window corresponds to a game program, in 103, an optional embodiment of selecting a target operation object displayed in the first display window or the second display window from a plurality of operable objects according to a prediction strategy is as follows:
determining the current state of a player operation object in the game program; and selecting, from the plurality of operable objects, a target operation object that matches the current state of the player operation object.
In the above steps, the target operation object matching the current state is screened out from the plurality of operable objects according to the state of the player operation object, so that the interface operation triggers in time the operable object conforming to the current state of the player operation object, further ensuring the real-time performance of the interface operation.
The player operation object is the entity that carries the game's functional gameplay and is responsible for interacting with the game player. In a game scene, the player operation object may interact with various interaction objects. An interaction object may be an NPC in the game scene; an NPC is a unit not controlled by the player that interacts with the player according to preset execution logic, and NPC functions may differ across gameplay modes.
In the game program, the current state of the player operation object includes a combat state and a non-combat state. The non-combat state includes, for example, an item collection state and a game copy (instance) state. In essence, regardless of the designation, the combat state has a high real-time requirement for interface operations, while the non-combat state has a low real-time requirement for interface logic.
In 103, an alternative embodiment of determining the current state of the player operation object in the game program is: acquiring historical behavior data of the player operation object in the game program, and determining the current state of the player operation object based on the historical behavior data. Optionally, to ensure that the interface operation triggers the right object in real time, the historical behavior data of the player operation object may be limited to historical behavior data within a preset period.
For example, the historical behavior data of the player operation object in the game program within the last 3 minutes includes entering a game copy. Based on this historical behavior data, the current state of the player operation object is determined to be the game copy state.
In another alternative embodiment, the current state of the player operation object is determined by obtaining configuration information of the player operation object. The configuration information may be obtained from the server side, for example, or may be obtained locally from the terminal device.
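The state-determination step above can be sketched as follows. This is an illustrative sketch only; the event names, the state labels, and the 2-minute window are assumptions for the example, not limitations of the method.

```python
import time

# Hypothetical event names that indicate the player is in, or entering, combat.
COMBAT_EVENTS = {"saw_attack_object", "attacked_by_npc"}

def current_state(history, now, window_seconds=120):
    """Infer the player operation object's current state from recent behavior.

    history: list of (timestamp, event_name) pairs (the historical behavior data);
    window_seconds: the preset period within which events are considered.
    """
    recent = [event for t, event in history if now - t <= window_seconds]
    if any(event in COMBAT_EVENTS for event in recent):
        return "combat"
    if "entered_game_copy" in recent:
        return "game_copy"          # a non-combat state
    return "non_combat"

now = time.time()
history = [(now - 30, "entered_game_copy")]   # entered a copy 30 s ago
print(current_state(history, now))            # prints "game_copy"
```

Events older than the preset period are ignored, matching the idea that only recent behavior should drive the prediction.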
Further, several alternative embodiments of selecting a target object matching the current state of the player operation object from the plurality of objects under different states of the player operation object will be described below with reference to specific examples:
in an embodiment of the present invention, in 103, if the current state of the player operation object is the fighting state, an optional implementation manner of selecting a target operation object matching the current state of the player operation object from the plurality of operable objects is provided:
and selecting an operable object of the game program from the plurality of operable objects as the target operation object.

Optionally, one implementation of selecting an operable object of the game program from the plurality of operable objects as the target operation object is as follows:
determining operation types corresponding to at least two operable objects of the game program; acquiring a target operation type corresponding to a combat state; and selecting an operable object corresponding to the target operation type from the at least two operable objects as a target operation object.
Each operable object of the game program corresponds to at least one operation type. Optionally, the operation types include, for example, attack skills, defense skills, magic skills, and healing skills, whose corresponding operable objects are, respectively, attack props, defense props, magic props, and healing (blood-replenishing) props.
In practice, the operation types vary with the game settings. To enrich the player's operation experience, operation types may also be defined by the player.
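The combat-state selection described above — keep only the candidates whose operation type matches a type required by the current combat state — can be sketched as follows. The mapping and names are hypothetical, mirroring the example props in this section.

```python
# Hypothetical mapping from operation type to its operable object.
TYPE_TO_OBJECT = {
    "attack_skill": "attack_prop",
    "defense_skill": "defense_prop",
    "magic_skill": "magic_prop",
    "healing_skill": "healing_prop",
}

def select_for_combat(candidates, target_op_types):
    """From the candidate operable objects, keep those whose operation type
    matches one of the target operation types of the combat state."""
    allowed = {TYPE_TO_OBJECT[t] for t in target_op_types}
    return [c for c in candidates if c in allowed]

# A state of actively initiating a battle calls for attack and defense skills:
print(select_for_combat(["attack_prop", "defense_prop", "magic_prop"],
                        ["attack_skill", "defense_skill"]))
```

In the "actively initiating a battle" example below, this filter keeps the attack prop and the defense prop and discards unrelated candidates.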
For example, assume that the plurality of operable objects corresponding to the interface operation are an attack prop and a defense prop of the game program, and that the preset period is 2 minutes. Assume the historical behavior data of the player operation object within the preset period is: an attack object was seen; specifically, the attack object appeared in the player's view (that is, the distance between the position of the attack object and the position of the player operation object is smaller than a preset value), and the attack object did not attack the player operation object first.
Based on the above assumptions, in 103, the historical behavior data of the player operation object in the game program within the last 2 minutes is obtained (i.e., the attack object was seen), and from it the current state of the player operation object is determined to be a state of actively initiating a battle. Accordingly, the operation types corresponding to the attack prop and the defense prop of the game program, namely the attack skill class and the defense skill class, are determined.
For another example, assume the historical behavior data of the player operation object within the preset period is: attacked by an NPC. Based on this historical behavior data, the current state of the player operation object is determined to be a state of actively mounting a defense, and accordingly the operation types corresponding to the attack prop and the defense prop of the game program, namely the attack skill class and the defense skill class, are determined.
In the above embodiment, no matter what implementation is adopted, the actual intention is to further screen out the target operation object that the user intends to operate from the plurality of operable objects by further judging the state of the player operation object, so that the action logic corresponding to the target operation object can be timely triggered subsequently, and thus, the real-time performance of the interface operation is ensured.
In another embodiment of the present invention, assuming that the second display window corresponds to a live program, based on this, if the current state of the player operation object is a non-combat state, then 103 further provides an optional implementation manner of selecting a target operation object matching the current state of the player operation object from the plurality of operable objects:
acquiring a plurality of historical interface operations sent to a display interface; determining the relevance of the interface operation and a plurality of historical interface operations; if the interface operation is related to the historical interface operation sent to the game program, selecting an operable object related to the game program from the plurality of operable objects as a target operation object; and if the interface operation is related to the historical interface operation sent to the live program, selecting an operable object related to the live program from the plurality of operable objects as a target operation object.
In the above steps, according to the relevance between the historical interface operation and the currently received interface operation, a target operation object relevant to the game program or the live program is screened out from the multiple operable objects, so that the action logic corresponding to the target operation object can be triggered in time subsequently, and the real-time performance of the interface operation is ensured.
Specifically, it is judged whether the received interface operation is associated with the plurality of historical interface operations. If the interface operation is judged to be associated with continuous interface operations sent to the game program or the live program, i.e., it is associated with continuous interface operations that occurred before, then the operable object among the plurality of operable objects that is associated with those continuous interface operations is preferentially taken as the target operation object.
The continuous interface operation can be understood as a plurality of interface operations which are performed sequentially and have consistent action types. Such as multiple single-click selection actions for a certain type of item.
Assume that the plurality of historical interface operations issued on the display interface includes continuous interface operations sent to the game program. Based on this assumption, an alternative embodiment of determining, in 103, the association between the interface operation and the plurality of historical interface operations is: judging whether the received interface operation is associated with the plurality of historical interface operations; if the interface operation is judged to be associated with the continuous interface operations sent to the game program, the target operation object is the game program's operable object associated with those continuous interface operations.
In practical applications, the relevance between the currently received interface operation and the continuous interface operation may be determined according to the type of the interface operation. For example, if the currently received interface operation and the continuous interface operation belong to the same type of operation, it may be determined that the received interface operation is associated with the continuous interface operation.
Of course, for some interface operation types, the judgment needs to be made in combination with other interface operation information. For a sliding operation, for example, it may further be determined whether the sliding tracks of the currently received interface operation and the continuous interface operations are consistent, and if so, the received interface operation is determined to be associated with the continuous interface operations. Alternatively, it may be determined whether the sliding direction of the currently received interface operation is consistent with that of the continuous interface operations. For a click operation, the operation position may likewise need to be taken into account.
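The association judgment just described — same operation type, and for slides additionally a consistent direction — can be sketched as follows. This is an illustrative sketch; the dictionary keys and type names are assumptions, not part of the claims.

```python
def is_associated(current, history):
    """Decide whether the current interface operation continues a run of
    continuous historical operations.

    current: dict with an operation "type" and, for slides, a "direction";
    history: list of such dicts (the continuous interface operations).
    """
    if not history or any(h["type"] != current["type"] for h in history):
        return False                       # type mismatch: not associated
    if current["type"] == "slide":
        # For slides, the direction must also be consistent with the run.
        return all(h.get("direction") == current.get("direction") for h in history)
    return True                            # same type is enough for e.g. clicks

history = [{"type": "slide", "direction": "right"}] * 3
print(is_associated({"type": "slide", "direction": "right"}, history))  # True
print(is_associated({"type": "slide", "direction": "left"}, history))   # False
```

A real implementation might compare full sliding tracks or click positions instead of a single direction field, as the paragraph above notes.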
Further, if it is determined that the received interface operation is associated with a continuous interface operation, a target operation object associated with the continuous interface operation is selected from the plurality of operable objects.
In practice, the continuous interface operations may be directed at the same operable object, or at the same type of operable object.
For continuous interface operation aiming at the same operable object, selecting a target operation object associated with the continuous interface operation from a plurality of operable objects, specifically:
and if the execution object of the continuous interface operation is the same operable object in the game program, selecting the operable object consistent with the execution object from the plurality of operable objects as the target operation object.
For example, suppose a player who needs to restore health issues a continuous interface operation, i.e., multiple clicks on a medical kit, and suppose a live-room interactive popup pushed to the user by the live program is overlaid on the medical kit. The operable objects corresponding to the single-click operation then include the medical kit and the popup. Since the execution object of the multiple clicks is the same medical kit in the game program, the medical kit can be selected from the medical kit and the popup as the target operation object.
For continuous interface operation aiming at the same type of operable objects, selecting a target operation object associated with the continuous interface operation from a plurality of operable objects, specifically:
and if the execution objects of the continuous interface operation are the same type of operable objects in the game program, selecting the operable objects which belong to the same type as the execution objects from the plurality of operable objects as target operation objects.
For example, suppose a player needs to collect mushrooms by issuing a continuous interface operation, i.e., multiple clicks on the mushrooms, to collect multiple mushrooms growing above the ground. In this case, the execution target of the continuous interface operation is a plurality of mushrooms in the game program. Suppose that live room interaction information pushed by the live program to the user is overlaid on a portion of the mushrooms. Assuming that the click operation falls on an area where the interactive information and the mushroom overlap, in this case, the operable object corresponding to the click operation includes the occluded mushroom and the interactive information.
Based on this, since the execution object of the multiple clicks is the same type of operable object in the game program, namely, multiple mushrooms, in this case, the occluded mushroom can be selected from the occluded mushroom and the interactive information as the target operation object.
As another example, assume the player is clicking a gift button multiple times in the live program to send gifts. Assume that a trigger notification for a game copy (including an open option), pushed to the user by the game program, is overlaid on the gift button. The operable objects corresponding to the single-click operation then include the trigger notification and the gift button. Since the execution object of the multiple clicks is the same gift button in the live program, the gift button can be selected from the trigger notification and the gift button as the target operation object.
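The two selection rules above — prefer the candidate that is the same object as the execution objects of the continuous operations, otherwise the candidate of the same type — can be sketched together as follows. All names are hypothetical, echoing the medical-kit and mushroom examples.

```python
def select_by_continuity(candidates, executed):
    """Pick the candidate matching the execution objects of the continuous
    interface operations.

    candidates / executed: lists of (name, obj_type) tuples.
    """
    executed_names = {name for name, _ in executed}
    executed_types = {obj_type for _, obj_type in executed}
    # Rule 1: the same operable object was operated repeatedly (one medical kit).
    same = [c for c in candidates if c[0] in executed_names]
    if same:
        return same[0]
    # Rule 2: the same type of operable object was operated (several mushrooms).
    typed = [c for c in candidates if c[1] in executed_types]
    return typed[0] if typed else None

# The occluded mushroom wins over the live-room interaction popup:
print(select_by_continuity([("mushroom_7", "mushroom"), ("popup", "live_ui")],
                           [("mushroom_1", "mushroom"), ("mushroom_2", "mushroom")]))
```

In the medical-kit example the same-object rule fires; in the mushroom example only the same-type rule does, since each mushroom is a distinct object of one type.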
In fact, no matter what implementation is adopted, the actual intention is to screen out the target operation object which is intended to be operated by the user from the multiple operable objects through the relevance between the historical interface operation and the currently received interface operation, so that the action logic corresponding to the target operation object can be triggered in time subsequently, and the real-time performance of the interface operation is guaranteed.
In yet another embodiment of the present invention, it is assumed that the second display window corresponds to a live program, and that the historical interface operations issued to the game program correspond to game tasks of the game program. Based on this, if the current state of the player operation object is a non-combat state, then 103 provides an alternative embodiment of selecting a target operation object matching the current state of the player operation object from the plurality of operable objects:
judging whether the interface operation is associated with a scenario to be triggered or an object to be acquired in the game task; and selecting, based on the judgment result, an operable object associated with the scenario to be triggered or the object to be acquired from the plurality of operable objects as the target operation object.
In the above steps, according to the relevance between the currently received interface operation and the game task, a target operation object relevant to the game program or the live program is screened out from the multiple operable objects, so that the action logic corresponding to the target operation object can be triggered in time subsequently, and the real-time performance of the interface operation is ensured.
Specifically, it is judged whether the currently received interface operation corresponds to the scenario to be triggered or the object to be acquired in the game task. If it corresponds to the scenario to be triggered, an operable object associated with that scenario is selected from the plurality of operable objects as the target operation object; if it corresponds to the object to be acquired, an operable object consistent with that object is selected from the plurality of operable objects as the target operation object.
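The task-based selection above can be sketched as follows. The task structure and its keys are hypothetical; the method only requires that the game task expose its scenarios to be triggered and objects to be acquired.

```python
def select_by_task(candidates, task):
    """Pick the candidate tied to the game task: first a scenario to be
    triggered, then an object to be acquired; None if neither matches.

    task: dict with optional "scenario_triggers" / "objects_to_acquire" sets;
    candidates: list of operable object names.
    """
    for c in candidates:
        if c in task.get("scenario_triggers", set()):
            return c
    for c in candidates:
        if c in task.get("objects_to_acquire", set()):
            return c
    return None

# Mushroom collection task: the occluded mushroom beats the interaction info.
task = {"objects_to_acquire": {"mushroom"}}
print(select_by_task(["mushroom", "interaction_info"], task))  # prints "mushroom"
```

The ordering (scenarios before objects) is a design choice of this sketch, not stated in the text; either check alone already resolves the mushroom example.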
For example, assume that the player accepts a mushroom collection task. Suppose that live room interaction information pushed by the live program to the user is overlaid on a portion of the mushrooms. Assuming that the click operation falls on an overlapping area of the interactive information and the mushroom, in this case, the operable object corresponding to the click operation includes the occluded mushroom and the interactive information.
Based on the above assumptions, it is judged whether the received click operation is associated with the object to be acquired (i.e., a mushroom) in the mushroom collection task. Since the received click operation falls in the overlapping area of the interaction information and the mushroom, and the mushroom is the object to be acquired in the game task the player is executing, the click operation is determined to be associated with the object to be acquired in the mushroom collection task, and the occluded mushroom is therefore selected from the occluded mushroom and the interaction information as the target operation object.
In another embodiment of the present invention, it is assumed that the second display window corresponds to a live program. Based on this, if the current state of the player operation object is a non-combat state, then 103 provides an alternative embodiment of selecting a target operation object matching the current state of the player operation object from the plurality of operable objects:
determining the positions of a plurality of operable objects in the display interface according to the layout information of the display interface; and selecting the operable objects in the visual emphasis area of the display interface as target operation objects according to the positions of the operable objects in the display interface.
Because the non-combat state has a low real-time requirement for game program operations, and the user may well intend to operate an operable object in another application program, in the above steps the target operation object associated with the game program or the live program can be screened out from a visual perspective according to the layout information of the display interface, so that the action logic corresponding to the target operation object can subsequently be triggered in time, ensuring the real-time performance of the interface operation.
The operable objects in the visual emphasis area of the display interface include, but are not limited to: operable objects with a distance from the center of the display interface less than a threshold value, operable objects at the uppermost layer of the display interface, and operable objects at the right part of the operable area. In practical application, the display interface comprises an operable area. Assuming that the probability of selecting an operable object in the right part of the operable area by the user is higher than the probability of selecting an operable object in the left part of the operable area, the operable object in the right part of the operable area can be preferentially selected as the target object according to the positions of the plurality of operable objects in the display interface, so as to be closer to the use habit of the user.
Optionally, the visual emphasis area may be preset according to the user behavior data, or may be actively set by the user. For example, the visual emphasis area may be set according to user preference data.
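One of the visual-emphasis criteria above — distance from the display-interface center below a threshold — can be sketched as follows. The coordinates, sizes, and names are assumptions for the example; the other criteria (uppermost layer, right part of the operable area) would be analogous position checks.

```python
import math

def in_visual_emphasis(obj, interface_w, interface_h, radius):
    """True if the object's position lies within `radius` of the center of
    the display interface (the distance-to-center criterion)."""
    cx, cy = interface_w / 2, interface_h / 2
    x, y = obj["pos"]
    return math.hypot(x - cx, y - cy) < radius

def select_by_emphasis(candidates, w, h, radius):
    """Select the first candidate located in the visual emphasis area."""
    emphasized = [c for c in candidates if in_visual_emphasis(c, w, h, radius)]
    return emphasized[0]["name"] if emphasized else None

candidates = [{"name": "gift_control", "pos": (60, 40)},      # screen corner
              {"name": "attack_prop", "pos": (960, 540)}]     # screen center
print(select_by_emphasis(candidates, 1920, 1080, 200))        # "attack_prop"
```

On a 1920x1080 interface with a 200-pixel radius, only the centered object qualifies, so it is chosen as the target operation object.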
Finally, after a target operation object is selected from the plurality of operable objects through any of the above embodiments, in 104 the action logic corresponding to the target operation object is triggered, so that the interface operation is executed. The action logic corresponding to the target operation object refers to the function logic or interface effect within the application program where the target operation object is located. Taking the game program as an example, the action logic corresponding to the attack prop is, for example, an interface effect that inflicts damage on the attack object, while the action logic corresponding to the healing prop is, for example, a healing effect applied to the object being healed together with attribute-adjustment logic for that object, e.g., increasing the relevant attribute by the value associated with the healing prop.
In an optional embodiment with respect to the display interface, assume that the display interface is loaded with a first display window and a second display window. It is assumed that the target operation object belongs to the first display window, and the interface operation is an interface operation on the target operation object in the first display window. Assume that the first display window corresponds to a first application.
Based on the above assumption, after the target operation object is screened out in 103, it may also be determined whether the target operation object is associated with a second application program corresponding to the second display window, and if the target operation object is associated with the second application program, the association operation of the interface operation is executed in the second application program.
In the above step, the associated operations for executing the interface operation in the second application program include, but are not limited to: message reminding operation, regional broadcast operation, world broadcast operation and operation for adjusting attribute of operable object. Specifically, the operation of adjusting the attribute of the operable object may be adjusting the number of items corresponding to the associated operation, or the like.
For example, assume the first application program is a live program and the second application program is a game program, the second display window being the display window of the game program. Based on this assumption, when a gift prop of the game program is sent out in the live program, a regional broadcast operation can be executed in the game program to generate a corresponding regional broadcast message notifying that a certain player has sent out the gift prop. In this case, in response to the associated operation of sending out the gift prop, the number of gift props in the player's backpack is reduced by 1 in the game program.
Optionally, a prompt message may be presented in the display window of the game program to reflect the operation result of the associated operation. For example, after the number of gift props in the player's backpack is reduced by 1, a corresponding prompt message is displayed in the display window of the game program to inform the player of the reduced number of gift props in the backpack. Alternatively, a prompt message thanking the player for sending out the gift prop may be displayed in the display window of the game program.
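The cross-application associated operation in the gift example can be sketched as follows. Function and key names are hypothetical; the point is that one operation in the live program produces both an attribute adjustment and messages in the game program.

```python
def execute_associated_operation(prop_name, game_state):
    """When a gift prop is sent out in the live program, execute the
    associated operations in the game program: adjust the backpack count,
    and produce a regional broadcast message plus a prompt message."""
    game_state["backpack"][prop_name] -= 1            # attribute adjustment
    broadcast = f"[regional broadcast] a {prop_name} was sent as a gift"
    prompt = f"{prop_name} remaining in backpack: {game_state['backpack'][prop_name]}"
    return broadcast, prompt

game_state = {"backpack": {"gift_prop": 3}}
broadcast, prompt = execute_associated_operation("gift_prop", game_state)
print(prompt)  # prints "gift_prop remaining in backpack: 2"
```

A world broadcast or a message reminder would be produced the same way, with only the message scope differing.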
Thus, by the interface logic execution method shown in fig. 1, the problem of interface logic execution errors occurring when the same interface operation corresponds to multiple interface logics can be avoided, the interface effect intended by the user is achieved, and the user experience is improved.
The interface logic performing apparatus of one or more embodiments of the present invention will be described in detail below. Those skilled in the art will appreciate that the interface logic performing means may be constructed using commercially available hardware components configured by the steps taught in the present solution.
Fig. 3 is a schematic structural diagram of an interface logic execution apparatus according to an embodiment of the present invention. The device is applied to various display interfaces, and as shown in fig. 3, the interface logic execution device comprises: a transceiver module 11 and a processing module 12.
The receiving and sending module 11 is used for receiving interface operations sent to the display interface;
the processing module 12 is configured to obtain an operable object corresponding to the interface operation in the display interface; if the number of the operable objects corresponding to the interface operation is multiple, selecting a target operation object displayed in the first display window or the second display window from the multiple operable objects according to a prediction strategy; and triggering the action logic corresponding to the target operation object so as to execute the interface operation.
Optionally, the first display window corresponds to a first application program, and the second display window corresponds to a second application program.
If the target operation object displayed in the first display window is selected from the plurality of operable objects, the processing module 12 is further configured to: judge whether the target operation object is associated with the second application program; and, if the target operation object is associated with the second application program, execute the associated operation of the interface operation in the second application program.
Optionally, the first application program is a live program, the second application program is a game program, and the second display window is a display window of the game program.
If the target operation object is associated with the game program, the processing module 12 is further configured to: and displaying a prompt message in a display window of the game program, wherein the prompt message is used for reflecting the operation result of the associated operation.
Optionally, the first display window corresponds to a game program. When selecting a target operation object displayed in the first display window or the second display window from the plurality of operable objects according to the prediction strategy, the processing module 12 is specifically configured to:
determining the current state of a player operation object in the game program; and selecting, from the plurality of operable objects, a target operation object that matches the current state of the player operation object.
Optionally, when determining the current state of the player operation object in the game program, the processing module 12 is specifically configured to:
acquiring historical behavior data of a player operation object in the game program; and determining the current state of the operation object of the player based on the historical behavior data.
Alternatively, if the current state of the player operation object is a fighting state, the processing module 12 is specifically configured to, when a target operation object matching the current state of the player operation object is selected from the plurality of operable objects:
and selecting an operable object of the game program from the plurality of operable objects as the target operation object.

Optionally, when selecting an operable object of the game program from the plurality of operable objects as the target operation object, the processing module 12 is specifically configured to:
determining an operation type corresponding to each of at least two operable objects of the game program; acquiring a target operation type corresponding to the fighting state; and selecting an operable object corresponding to the target operation type from at least two operable objects as the target operation object.
Optionally, the second display window corresponds to a live program. If the current state of the player operation object is a non-combat state, when selecting a target operation object matching the current state of the player operation object from the plurality of operable objects, the processing module 12 is specifically configured to:
acquiring a plurality of historical interface operations sent to the display interface; determining an association of the interface operation with the plurality of historical interface operations; if the interface operation is associated with historical interface operation sent to the game program, selecting an operable object associated with the game program from a plurality of operable objects as the target operation object; or if the interface operation is related to historical interface operation sent to the live program, selecting an operable object related to the live program from a plurality of operable objects as the target operation object.
Optionally, if the interface operation is associated with a continuous interface operation issued to the game program or the live program, the target operation object is an operable object associated with the continuous interface operation among a plurality of operable objects.
Optionally, the interface operation is associated with a game task of the game program.
When the processing module 12 selects an operable object associated with the game program from the plurality of operable objects as the target operable object, it is specifically configured to:
judging whether the interface operation is associated with a scenario to be triggered or an object to be acquired in the game task; and selecting, based on the judgment result, an operable object associated with the scenario to be triggered or the object to be acquired from the plurality of operable objects as the target operation object.
Alternatively, if the current state of the player operation object is a non-combat state, the processing module 12 is specifically configured to, when a target operation object matching the current state of the player operation object is selected from the plurality of operable objects:
determining the positions of a plurality of operable objects in the display interface according to the layout information of the display interface; and selecting the operable objects in the visual emphasis area of the display interface as the target operating objects according to the positions of the operable objects in the display interface.
Optionally, the operable object in the visual emphasis area of the display interface includes one of:
operable objects with a distance from the center of the display interface less than a threshold value, operable objects at the uppermost layer of the display interface, and operable objects at the right part of the operable area.
The interface logic executing apparatus shown in fig. 3 may execute the methods provided in the foregoing embodiments, and portions not described in detail in this embodiment may refer to the related descriptions of the foregoing embodiments, which are not described herein again.
In one possible design, the interface logic performing apparatus shown in fig. 3 may be implemented as an electronic device.
As shown in fig. 4, the electronic device may include: a processor 21 and a memory 22. Wherein the memory 22 has stored thereon executable code, which when executed by the processor 21, at least makes the processor 21 capable of implementing the interface logic execution method as provided in the previous embodiments. The electronic device may further include a communication interface 23 for communicating with other devices or a communication network.
In addition, an embodiment of the present invention provides a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor, causes the processor to execute the interface logic execution method provided in the foregoing embodiments.
The system, method and apparatus of the embodiments of the present invention can be implemented as pure software (e.g., a software program written in Java), as pure hardware (e.g., a dedicated ASIC chip or FPGA chip), or as a system combining software and hardware (e.g., a firmware system storing fixed code or a system with a general-purpose memory and a processor), as desired.
Another aspect of the invention is a computer-readable medium having computer-readable instructions stored thereon that, when executed, perform a method of embodiments of the invention.
While various embodiments of the present invention have been described above, the above description is intended to be illustrative, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The scope of the claimed subject matter is limited only by the attached claims.

Claims (11)

1. An interface logic execution method, comprising:
receiving interface operation sent to a display interface;
acquiring a plurality of operable objects corresponding to the interface operation in the display interface, wherein the operable objects are displayed in a first display window and a second display window loaded in the display interface, and the first display window and the second display window respectively correspond to different application programs;
selecting a target operation object displayed in the first display window or the second display window from the plurality of operable objects according to a prediction strategy;
and triggering the action logic corresponding to the target operation object so as to execute the interface operation.
2. The interface logic execution method of claim 1, wherein the first display window corresponds to a game program;
the selecting, according to a prediction strategy, a target operation object displayed in the first display window or the second display window from the plurality of operable objects comprises:
determining the current state of a player operation object in the game program;
and selecting a target operation object matched with the current state of the player operation object from the plurality of operable objects.
3. The interface logic execution method according to claim 2, wherein the determining a current state of a player operation object in the game program comprises:
acquiring historical behavior data of a player operation object in the game program, and determining the current state of the player operation object based on the historical behavior data; or
acquiring configuration information of the player operation object in the game program, and determining the current state of the player operation object based on the configuration information.
4. The interface logic execution method according to claim 2, wherein if the current state of the player operation object is a fighting state, the selecting a target operation object matching the current state of the player operation object from the plurality of operable objects comprises:
selecting, from the plurality of operable objects, an operable object whose operation type corresponds to the target operation type of the fighting state as the target operation object.
5. The interface logic execution method according to claim 2, wherein if the current state of the player operation object is a non-combat state, the selecting a target operation object matching the current state of the player operation object from the plurality of operable objects comprises:
determining the positions of a plurality of operable objects in the display interface according to the layout information of the display interface;
and selecting, according to the positions of the plurality of operable objects in the display interface, an operable object in the visual emphasis area of the display interface as the target operation object.
6. The interface logic execution method of claim 1, wherein the selecting, according to a prediction strategy, a target operation object displayed in the first display window or the second display window from the plurality of operable objects comprises:
acquiring a plurality of historical interface operations sent to the display interface;
determining an association of the interface operation with the plurality of historical interface operations;
if the interface operation is related to a historical interface operation sent to a first application program corresponding to the first display window, selecting an operable object related to the first application program from the plurality of operable objects as the target operation object; or
if the interface operation is related to a historical interface operation sent to a second application program corresponding to the second display window, selecting an operable object related to the second application program from the plurality of operable objects as the target operation object.
7. The interface logic execution method of claim 6, wherein the first application program is a game program, and the interface operation is associated with a game task of the game program;
the selecting an operable object related to the first application program from the plurality of operable objects as the target operation object comprises:
judging whether the interface operation is related to a scenario to be triggered or an object to be acquired in the game task;
and selecting, based on the judgment result, an operable object associated with the scenario to be triggered or the object to be acquired from the plurality of operable objects as the target operation object.
8. The method of claim 1, wherein if the target operation object displayed in the first display window is selected from the plurality of operable objects according to the prediction strategy, the method further comprises:
judging whether the target operation object is associated with an application program corresponding to the second display window;
and if the target operation object is associated with the application program corresponding to the second display window, executing the associated operation of the interface operation in the application program corresponding to the second display window.
9. An interface logic execution apparatus, comprising:
the receiving and sending module is used for receiving interface operation sent to the display interface;
the processing module is used for acquiring a plurality of operable objects corresponding to the interface operation in the display interface, wherein the operable objects are displayed in a first display window and a second display window loaded in the display interface, and the first display window and the second display window respectively correspond to different application programs; selecting a target operation object displayed in the first display window or the second display window from the plurality of operable objects according to a prediction strategy; and triggering the action logic corresponding to the target operation object so as to execute the interface operation.
10. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program operable on the processor, and the processor implements the interface logic execution method according to any one of claims 1 to 8 when executing the computer program.
11. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the interface logic execution method of any one of claims 1 to 8.
CN202111165928.1A 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium Pending CN113918067A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111165928.1A CN113918067A (en) 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011312410.1A CN112486381B (en) 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium
CN202111165928.1A CN113918067A (en) 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202011312410.1A Division CN112486381B (en) 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN113918067A true CN113918067A (en) 2022-01-11

Family

ID=74932332

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011312410.1A Active CN112486381B (en) 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium
CN202111165928.1A Pending CN113918067A (en) 2020-11-20 2020-11-20 Interface logic execution method and device, electronic equipment and medium


Country Status (1)

Country Link
CN (2) CN112486381B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019761A (en) * 2011-07-22 2013-04-03 三星电子株式会社 Method of arranging user interface objects in a portable terminal and an apparatus thereof
US20130300688A1 (en) * 2012-05-09 2013-11-14 Sony Corporation Information processing apparatus, information processing method, and program
CN103412713A (en) * 2013-06-28 2013-11-27 北京君正集成电路股份有限公司 Management method of intelligent device for having control over a plurality of windows simultaneously
US8926430B1 (en) * 2013-10-10 2015-01-06 DeNA Co., Ltd. Game system, game program, and method for providing game switchable between manual mode and automatic mode
JP2015046752A (en) * 2013-08-28 2015-03-12 京セラドキュメントソリューションズ株式会社 Information output device and information output method
CN104978739A (en) * 2015-04-29 2015-10-14 腾讯科技(深圳)有限公司 Image object selection method and apparatus
CN107433036A (en) * 2017-06-21 2017-12-05 网易(杭州)网络有限公司 The choosing method and device of object in a kind of game
CN108379837A (en) * 2018-02-01 2018-08-10 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN111176529A (en) * 2019-12-31 2020-05-19 维沃移动通信有限公司 Key display method and electronic equipment
CN111760272A (en) * 2020-06-30 2020-10-13 网易(杭州)网络有限公司 Game information display method and device, computer storage medium and electronic equipment

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5316387B2 (en) * 2009-12-04 2013-10-16 ソニー株式会社 Information processing apparatus, display method, and program
US9619120B1 (en) * 2014-06-30 2017-04-11 Google Inc. Picture-in-picture for operating systems
CN107153537B (en) * 2017-04-01 2020-10-16 北京安云世纪科技有限公司 Information display method and device based on multi-task interface and mobile terminal
CN107203305A (en) * 2017-05-03 2017-09-26 努比亚技术有限公司 It is switched fast method, mobile terminal and the computer-readable recording medium of application
WO2019037359A1 (en) * 2017-08-24 2019-02-28 华为技术有限公司 Split-screen display method, device and terminal
CN109471603A (en) * 2017-09-07 2019-03-15 华为终端(东莞)有限公司 A kind of interface display method and device
CN108694073B (en) * 2018-05-11 2023-01-17 腾讯科技(深圳)有限公司 Control method, device and equipment of virtual scene and storage medium
CN110180169B (en) * 2019-05-30 2020-11-10 网易(杭州)网络有限公司 Method and device for displaying fighting picture in game, storage medium and electronic equipment
CN110764670B (en) * 2019-11-06 2021-04-23 网易(杭州)网络有限公司 Method and device for processing name information of virtual object, medium and electronic equipment
CN111208929B (en) * 2020-01-03 2021-11-02 广州虎牙科技有限公司 Response method, device and equipment of multi-level interface and storage medium
CN111399743B (en) * 2020-03-18 2022-05-27 网易(杭州)网络有限公司 Display control method and device in game


Also Published As

Publication number Publication date
CN112486381A (en) 2021-03-12
CN112486381B (en) 2021-11-30

Similar Documents

Publication Publication Date Title
CN111185004B (en) Game control display method, electronic device and storage medium
US11776352B2 (en) Graphical user interface for a gaming system
CN107930122B (en) Information processing method, device and storage medium
US11559736B2 (en) Response method, apparatus and terminal to a control
CN111388998A (en) Display control method of game virtual weapon control, electronic equipment and storage medium
WO2022121528A1 (en) Interaction information processing method and apparatus, terminal, storage medium, and program product
CN112486382B (en) Interface logic execution method and device, electronic equipment and medium
US20230330536A1 (en) Object control method and apparatus for virtual scene, electronic device, computer program product, and computer-readable storage medium
CN111450527A (en) Information processing method and device
US9174132B2 (en) Electronic game device, electronic game processing method, and non-transitory computer-readable storage medium storing electronic game program
US20230330525A1 (en) Motion processing method and apparatus in virtual scene, device, storage medium, and program product
US20230350554A1 (en) Position marking method, apparatus, and device in virtual scene, storage medium, and program product
CN113198177A (en) Game control display method and device, computer storage medium and electronic equipment
CN112486381B (en) Interface logic execution method and device, electronic equipment and medium
CN110465084B (en) Game interface switching method and device, terminal equipment and storage medium
CN110647268A (en) Control method and control device for display window in game
JP2020058666A (en) Game program, method, and information processing device
CN116510287A (en) Game control method, game control device, electronic equipment and storage medium
CN115243093B (en) Video bullet screen processing method and device, storage medium and electronic device
JP2020058667A (en) Game program, method, and information processing device
JP2020058668A (en) Game program, method, and information processing device
CN116459519A (en) Method and device for controlling virtual character in game, storage medium and electronic device
CN112328162A (en) Method, system, platform and storage medium for sliding touch screen
CN117482528A (en) Method, device, equipment and storage medium for processing summarized information of virtual scene
CN116832438A (en) Virtual object control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination