CN114712849B - Cross-platform application operation method and device, electronic equipment and storage medium - Google Patents
Cross-platform application operation method and device, electronic equipment and storage medium
- Publication number
- CN114712849B (application number CN202210529407.8A)
- Authority
- CN
- China
- Prior art keywords
- interface
- interface element
- interactive
- interactable
- target game
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the present application provide a cross-platform application operation method and apparatus, an electronic device, and a storage medium, relating to the field of computer technologies and applied to a first terminal device. The method comprises: acquiring an interface element database corresponding to a target game application; in response to a focus movement instruction input by a user through a remote controller, determining, from the interactive interface elements corresponding to a first interface, a second interactive interface element corresponding to the focus movement instruction; and displaying a preset cursor symbol in association with the second interactive interface element, and determining, according to the focus movement instruction, a second interface that currently needs to be displayed. With the method and apparatus, the target game application can run on the first terminal without being redesigned, avoiding wasted human resources and keeping costs low.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a cross-platform application operating method and apparatus, an electronic device, and a storage medium.
Background
With the continuous development of science and technology, many high-quality mobile games have been developed. However, most mobile games cannot run on a smart television. In the related art, running a mobile game on a television requires redesigning its interface, controls, and other parts and modifying more than one third of its code, which consumes considerable human resources and is costly.
Disclosure of Invention
Various aspects of the present application provide a cross-platform application operation method and apparatus, an electronic device, and a storage medium, which allow a target game application to run on a first terminal without being redesigned, thereby avoiding wasted human resources and reducing cost.
An embodiment of the present application provides a cross-platform application operation method applied to a first terminal device, where the first terminal device supports a remote-controller interaction mode and has installed in it a target game application that runs on a second terminal device supporting a touch interaction mode. The method comprises the following steps:
acquiring an interface element database corresponding to the target game application, where the interface element database stores the interactive interface elements corresponding to each interface of the target game application, and for each interface one of its interactive interface elements is configured as the initial focus;
in response to the user starting the target game application, displaying, in a first interface that currently needs to be displayed, a preset cursor symbol in association with a first interactive interface element serving as the initial focus of the first interface;
in response to a focus movement instruction input by the user through a remote controller, determining, from the interactive interface elements corresponding to the first interface, a second interactive interface element corresponding to the focus movement instruction;
and displaying the preset cursor symbol in association with the second interactive interface element, and determining, according to the focus movement instruction, a second interface that currently needs to be displayed.
An embodiment of the present application further provides a cross-platform application operating apparatus applied to a first terminal device, where the first terminal device supports a remote-controller interaction mode and has installed in it a target game application that runs on a second terminal device supporting a touch interaction mode. The apparatus includes:
an obtaining module, configured to acquire an interface element database corresponding to the target game application, where the interface element database stores the interactive interface elements corresponding to each interface of the target game application, and for each interface one of its interactive interface elements is configured as the initial focus;
a display module, configured to display, in response to the user starting the target game application, a preset cursor symbol in association with a first interactive interface element serving as the initial focus of a first interface that needs to be displayed;
a determining module, configured to determine, in response to a focus movement instruction input by the user through a remote controller, a second interactive interface element corresponding to the focus movement instruction from the interactive interface elements corresponding to the first interface;
and an operation module, configured to display the preset cursor symbol in association with the second interactive interface element and to determine, according to the focus movement instruction, a second interface that currently needs to be displayed.
The embodiments of the present application are applied to a first terminal device that supports a remote-controller interaction mode and has installed in it a target game application that runs on a second terminal device supporting a touch interaction mode. By acquiring the interface element database corresponding to the target game application, the method obtains the interactive interface elements that the database stores for each interface of the target game application. In response to the user starting the target game application, a preset cursor symbol is displayed, in the first interface that currently needs to be displayed, in association with the first interactive interface element serving as the initial focus of that interface. In response to a focus movement instruction input by the user through a remote controller, a second interactive interface element corresponding to the instruction is determined from the interactive interface elements of the first interface. The preset cursor symbol is then displayed in association with the second interactive interface element, and the second interface that currently needs to be displayed is determined according to the focus movement instruction, completing the cross-platform operation of the target game application. The target game application can thus run on the first terminal without being redesigned, avoiding wasted human resources at low cost.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of an operation method of a cross-platform application according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating obtaining an interface element database corresponding to a target game application according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a cross-platform application operating device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to specific embodiments and the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present application.
With the continuous development of science and technology, many high-quality mobile games have been developed. However, most mobile games cannot run on a smart television. At present, running a mobile game on a television requires redesigning its interface, controls, and other parts and modifying more than one third of its code, which consumes considerable human resources and is costly.
In view of this, an embodiment of the present application provides a cross-platform application operation method applied to a first terminal device, where the first terminal device supports a remote-controller interaction mode and has installed in it a target game application that runs on a second terminal device supporting a touch interaction mode. By acquiring the interface element database corresponding to the target game application, the method obtains the interactive interface elements that the database stores for each interface of the target game application. In response to the user starting the target game application, a preset cursor symbol is displayed, in the first interface that currently needs to be displayed, in association with the first interactive interface element serving as the initial focus of that interface. In response to a focus movement instruction input by the user through a remote controller, a second interactive interface element corresponding to the instruction is determined from the interactive interface elements of the first interface. The preset cursor symbol is then displayed in association with the second interactive interface element, and the second interface that currently needs to be displayed is determined according to the focus movement instruction, completing the cross-platform operation of the target game application. The target game application can thus run on the first terminal without being redesigned, avoiding wasted human resources at low cost.
Fig. 1 is a schematic flowchart of a cross-platform application operation method according to an embodiment of the present application. The method is applied to a first terminal device, where the first terminal device supports a remote-controller interaction mode and has installed in it a target game application that runs on a second terminal device supporting a touch interaction mode. The target game application may be a three-dimensional game, for example a Unity3D game. The first terminal device may be any terminal device supporting a remote-controller interaction mode, such as a smart television or a projection device, and the second terminal device may be any terminal device supporting a touch interaction mode, such as a smartphone or a tablet computer, which are not enumerated here. As shown in Fig. 1, the method includes:
It should be understood that the target game application has multiple interfaces. In this embodiment, the interface elements include not only the interactive interface elements corresponding to each interface of the target game application but also the non-interactive interface elements. For example, in a game application the interface elements may include trees, clouds, buildings, road surfaces, characters, and so on. An interactive interface element is an interface element that can interact with the user, that is, one that feeds back on an operation the user triggers on it. For example, when a user clicks on a character and the character initiates a conversation, the character is an interactive interface element. Conversely, if an interface element gives no feedback when the user triggers an operation on it, it is a non-interactive interface element.
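The distinction above can be sketched in code. The following is a minimal illustrative model, not taken from the patent itself; the names (`InterfaceElement`, `on_click`) are hypothetical, and the assumption is simply that an element counts as interactive when some feedback callback is attached to it:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class InterfaceElement:
    """Hypothetical model of one interface element in a game interface."""
    name: str
    # Feedback triggered by a user operation, if the element has any.
    on_click: Optional[Callable[[], str]] = None

    def is_interactive(self) -> bool:
        # An element that feeds back on a user operation is interactive.
        return self.on_click is not None

# A character initiates a conversation when clicked; a tree gives no feedback.
character = InterfaceElement("character", on_click=lambda: "initiates a conversation")
tree = InterfaceElement("tree")

print(character.is_interactive())  # interactive interface element
print(tree.is_interactive())       # non-interactive interface element
```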
Since all subsequent interactions with the user involve only interactive interface elements, the non-interactive interface elements need to be filtered out to obtain an interface element database that contains no non-interactive interface elements. Based on this, Fig. 2 provides a flowchart of acquiring the interface element database corresponding to the target game application. As shown in Fig. 2, acquiring the interface element database corresponding to the target game application includes:
In this embodiment, all the interface elements corresponding to any interface include both the interactive interface elements and the non-interactive interface elements.
In this embodiment, the display characteristics include at least: the rendering mode, the rendering camera, the level of the layer the element belongs to, and the element's level within that layer. These are illustrated as follows. For the rendering mode, assume the interface element is a "button"; the rendering mode indicates whether this "button" is drawn in 3D or in the plane. For the rendering camera, note that each interface element is displayed differently from different viewing angles, and a rendering camera is required to capture the picture at each angle. For the level of the layer, assume the target game application includes a background layer (trees, clouds, and the like), a map layer (buildings, road surfaces, game characters, and the like), and an operation layer (buttons, direction keys, and the like). The levels of these layers may be determined by how important they are for interaction with the user; for example, when the user plays the game on the first terminal (e.g., a television), the operation layer matters more than the background layer, so its level is higher. For levels within a layer, take the road surface and a game character within the map layer: within the same layer, if the game character is more important to the game than the road surface, the game character's level may be set higher than the road surface's.
In practical applications, after comprehensively considering the rendering mode, the rendering camera, the level of the layer, and the level within the layer, the interface elements are sorted by level and only the highest-level interface elements are kept; that is, the interface elements whose display characteristics meet the set conditions are screened out. How these four factors are weighed to sort the interface elements can be configured in advance by the developer according to the game scene during game development. The level examples above are given only for ease of understanding and are not limiting.
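As a rough illustration of this sorting, the sketch below ranks elements by a composite level key (layer level first, then level within the layer) and keeps only the top layer. The element names and numeric levels are assumptions chosen to match the layer example above, not values given in the patent:

```python
# (name, layer_level, level_within_layer); higher numbers mean higher level.
elements = [
    ("cloud",     1, 0),  # background layer
    ("road",      2, 0),  # map layer
    ("character", 2, 1),  # map layer, above the road surface
    ("button",    3, 0),  # operation layer, most important on a TV
]

# Sort by (layer level, in-layer level) and keep only the top layer's elements,
# i.e. the elements whose display characteristics meet the set condition here.
ranked = sorted(elements, key=lambda e: (e[1], e[2]), reverse=True)
top_layer = max(e[1] for e in elements)
screened = [name for name, layer, _ in elements if layer == top_layer]

print(ranked[0][0])  # highest-level element
print(screened)
```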
Step 203: acquiring the event types respectively associated with the screened interface elements whose display characteristics meet the set conditions.
It should be understood that the purpose of the present application is ultimately to retain only the interface elements the user can interact with and operate. Therefore, after the interface elements whose display characteristics meet the set conditions are screened out in step 202, they need to be screened further: the event types associated with these elements are acquired so that the elements able to interact with the user (those with associated event types) can be identified.
Specifically, assume the interface elements of the target game application include roads, characters, trees, and so on. Roads and trees generally serve only a decorative role; put simply, clicking them produces no reaction. By contrast, after a user clicks on a character, the character typically gives feedback on the click, such as popping up a dialog or performing some action.
Step 204: according to the event types, screening, from the screened interface elements whose display characteristics meet the set conditions, the interface elements whose event type is an interactive-operation type, and using them as the interactive interface elements corresponding to the interface.
Following the example in step 203, after the user clicks on the character, the character generally feeds back on the click, such as popping up a dialog or performing some action, so it can be determined that the event type associated with the character is an interactive-operation type. The interactive operation may be triggered by clicking, touching, dragging, and the like.
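Continuing the sketch, the event-type screening of steps 203-204 could look like the following; the event-type table is illustrative, and the set of interactive-operation types (click, touch, drag) is taken from the examples in the text:

```python
# Interactive-operation types mentioned in the text.
INTERACTIVE_TYPES = {"click", "touch", "drag"}

# Hypothetical event types associated with the screened interface elements;
# None means the element has no associated event (purely decorative).
event_types = {
    "road": None,
    "tree": None,
    "character": "click",  # pops up a dialog when clicked
    "button": "touch",
}

def interactive_elements(event_types):
    """Keep only elements whose associated event type is interactive."""
    return sorted(name for name, etype in event_types.items()
                  if etype in INTERACTIVE_TYPES)

print(interactive_elements(event_types))
```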
In practical applications, the interface element database may be obtained from a game engine of the target game application.
Step 102: in response to the user starting the target game application, displaying, in the first interface that currently needs to be displayed, the preset cursor symbol in association with the first interactive interface element serving as the initial focus.
It should be understood that after the user starts the target game application, a preset cursor symbol appears in the first interface that currently needs to be displayed and is displayed in association with the first interactive interface element serving as the initial focus; visually, the preset cursor symbol sits on the first interactive interface element, so the user can tell exactly which interface element currently has the focus. The preset cursor may take various shapes, for example square or circular, which are not enumerated here. Note that the preset cursor corresponds to the focus and moves as the focus moves.
The initial focus may be configured in advance by the developer of the target game application or set by the user. For example, when the current interface of the target game application is a "login interface", the initial focus is by default on the "Login" button (the "Login" button being an interface element).
Step 103: in response to a focus movement instruction input by the user through the remote controller, determining, from the interactive interface elements corresponding to the first interface, a second interactive interface element corresponding to the focus movement instruction.
In practical applications, after the user inputs a focus movement instruction through the remote controller, in order to move the focus from the first interactive interface element to the second, the second interactive interface element corresponding to the instruction must be determined among the interactive interface elements of the first interface.
The second interactive interface element is specifically determined as follows:
according to the movement direction corresponding to the focus movement instruction and a set query angle range, determining, from the interactive interface elements corresponding to the first interface, the second interactive interface element that matches the movement direction and the query angle range and is closest to the first interactive interface element.
For example, when the user presses the right key on the remote control, the user wants to move the focus to an interface element to the right of the current one. The system may then, based on the right-key press, find the interface element closest to the current element within a sector on its right side whose angle lies in the range of 30°-150° (e.g., 60°, 90°, or 150°), and take it as the second interactive interface element.
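One way such a directional query can be implemented is sketched below, under the assumption (not spelled out in the patent) that each element has a 2-D position, that the query keeps candidates whose bearing from the focused element falls inside a sector centred on the movement direction, and then picks the nearest one:

```python
import math

def find_focus_target(current, candidates, direction_deg, sector_deg=90.0):
    """Return the name of the nearest candidate whose bearing from `current`
    lies within +/- sector_deg/2 of the movement direction."""
    cx, cy = current
    best, best_dist = None, float("inf")
    for name, (x, y) in candidates.items():
        dx, dy = x - cx, y - cy
        bearing = math.degrees(math.atan2(dy, dx)) % 360.0
        diff = abs(bearing - direction_deg) % 360.0
        diff = min(diff, 360.0 - diff)  # smallest angle between the two directions
        if diff <= sector_deg / 2:
            dist = math.hypot(dx, dy)
            if dist < best_dist:
                best, best_dist = name, dist
    return best

# Pressing "right" (direction 0 degrees) from the focused element at (0, 0):
candidates = {"create_role": (5, 1), "delete_role": (9, -1), "back": (-4, 0)}
print(find_focus_target((0, 0), candidates, direction_deg=0))
```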
In practical applications, however, two or more interface elements in that direction may be equally close to the currently focused element. In that case, the second interactive interface element is determined as follows:
if, among the interactive interface elements corresponding to the first interface, at least two interactive interface elements correspond to the focus movement instruction, the second interactive interface element is determined from the at least two elements according to the function of the first interface and the functions of the at least two elements within the first interface.
Specifically, suppose the first interface is a "create role" interface, and two interface elements, a "Create role" button and a "Delete role" button, are equally close to the right of the currently focused element. Since the first interface is the "create role" interface, the interface element corresponding to the "Create role" button is preferentially determined to be the second interactive interface element.
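A crude sketch of that tie-break follows, assuming (purely for illustration) that matching an element's function against the interface's function can be approximated by a textual match:

```python
def tie_break(interface_function, equally_near):
    """Among equally near candidates, prefer the one whose function matches
    the interface's function; otherwise fall back to the first candidate."""
    for name in equally_near:
        if interface_function in name:  # crude textual match, an assumption
            return name
    return equally_near[0]

# On the "create role" interface, prefer the "create role" button.
print(tie_break("create role", ["delete role", "create role"]))
```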
In this embodiment of the present application, to facilitate subsequent calls to the interactive interface elements, the method further includes:
in response to the user starting the target game application, acquiring the interactive interface elements corresponding to the first interface from the interface element database;
and adding the interactive interface elements corresponding to the first interface into a set container component.
Determining the second interactive interface element corresponding to the focus movement instruction from the interactive interface elements corresponding to the first interface then includes:
determining the second interactive interface element corresponding to the focus movement instruction from the interactive interface elements, corresponding to the first interface, that are contained in the container component.
Specifically, the container component may be a section of code used for storage.
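The container component might be modelled, very loosely, as a small class that holds the interactive interface elements of the interface currently on screen; the class and method names here are illustrative only, not from the patent:

```python
class FocusContainer:
    """Holds the interactive interface elements of the current interface."""

    def __init__(self):
        self._elements = []

    def load_interface(self, elements):
        # Store the interactive elements of the interface to be displayed,
        # so that later focus queries only consult this container.
        self._elements = list(elements)

    def elements(self):
        return list(self._elements)

container = FocusContainer()
container.load_interface(["login", "create role", "delete role"])
print(container.elements())
```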
Step 104: displaying the preset cursor symbol in association with the second interactive interface element, and determining, according to the focus movement instruction, the second interface that currently needs to be displayed.
In practical applications, after the second interactive interface element corresponding to the focus movement instruction is determined from the interactive interface elements corresponding to the first interface, the preset cursor symbol is displayed in association with the second interactive interface element, and the second interface that currently needs to be displayed can be determined according to the focus movement instruction.
For example, when the user moves the focus from the "Delete role" button to the "Create role" button according to the focus movement instruction, the interface changes; that is, the second interface that currently needs to be displayed is determined. After entering the second interface, the above steps are repeated while waiting for the user's next operation instruction.
In this embodiment of the present application, to avoid situations where overlapping interface elements prevent the user from operating, the method further includes:
acquiring the interactive interface elements corresponding to the second interface from the interface element database;
and updating the container component with the interactive interface elements corresponding to the second interface, while removing the interactive interface elements corresponding to the first interface from the container component.
For example, suppose the first interface presents an interactive interface element "backpack", which contains a "treasure box", a "lucky bag", a "weapon", and so on; these are then the interactive interface elements of the first interface, and the "treasure box" in turn contains a "ruby", an "emerald", "gold coins", and so on. After the user clicks the "treasure box" in the first interface, a new interface (the second interface) pops up covering the position of the "treasure box", and interactive interface elements such as the "ruby", "emerald", and "gold coins" are displayed directly in the second interface. At this point the "treasure box" can be removed from the container component. It should be appreciated that if the "treasure box" were not removed, then clicking an interactive interface element located at the same position as the "treasure box" could easily select the wrong element among the "ruby", "emerald", and "gold coins".
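The update-and-remove step in this "backpack" example can be sketched as a simple swap of element sets in the container; the set semantics used here are an assumption for illustration:

```python
# Interactive elements of the first interface (the opened "backpack").
container = {"treasure box", "lucky bag", "weapon"}

def switch_interface(container, old_elements, new_elements):
    """Remove the covered first-interface elements and add the
    second-interface elements, so overlapped elements cannot steal clicks."""
    container.difference_update(old_elements)
    container.update(new_elements)
    return container

# The popped-up second interface covers the "treasure box" and shows its contents.
switch_interface(container,
                 old_elements={"treasure box", "lucky bag", "weapon"},
                 new_elements={"ruby", "emerald", "gold coin"})
print(sorted(container))
```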
To sum up, the present application is applied to a first terminal device that supports a remote-controller interaction mode and has installed in it a target game application that runs on a second terminal device supporting a touch interaction mode. By acquiring the interface element database corresponding to the target game application, the method obtains the interactive interface elements that the database stores for each interface of the target game application. In response to the user starting the target game application, a preset cursor symbol is displayed, in the first interface that currently needs to be displayed, in association with the first interactive interface element serving as the initial focus of that interface. In response to a focus movement instruction input by the user through a remote controller, a second interactive interface element corresponding to the instruction is determined from the interactive interface elements of the first interface. The preset cursor symbol is then displayed in association with the second interactive interface element, and the second interface that currently needs to be displayed is determined according to the focus movement instruction, completing the cross-platform operation of the target game application. The target game application can thus run on the first terminal without being redesigned, avoiding wasted human resources at low cost.
Taking the first terminal being a smart television and the second terminal being a smartphone as an example, in practical application the above scheme only needs to be packaged into a program package and that package embedded into the target game application; the otherwise complex cross-platform porting work is then completed in a few simple steps, ensuring that the target game application runs smoothly on the smart television and greatly improving working efficiency.
Based on the same inventive concept, an embodiment of the present application further provides a cross-platform application operating apparatus, as in the following embodiments. Because the principle by which the cross-platform application operating apparatus solves the problem is similar to that of the cross-platform application operating method, the implementation of the apparatus can refer to the implementation of the method, and repeated details are not described again. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the embodiments below is preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a schematic structural diagram of a cross-platform application operating device according to an embodiment of the present application, where the cross-platform application operating device is applied to a first terminal device, the first terminal device supports a remote controller interaction mode, and a target game application running on a second terminal device supporting a touch interaction mode is installed in the first terminal device, as shown in fig. 3, the device includes:
the obtaining module 301 is configured to obtain an interface element database corresponding to the target game application, where the interface element database stores the interactable interface elements corresponding to each interface of the target game application, and an interactable interface element serving as the initial focus is configured among the interactable interface elements corresponding to each interface.
The display module 302 is configured to, in response to the user starting the target game application, display, in the first interface that currently needs to be displayed, a preset cursor symbol in association with a first interactable interface element serving as the initial focus in the first interface.
The determining module 303 is configured to determine, in response to a focus movement instruction input by a user through a remote controller, a second interactable interface element corresponding to the focus movement instruction from the interactable interface elements corresponding to the first interface.
The operation module 304 is configured to display the preset cursor symbol in association with the second interactable interface element, and determine, according to the focus movement instruction, the second interface that currently needs to be displayed.
In this embodiment of the application, the determining module 303 is further configured to:
according to the movement direction corresponding to the focus movement instruction and a set query angle range, determine, from the interactable interface elements corresponding to the first interface, a second interactable interface element that matches the movement direction and the query angle range and is closest to the first interactable interface element.
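This directional query can be sketched as follows — a hedged illustration, assuming 2D element coordinates in a y-up mathematical coordinate system and a hypothetical `find_in_direction` helper (on-screen coordinates with y growing downward would flip the "up"/"down" angles). Among all elements whose bearing from the current focus falls within the set query angle range of the movement direction, the nearest one wins:

```python
import math
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    x: float
    y: float

# bearings of the four movement directions, in degrees (y-up convention)
DIRECTION_ANGLES = {"right": 0.0, "up": 90.0, "left": 180.0, "down": 270.0}

def find_in_direction(elements, focus, direction, query_angle=45.0):
    """Return the element nearest to `focus` whose bearing lies within
    ±query_angle degrees of the movement direction, or None if no element
    matches both the direction and the angle range."""
    target_angle = DIRECTION_ANGLES[direction]
    best, best_dist = None, math.inf
    for e in elements:
        if e is focus:
            continue
        dx, dy = e.x - focus.x, e.y - focus.y
        bearing = math.degrees(math.atan2(dy, dx)) % 360.0
        # smallest angular difference between bearing and target, wrap-around safe
        diff = min(abs(bearing - target_angle), 360.0 - abs(bearing - target_angle))
        dist = math.hypot(dx, dy)
        if diff <= query_angle and dist < best_dist:
            best, best_dist = e, dist
    return best
```

Widening `query_angle` makes diagonal neighbors reachable; narrowing it makes navigation stricter, which is the trade-off the set query angle range controls.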
In this embodiment of the application, the determining module 303 is further configured to:
if the interactable interface elements corresponding to the first interface include at least two interactable interface elements corresponding to the focus movement instruction, determine the second interactable interface element from the at least two according to the function corresponding to the first interface and the functions of the at least two interactable interface elements in the first interface.
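One way such a function-based tie-break could look is sketched below. Everything here is an assumption for illustration — the priority table, the `"shop"` interface name, and the function labels are invented, since the patent only says the choice depends on the interface's function and the candidates' functions:

```python
# Hypothetical per-interface priority table (not from the patent): for a given
# interface, rank the functions its elements serve so that a deterministic
# choice can be made among several direction-matched candidates.
INTERFACE_PRIORITY = {
    "shop": ["confirm_purchase", "item_slot", "close"],
}

def pick_by_function(interface_name, candidates, function_of):
    """Among several candidates matching the focus movement instruction,
    prefer the one whose function ranks highest for this interface."""
    order = INTERFACE_PRIORITY.get(interface_name, [])
    def rank(element):
        fn = function_of(element)
        # unranked functions sort after all ranked ones
        return order.index(fn) if fn in order else len(order)
    return min(candidates, key=rank)
```

Because `min` is stable, candidates whose functions are equally ranked fall back to their original order, so the result stays deterministic.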
In an embodiment of the present application, the apparatus is further configured to:
in response to the user starting the target game application, acquire the interactable interface elements corresponding to the first interface from the interface element database;
add the interactable interface elements corresponding to the first interface into a set container component;
and the determining of a second interactable interface element corresponding to the focus movement instruction from the interactable interface elements corresponding to the first interface includes:
determining the second interactable interface element corresponding to the focus movement instruction from the interactable interface elements corresponding to the first interface contained in the container component.
In an embodiment of the present application, the apparatus is further configured to:
acquire the interactable interface elements corresponding to the second interface from the interface element database;
and update the interactable interface elements corresponding to the second interface into the container component, and remove the interactable interface elements corresponding to the first interface from the container component.
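A minimal sketch of such a container component is given below, under assumed names (`ElementContainer`, a plain dict as the element database); the patent does not specify the data structure. Keeping only the current interface's elements in the container is what prevents the mis-selection described earlier, where elements of a covered interface would otherwise compete with co-located elements of the new one:

```python
class ElementContainer:
    """Illustrative container component: holds only the interactable
    elements of the interface currently on screen, so focus queries never
    see elements hidden under a covering interface."""

    def __init__(self):
        self._elements = []

    def load(self, elements):
        # add the current interface's interactable elements
        self._elements = list(elements)

    def switch_to(self, element_db, interface_name):
        # update to the new interface's elements and, in the same step,
        # remove the previous interface's elements
        self._elements = list(element_db[interface_name])

    def elements(self):
        # snapshot for focus queries
        return list(self._elements)
```

Replacing the whole list in `switch_to` performs the "update and remove" as one atomic step, so there is no moment at which elements of both interfaces coexist in the container.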
In this embodiment of the application, the obtaining module 301 is further configured to:
for any interface, acquire all interface elements corresponding to that interface;
according to the display characteristics associated with each of those interface elements, screen out, from all the interface elements, the interface elements whose display characteristics meet set conditions;
acquire the event types associated with the screened-out interface elements whose display characteristics meet the set conditions;
and, according to the event types, screen out, from those interface elements, the interface elements whose event type is an interactive operation type, as the interactable interface elements corresponding to that interface.
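The two-stage screening just described can be sketched as below. The field names (`render_mode`, `visible`, `event_type`) and the set of interactive event types are illustrative assumptions, not values taken from the patent, which only requires that display characteristics meet set conditions and that the event type be an interactive operation type:

```python
# hypothetical set of event types counted as interactive operations
INTERACTIVE_EVENT_TYPES = {"click", "drag", "long_press"}

def screen_interactable(all_elements):
    """Filter an interface's elements down to its interactable ones:
    stage 1 keeps elements whose display characteristics meet the set
    conditions (here: a visible screen overlay); stage 2 keeps those
    bound to an interactive event type."""
    displayed = [e for e in all_elements
                 if e.get("render_mode") == "screen_overlay" and e.get("visible")]
    return [e for e in displayed
            if e.get("event_type") in INTERACTIVE_EVENT_TYPES]
```

Running the display-characteristic filter first keeps the second stage cheap: event types only need to be fetched for elements that are actually rendered.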
In order to achieve the above object, according to another aspect of the present application, an electronic device is also provided. The electronic device may be a smart television, a smart projector, or the like. As shown in fig. 4, the electronic device includes a processor and a memory for storing processor-executable instructions, wherein the processor is configured to execute the instructions to implement the cross-platform application operation method described above.
The processor may be a central processing unit (CPU). The processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof.
The memory, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and units, such as the program units corresponding to the method embodiments of the present application described above. By executing the non-transitory software programs, instructions, and modules stored in the memory, the processor executes various functional applications and processes working data, that is, implements the method in the above method embodiments.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor, and the like. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be coupled to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more units are stored in the memory and when executed by the processor perform the method of the above embodiments.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program for executing the above method is stored.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of another identical element in the process, method, article, or apparatus that comprises that element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement or the like made within the spirit and principle of the present application shall be included in the scope of the claims of the present application.
Claims (8)
1. A cross-platform application operation method is applied to a first terminal device, the first terminal device supports a remote controller interaction mode, and a target game application running on a second terminal device supporting a touch control interaction mode is installed in the first terminal device, and is characterized by comprising the following steps:
acquiring an interface element database corresponding to the target game application, where an interactive interface element corresponding to each interface of the target game application is stored in the interface element database, where an interactive interface element serving as an initial focus is configured in the interactive interface element corresponding to each interface, and the acquiring the interface element database corresponding to the target game application includes: aiming at any interface, acquiring all interface elements corresponding to the interface; according to the display characteristics associated with each interface element in all the interface elements, screening out the interface elements with the display characteristics meeting set conditions from all the interface elements; acquiring the event types respectively associated with the interface elements with the screened display characteristics meeting the set conditions; according to the event type, screening interface elements with the event type being an interactive operation type from the screened interface elements with the display characteristics meeting set conditions as interactive interface elements corresponding to any interface, wherein the display characteristics at least comprise: rendering mode, rendering camera, level of the layer to which it belongs, level within the layer;
responding to the starting of the target game application by a user, and displaying a preset cursor symbol and a first interactive interface element which is taken as an initial focus in a first interface to be displayed in a related mode in the first interface;
responding to a focus moving instruction input by the user through a remote controller, and determining a second interactive interface element corresponding to the focus moving instruction from the interactive interface elements corresponding to the first interface;
and displaying the preset cursor symbol and the second interactive interface element in a correlation manner, and determining a second interface which needs to be displayed currently according to the focus moving instruction.
2. The method of claim 1, wherein the determining a second interactable interface element corresponding to the focus movement instruction from the interactable interface elements corresponding to the first interface comprises:
and according to the moving direction corresponding to the focus moving instruction and a set query angle range, determining a second interactive interface element which is matched with the moving direction and the query angle range and is closest to the first interactive interface element from the interactive interface elements corresponding to the first interface.
3. The method according to claim 1 or 2, wherein the determining a second interactable interface element corresponding to the focus movement instruction from the interactable interface elements corresponding to the first interface comprises:
if the interactable interface elements corresponding to the first interface include at least two interactable interface elements corresponding to the focus movement instruction, determining the second interactable interface element from the at least two interactable interface elements according to the function corresponding to the first interface and the functions of the at least two interactable interface elements corresponding to the first interface.
4. The method of claim 1, further comprising:
responding to the starting of the target game application by a user, and acquiring an interactive interface element corresponding to the first interface from the interface element database;
adding the interactive interface elements corresponding to the first interface into a setting container component;
the determining, from the interactable interface elements corresponding to the first interface, a second interactable interface element corresponding to the focus movement instruction includes:
determining a second interactable interface element corresponding to the focus movement instruction from the interactable interface elements corresponding to the first interface contained in the container component.
5. The method of claim 4, further comprising:
acquiring an interactive interface element corresponding to the second interface from the interface element database;
and updating the interactive interface element corresponding to the second interface into the container component, and removing the interactive interface element corresponding to the first interface from the container component.
6. A cross-platform application operating device is applied to a first terminal device, the first terminal device supports a remote controller interaction mode, and a target game application running on a second terminal device supporting a touch control interaction mode is installed in the first terminal device, and the cross-platform application operating device is characterized by comprising:
an obtaining module, configured to obtain an interface element database corresponding to the target game application, where an interactable interface element corresponding to each interface of the target game application is stored in the interface element database, where an interactable interface element serving as an initial focus is configured in the interactable interface element corresponding to each interface, and the obtaining of the interface element database corresponding to the target game application includes: aiming at any interface, acquiring all interface elements corresponding to the interface; according to the display characteristics associated with each interface element in all the interface elements, screening out the interface elements with the display characteristics meeting set conditions from all the interface elements; acquiring event types associated with the screened interface elements with the display characteristics meeting set conditions; according to the event type, screening out interface elements with the event type being an interactive operation type from the interface elements with the screened display characteristics meeting set conditions as interactive interface elements corresponding to any interface, wherein the display characteristics at least comprise: rendering mode, rendering camera, level of the layer to which it belongs, level within the layer;
the display module is used for responding to the starting of the target game application by the user and displaying a preset cursor symbol and a first interactive interface element which is taken as an initial focus in a first interface needing to be displayed in a correlated mode;
the determining module is used for responding to a focus moving instruction input by the user through a remote controller, and determining a second interactive interface element corresponding to the focus moving instruction from the interactive interface elements corresponding to the first interface;
and the operation module is used for displaying the preset cursor symbol and the second interactive interface element in a correlation manner, and determining a second interface which needs to be displayed currently according to the focus movement instruction.
7. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the cross-platform application operation method of any one of claims 1 to 5.
8. A computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the cross-platform application operating method of any of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210529407.8A CN114712849B (en) | 2022-05-16 | 2022-05-16 | Cross-platform application operation method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210529407.8A CN114712849B (en) | 2022-05-16 | 2022-05-16 | Cross-platform application operation method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114712849A CN114712849A (en) | 2022-07-08 |
CN114712849B true CN114712849B (en) | 2022-10-21 |
Family
ID=82231663
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210529407.8A Active CN114712849B (en) | 2022-05-16 | 2022-05-16 | Cross-platform application operation method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114712849B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015114117A1 (en) * | 2014-01-31 | 2015-08-06 | King.Com Limited | Controlling a user interface of a computer device |
CN105808123A (en) * | 2016-03-16 | 2016-07-27 | 闵进芳 | Intelligent terminal interaction system and method based on remote control device |
CN110187953A (en) * | 2019-06-05 | 2019-08-30 | 北京视游互动科技有限公司 | A kind of operation method and device of application program |
CN112511874A (en) * | 2020-11-12 | 2021-03-16 | 北京视游互动科技有限公司 | Game control method, smart television and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104536670B (en) * | 2014-09-16 | 2018-04-20 | 华为技术有限公司 | Exchange method and relevant apparatus based on user interface |
CN106075904B (en) * | 2016-06-07 | 2019-01-04 | 腾讯科技(深圳)有限公司 | Method and device, terminal, the system of cross-platform game fighting |
US10929003B1 (en) * | 2019-08-12 | 2021-02-23 | Microsoft Technology Licensing, Llc | Cross-platform drag and drop user experience |
- 2022-05-16 CN CN202210529407.8A patent/CN114712849B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN114712849A (en) | 2022-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11954811B2 (en) | Augmented reality platform | |
CN107463367B (en) | Transition animation realization method and device | |
Sheikh et al. | Smartphone: Android Vs IOS | |
CN106547420B (en) | Page processing method and device | |
US20160188363A1 (en) | Method, apparatus, and device for managing tasks in multi-task interface | |
AU2017226395A1 (en) | Information display method and device | |
CN109074278B (en) | Validating stateful dynamic links in mobile applications | |
CN109842818A (en) | A kind of video broadcasting method, device, computer equipment and storage medium | |
CN104484352A (en) | Method, device and equipment for quickly searching application program | |
US20160103576A1 (en) | Navigating application interface | |
US11790344B2 (en) | Method and apparatus for displaying identification code of application | |
CN110134452B (en) | Object processing method and device | |
CN110262710A (en) | A kind of display methods of interactive interface, device and equipment | |
CN115691004A (en) | Cargo access method, system and device | |
CN110580124A (en) | Image display method and device | |
CN111324398B (en) | Method, device, terminal and storage medium for processing latest content | |
CN109240678B (en) | Code generation method and device | |
CN110262749A (en) | A kind of web page operation method, apparatus, container, equipment and medium | |
CN114397989A (en) | Parameter value setting method and device, electronic equipment and storage medium | |
CN111897607A (en) | Application interface loading and interaction method, device and storage medium | |
CN115373558A (en) | Screen projection method, device, equipment and storage medium | |
CN110968513B (en) | Recording method and device of test script | |
CN114712849B (en) | Cross-platform application operation method and device, electronic equipment and storage medium | |
CN111309411B (en) | Schedule display method and device | |
CN114518821A (en) | Application icon management method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||