CN115033313A - Terminal application control method, terminal equipment and chip system - Google Patents

Terminal application control method, terminal equipment and chip system

Info

Publication number
CN115033313A
Authority
CN
China
Prior art keywords
terminal device
page
interface
target
target application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110205358.8A
Other languages
Chinese (zh)
Inventor
刘成
李�杰
刘敏
孙玺临
程银柱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110205358.8A
Priority to PCT/CN2021/140198 (WO2022179275A1)
Publication of CN115033313A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/955Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application belongs to the field of terminal technologies and provides a terminal application control method, a terminal device, and a chip system. The method comprises the following steps: a first terminal device sets a tag for each page element in a target application interface and collects the content of the page elements to obtain page layout information; the first terminal device obtains the type and sharing permission of a second terminal device; a server determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier for the rendered page; and the second terminal device opens the address identifier through a browser and displays a target page adapted to the browser interface, where the target page contains the target application interface. With this method, the interface of the target application can be displayed on any second terminal device that supports a browser; the second terminal device need not support the target application itself.

Description

Terminal application control method, terminal equipment and chip system
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for controlling a terminal application, a terminal device, and a chip system.
Background
At present, there are two schemes for cooperatively controlling an APP (application) on a mobile phone source end from multiple clients: multi-client APP cooperative control and screen-projection cooperative control. Multi-client APP cooperative control means installing, on each client, the same APP as on the mobile phone source end, and opening the APP page on the client by sharing links or copying text. Screen-projection cooperative control means projecting the application of the mobile phone source end onto the client through a screen-projection technology; when the application is operated on the client, the mobile phone source end executes the corresponding instruction.
However, the above methods for controlling an APP on a mobile phone source end place high requirements on the clients: each client must download and install the same APP, adapted to its own screen size. Moreover, an APP on the mobile phone source end cannot be controlled from a client that does not support that APP.
Disclosure of Invention
The application provides a terminal application control method, a terminal device, and a chip system, which can display the interface of a target application on any terminal device that supports a browser; the terminal device need not support the target application itself.
To achieve this, the following technical solutions are adopted:
In a first aspect, an embodiment of the present application provides a terminal application control method, including: a first terminal device sets a first tag for each page element in a target application interface and collects the content of the page elements to obtain page layout information, where the page layout information includes the first tags and the content of the page elements; the first terminal device obtains the type of a second terminal device and sends the page layout information and the type of the second terminal device to a server; the server determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier for the rendered page; the first terminal device obtains the address identifier and sends it to the second terminal device; and the second terminal device opens the address identifier through a browser and displays a target page adapted to the browser interface, where the target page includes the target application interface.
In the embodiment of the application, tags are set for the page elements in the target application interface, and the content of the page elements is collected to obtain page layout information. The server then determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier for the rendered page. The first terminal device sends the address identifier to the second terminal device, which opens it through a browser and displays a target page adapted to the browser interface, the target page containing the target application interface. In this way, the interface of the target application on the first terminal device can be shared to and displayed on any second terminal device that supports a browser; the second terminal device need not support the target application itself.
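As a concrete, non-normative illustration, the end-to-end flow of the first aspect can be sketched in Python; all names, the template table, and the URL scheme are invented for illustration and are not part of the patent:

```python
import hashlib

def collect_page_layout(controls):
    """First terminal device: set a first tag for each page element
    (control) and collect its content into page layout information."""
    return [{"tag": f"ctrl-{i}", "content": c} for i, c in enumerate(controls)]

# Hypothetical template table keyed by second-terminal-device type.
TEMPLATES = {"PC": "template-pc", "tablet": "template-tablet",
             "phone": "template-phone", "watch": "template-watch"}

def render_on_server(layout, device_type):
    """Server: pick the target page template for the device type, render
    the page, and derive an address identifier for the rendered page."""
    page = {"template": TEMPLATES[device_type], "elements": layout}
    digest = hashlib.sha1(repr(page).encode()).hexdigest()[:8]
    return f"https://share.example.com/p/{digest}", page

layout = collect_page_layout(["determine", "next", "cancel"])
url, page = render_on_server(layout, "tablet")
```

The second terminal device would then open `url` in its browser; only the server and the browser need to understand the rendered page, not the target application.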
Here, the server may be the interface construction server described below.
With reference to the first aspect, in some embodiments the method further comprises: the second terminal device monitors, through the browser, an operation event acting on the target page and sends the operation event to the server, where the operation event includes the event's data, type, operation manner, and window title; the server sends the operation event to the first terminal device; and the first terminal device updates the page elements of the target application according to the operation event.
In the embodiment of the application, if the user applies a preset operation on the browser interface of the second terminal device, the second terminal device can monitor the operation event through the browser and send it to the server, which forwards it to the first terminal device. The first terminal device then updates the page elements of the target application according to the operation event, so that the second terminal device can reversely control the target application on the first terminal device.
For example, after the second terminal device receives an operation applied to the browser interface (e.g., a click on a control), the browser listens for the operation event; the second terminal device sends the operation event and its tag to the server, and the server forwards them to the first terminal device. The first terminal device determines, according to the tag, the control in the target application that corresponds to the operation event, and executes the event on that control. The operation event may include the event's data, type, operation manner, and window title.
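A possible shape for such an operation event and its dispatch on the first terminal device is sketched below; the field names and dispatch logic are assumptions for illustration, not taken from the patent:

```python
def make_operation_event(tag, data, event_type, manner, window_title):
    # Hypothetical payload shape: the patent names the event's data,
    # type, operation manner, and window title, plus the control's tag.
    return {"tag": tag, "data": data, "type": event_type,
            "manner": manner, "window_title": window_title}

def dispatch_to_control(event, controls_by_tag):
    """First terminal device: find the control matching the event's tag
    and apply the event to it (recorded here instead of executed)."""
    control = controls_by_tag[event["tag"]]
    control.setdefault("events", []).append(event["type"])
    return control

controls = {"btn-ok": {"text": "determine"}}
evt = make_operation_event("btn-ok", None, "click", "touch", "Buy")
updated = dispatch_to_control(evt, controls)
```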
Illustratively, when the interface of the target application on the first terminal device changes, the first terminal device sends the page layout change information of the target application to the server, and the server determines the page change information of the target page template accordingly. The server sends the page change information to the second terminal device, which synchronously refreshes the loaded page with it through the browser.
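This synchronous refresh can be sketched as a diff-and-patch over the tagged elements; the dictionary representation below is an illustrative assumption:

```python
def page_change_info(old_layout, new_layout):
    """Server: derive page change information from the target
    application's layout change (only elements whose content changed)."""
    return {tag: new_layout[tag]
            for tag in new_layout if old_layout.get(tag) != new_layout[tag]}

def refresh_loaded_page(loaded_page, change_info):
    """Second terminal device: patch only the changed elements into the
    already-loaded page instead of reloading the whole page."""
    loaded_page.update(change_info)
    return loaded_page

old = {"title": "Buy", "price": "$10"}
new = {"title": "Buy", "price": "$8"}
changes = page_change_info(old, new)
page = refresh_loaded_page(dict(old), changes)
```

Shipping only the changed elements keeps the refresh lightweight compared with re-rendering and re-opening the whole page.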
In a second aspect, an embodiment of the present application provides a terminal application control method applied to a first terminal device, the method including: setting a first tag for each page element in a target application interface and collecting the content of the page elements to obtain page layout information, where the page layout information includes the first tags and the content of the page elements; obtaining the type of a second terminal device and sending the page layout information and the type of the second terminal device to a server; receiving an address identifier sent by the server, where the address identifier is obtained by the server from a rendered page, the rendered page being obtained by the server determining a target page template according to the type of the second terminal device and rendering the target page template according to the page layout information; and sending the address identifier to the second terminal device, so that the second terminal device opens the address identifier through a browser and displays a target page adapted to the browser interface, where the target page includes the target application interface.
According to this terminal application control method, tags are set for the page elements in the target application interface, and the content of the page elements is collected to obtain page layout information. The server then determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier for the rendered page. The first terminal device sends the address identifier to the second terminal device, which opens it through a browser and displays a target page adapted to the browser interface, the target page containing the target application interface. In this way, the interface of the target application on the first terminal device can be shared to and displayed on any second terminal device that supports a browser, and the second terminal device can adaptively display the interface according to its browser window; the second terminal device need not support the target application itself.
The page elements may be controls in the target application interface; after a tag is set for a control, the text or picture content of the control is collected. For example, if the target application interface includes controls such as "determine", "next", and "cancel", then after the corresponding tags are set for these controls, the text content "determine", "next", "cancel", and so on of each control is collected. The tagged page elements together with their content are recorded as the page layout information.
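A minimal sketch of this tagging-and-collection step, with invented tag names:

```python
def tag_and_collect(controls):
    """Set a first tag for each control and collect its text or picture
    content; the tagged elements plus their content form the page
    layout information."""
    return [{"first_tag": f"{kind}-{i}", "kind": kind, "content": content}
            for i, (kind, content) in enumerate(controls)]

# Controls from the example above, plus a hypothetical image control.
interface = [("Button", "determine"), ("Button", "next"),
             ("Button", "cancel"), ("ImageView", "cover.png")]
page_layout_info = tag_and_collect(interface)
```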
In some embodiments, the type of the second terminal device may include a PC, a tablet, a mobile phone, a watch, and other devices.
With reference to the second aspect, in some embodiments the method further comprises: receiving operation event data sent by the server, where the operation event data includes the event's data, type, operation manner, and window title, and corresponds to an operation event acting on the target page that the server has received; generating a control instruction according to the operation event data; and executing the control instruction to update the page elements of the target application.
Specifically, the operation event data corresponds to an operation event sent by the second terminal device to the server; that operation event is one acting on the target page that the second terminal device monitored through the browser.
In the embodiment of the application, if the user applies a preset operation on the browser interface of the second terminal device, the browser can monitor the operation event; the second terminal device sends the operation event to the server, which forwards it to the first terminal device. The first terminal device then updates the page elements of the target application according to the operation event, so that the second terminal device can reversely control the target application on the first terminal device.
With reference to the second aspect, in some embodiments the second terminal devices include multiple types of second terminal device, there are multiple address identifiers, and each address identifier corresponds to one type of second terminal device; sending the address identifier to the second terminal device includes: adding a unique identifier to each address identifier; and sending each address identifier, with its unique identifier added, to the corresponding second terminal device.
For the case of multiple second terminal devices, the embodiment of the application renders one target page template for each type of second terminal device to obtain one address identifier per type, and then adds a unique identifier to each address identifier so that each address identifier can be used only once. The address identifiers with unique identifiers added are then sent to the corresponding second terminal devices, so that each second terminal device opens its own address identifier through a browser and displays the interface of the target application in the browser interface.
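One way to realize "each address identifier can be used only once" is to append a random token per device and track redemptions on the server; this sketch is an assumption about the mechanism, not the patent's actual implementation:

```python
import uuid

def issue_one_time_urls(urls_by_type):
    """Append a unique identifier to each type's address identifier so
    that each link can be redeemed only once."""
    return {device_type: f"{url}?once={uuid.uuid4().hex}"
            for device_type, url in urls_by_type.items()}

def redeem(url, used):
    """Reject an address identifier that has already been opened."""
    if url in used:
        return False
    used.add(url)
    return True

urls = issue_one_time_urls({"PC": "https://share.example.com/p/a1",
                            "watch": "https://share.example.com/p/b2"})
used = set()
first_open = redeem(urls["PC"], used)   # accepted
second_open = redeem(urls["PC"], used)  # same link again: rejected
```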
Illustratively, the sharing permission of each second terminal device may differ; the sharing permission may be: displaying the interface of the target application; displaying the interface of the target application and controlling the target application; or only controlling the target application. If the sharing permission is to display the interface of the target application, the server does not accept operation events acting on the browser interface of that second terminal device. If the sharing permission is to display the interface of the target application and control the target application, the server accepts operation events acting on the browser interface of that second terminal device.
In some embodiments, the second terminal devices may include a second terminal device A and a second terminal device B, where the sharing permission of device A is to display the interface of the target application and the sharing permission of device B is to control the target application. The first terminal device may then send the address identifier to device A and send the control information of the target application to device B, so that device B can control device A's display of the target page.
The target application may be an application capable of playing video, and the control information of the target application may be control time information. The second terminal device B receives a touch operation acting on its own interface, determines the target time of the touch operation, generates a control instruction according to the touch operation and the target time, and sends the control instruction to the second terminal device A. The second terminal device A responds to the control instruction by stopping or continuing video playback.
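A sketch of this play/pause control path, assuming (purely for illustration) that a tap pauses playback and any other gesture resumes it:

```python
def make_control_instruction(touch, target_time):
    """Device B: turn a touch operation and the time it occurred into a
    control instruction. Assumption: a tap pauses, anything else plays."""
    action = "pause" if touch == "tap" else "play"
    return {"action": action, "at": target_time}

def apply_instruction(player, instruction):
    """Device A: respond by stopping or continuing video playback."""
    player["playing"] = instruction["action"] == "play"
    player["position"] = instruction["at"]
    return player

instr = make_control_instruction("tap", 42.5)
state = apply_instruction({"playing": True, "position": 0.0}, instr)
```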
In a third aspect, an embodiment of the present application provides a terminal application control method applied to a server, the method including: receiving page layout information and the type of a second terminal device, both sent by a first terminal device, where the page layout information includes first tags and the content of page elements in a target application interface on the first terminal device, and the first tags are tags set for the page elements; determining a target page template according to the type of the second terminal device and rendering the target page template according to the page layout information; after the target page template is rendered, generating an address identifier for the rendered page; and sending the address identifier to the first terminal device, so that the first terminal device sends the address identifier to the second terminal device, and the second terminal device opens the address identifier through a browser and displays a target page adapted to the browser interface, where the target page includes the target application interface.
According to this terminal application control method, tags are set for the page elements in the target application interface, and the content of the page elements is collected to obtain page layout information. The server then determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier for the rendered page. The first terminal device sends the address identifier to the second terminal device, which opens it through a browser and displays a target page adapted to the browser interface, the target page containing the target application interface. In this way, the interface of the target application on the first terminal device can be shared to and displayed on any second terminal device that supports a browser, and the second terminal device can adaptively display the interface according to its browser window; the second terminal device need not support the target application itself.
With reference to the third aspect, in some embodiments the page elements are first controls in the target application interface, and each first control corresponds to one first tag; there are multiple types of second terminal device, each type corresponds to at least one target page template, each target page template includes a plurality of second controls, and each second control corresponds to one second tag; rendering the target page template according to the page layout information includes: determining the second control corresponding to each first control according to a preset tag correspondence between the first tags and the second tags; and associating the content of each first control with its corresponding second control to render the target page template.
Illustratively, different types of second terminal device correspond to different target page templates. For example, the types of second terminal device may include a PC, a tablet, a mobile phone, a watch, and other devices: the PC corresponds to target page template 1, the tablet to target page template 2, the mobile phone to target page template 3, and the watch to target page template 4.
In one scenario, the page corresponding to target page template 1 is page 1; page 1 may contain a plurality of second controls, each corresponding to one second tag. The correspondence between first tags and second tags may be set in advance: the first tags include the first tags of the N first controls, the second tags include the second tags of the M second controls, and the tag correspondence between each first tag and each second tag can be preset according to the correspondence between the first controls and the second controls. For example, the first tag of an ImageView control corresponds to the second tag of an img control; the first tag of a TextView control corresponds to the second tag of a p control; and the first tag of a Button control corresponds to the second tag of a button control.
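The tag correspondence above (ImageView→img, TextView→p, Button→button) lends itself to a simple mapping-driven renderer; the HTML generation below is a simplified sketch of the association step, not the patent's actual rendering:

```python
# Preset tag correspondence from the example above (first tags on the
# terminal side to second tags on the template side).
TAG_MAP = {"ImageView": "img", "TextView": "p", "Button": "button"}

def render_template(layout):
    """Server: for each first control, look up its second control via
    the tag correspondence and associate the collected content with it."""
    parts = []
    for element in layout:
        second_tag = TAG_MAP[element["first_tag"]]
        if second_tag == "img":
            # Image controls carry picture content (a source reference).
            parts.append(f'<img src="{element["content"]}">')
        else:
            # Text-bearing controls carry text content.
            parts.append(f'<{second_tag}>{element["content"]}</{second_tag}>')
    return "".join(parts)

html = render_template([{"first_tag": "TextView", "content": "Buy"},
                        {"first_tag": "Button", "content": "next"}])
```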
With reference to the third aspect, in some embodiments the method further includes: receiving an operation event acting on the browser interface of the second terminal device, where the operation event includes the event's data, type, operation manner, and window title; and sending the operation event to the first terminal device, so that the first terminal device updates the page elements of the target application according to the operation event.
For example, the operation event data corresponds to an operation event sent by the second terminal device to the server; that operation event is one acting on the browser interface that the second terminal device monitored through the browser.
With reference to the third aspect, in some embodiments the method further comprises: receiving the sharing permission of the second terminal device sent by the first terminal device, where the sharing permission is either to display the interface of the target application, or to display the interface of the target application and control the target application; if the sharing permission is to display the interface of the target application, the step of receiving an operation event acting on the browser interface of the second terminal device is not performed; and if the sharing permission is to display the interface of the target application and control the target application, that step is performed.
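The permission gate on the server can be sketched as follows; the permission string values are invented for illustration:

```python
# Hypothetical permission values for the two sharing permissions named
# above; the strings themselves are not from the patent.
DISPLAY_ONLY = "display"
DISPLAY_AND_CONTROL = "display+control"

def accept_operation_event(permission):
    """Server-side gate: an operation event from a second terminal
    device is accepted only if its sharing permission allows control."""
    return permission == DISPLAY_AND_CONTROL

from_display_only = accept_operation_event(DISPLAY_ONLY)
from_controller = accept_operation_event(DISPLAY_AND_CONTROL)
```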
With reference to the third aspect, in some embodiments the second terminal devices include a second terminal device A and a second terminal device B, the sharing permission of device A is to display the interface of the target application, and the sharing permission of device B is to control the target application; sending the address identifier to the first terminal device includes: sending the address identifier and the control information of the target application to the first terminal device, so that the first terminal device sends the address identifier to device A and the control information to device B, enabling device B to control device A's display of the target page.
In a fourth aspect, an embodiment of the present application provides a terminal device, including: one or more processors and a memory; the memory is coupled to the one or more processors and stores computer program code comprising computer instructions; when the computer instructions are executed by the one or more processors, the terminal device performs the method of any one of the first aspect, the second aspect, or the third aspect.
In a fifth aspect, an embodiment of the present application provides a server, including: one or more processors and a memory; the memory is coupled to the one or more processors and stores computer program code comprising computer instructions; when the computer instructions are executed by the one or more processors, the server performs the method of any one of the first aspect, the second aspect, or the third aspect.
In a sixth aspect, embodiments of the present application provide a chip system, which includes a processor coupled with a memory, and the processor executes a computer program stored in the memory to implement the method according to any one of the first aspect, the method according to any one of the second aspect, or the method according to any one of the third aspect. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In a seventh aspect, an embodiment of the present application provides a chip system, which includes a memory and a processor, where the processor executes a computer program stored in the memory to implement the method according to any one of the first aspect, the method according to any one of the second aspect, or the method according to any one of the third aspect. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In an eighth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the method of any one of the first aspect, or the method of any one of the second aspect, or the method of any one of the third aspect.
In a ninth aspect, embodiments of the present application provide a computer-readable storage medium, which stores a computer program, and which, when executed by a processor, implements the method according to any one of the first aspect, or the method according to any one of the second aspect, or the method according to any one of the third aspect.
It is to be understood that the terminal device of the fourth aspect, the server of the fifth aspect, the chip systems of the sixth and seventh aspects, the computer program product of the eighth aspect, and the computer-readable storage medium of the ninth aspect are all provided for executing the method of the first, second, or third aspect. Therefore, for the beneficial effects they achieve, reference may be made to the beneficial effects of the corresponding method, which are not repeated here.
Drawings
Fig. 1 is a schematic view of a scene used in a terminal application control method according to an embodiment of the present application;
fig. 2 is a schematic view of a scenario provided by an embodiment of the present application;
fig. 3 is a schematic diagram of a process of sharing an interface of a target application of a first terminal device to an interface of a second terminal device for display, according to an embodiment of the present application;
fig. 4 is a schematic process diagram of a second terminal device performing inverse control on a target application according to the embodiment of the present application;
fig. 5 is a schematic flowchart of a terminal application control method according to an embodiment of the present application;
fig. 6 is a schematic diagram of a "buy" APP interface in the mobile phone a according to the embodiment of the present application;
fig. 7 is a schematic diagram illustrating a "buy" APP interface displayed in a tablet computer according to an embodiment of the present application;
FIG. 8 is a diagram illustrating a "buy" APP interface displayed in a PC according to an embodiment of the present application;
fig. 9 is a schematic diagram illustrating a "buy" APP interface displayed in the mobile phone B according to the embodiment of the present application;
FIG. 10 is a schematic view of a scenario provided by an embodiment of the present application;
fig. 11(a) and fig. 11(b) are schematic views of scenes provided in an embodiment of the present application;
fig. 12 is a schematic hardware architecture diagram of a terminal device according to an embodiment of the present application;
fig. 13 is a block diagram of a software structure of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used only to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In addition, the references to "a plurality" in the embodiments of the present application should be interpreted as two or more.
The steps involved in the terminal application control method provided in the embodiments of the present application are merely examples; not all of them are necessarily required, and the content of each information or message is not always necessary and may be increased or decreased as needed during use.
The same steps or messages with the same functions in the embodiments of the present application may be referred to with each other between different embodiments.
The service scenarios described in the embodiments of the present application are intended to illustrate the technical solutions of the embodiments more clearly and do not limit the technical solutions provided therein. As a person of ordinary skill in the art will appreciate, with the evolution of network architectures and the emergence of new service scenarios, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
In the related art, a method for cooperative control of an APP across multiple clients is provided. For example, client A responds to a page sharing operation applied by a user in the APP and generates a sharing text and the target page corresponding to the sharing text. Client A sends the sharing text and the target page to client B through instant messaging. After client B acquires the sharing text, the sharing text is copied. The user can then apply a paste operation of the shared text in the APP of client B; client B responds to the paste operation, detects whether the text in the paste operation is consistent with the text of the target page, and if so, opens the target page shared by client A.
However, the related art places high requirements on the clients: multiple clients must download and install the same APP, adapted to each client's screen size, and clients that cannot install the APP cannot participate in the cooperative control. Moreover, the related art cannot realize reverse control; essentially, only the same page is opened for sharing, and any operation on the page cannot be transmitted to the other clients. In addition, the way in which client A shares the target page with other clients is cumbersome: a sharing text must be generated by clicking on a certain page of the APP of client A, transmitted to the other clients through an instant messaging application, and copied by the other clients, which must then open the APP before they can jump to the target page.
The second related art provides a screen-projection cooperative control method. Client A marks pages and instructions in the APP and sends them to client B. Client B restores the marked instructions, converts the instructions on client A into instructions on client B, and constructs a picture of client A. The user can apply operations such as clicking and sliding, via a mouse, on the picture of client A constructed on client B, so as to simulate the user's touch events on client A. Client B monitors the mouse events and transmits them to client A in the form of an instruction stream; client A restores the instruction stream sent by client B and executes the restored instructions, so that client A is controlled from client B.
However, in the second related art, one client A can only project a page to one client B, and only that client B can reversely control client A; a scenario of one client A serving a plurality of clients B cannot be realized. Moreover, the instruction stream sent by client B to client A is built from touch events generated by the user's actual operations on client B, so after the user applies an operation on client B there may be a long delay before client A responds to it.
Based on the foregoing problems, an embodiment of the present application provides a terminal application control method. The first terminal device sets a tag for a page element needing to be shared in the target application, and obtains the type and sharing permission of a second terminal device, wherein the second terminal device is the terminal device sharing the target application. And the first terminal equipment sends the type and the page layout information of the second terminal equipment to the server, wherein the page layout information is obtained according to the page elements. The server determines a target page template according to the type of the second terminal device, renders an Html page of the target page template based on the page layout information, and then sends a Uniform Resource Locator (URL) address of the rendered target page template to the first terminal device. And the first terminal equipment sends the URL address to the second terminal equipment, and the second terminal equipment opens the URL address through a browser or WebView, wherein page controls in the browser or WebView correspond to controls in the target application one to one through labels.
After the second terminal device receives an operation (for example, a click event of a control) acting on the browser or the WebView, the second terminal device responds to the operation and sends an event corresponding to the operation and a label of the event to the first terminal device. And the first terminal equipment determines a control corresponding to the event in the target application according to the label of the event, and executes the event according to the determined control.
When the page of the target application of the first terminal device changes, the first terminal device sends page layout change information of the target application to the server, and the server renders the constructed Html DOM tree again according to the page layout change information of the target application and generates a new URL address. And the first terminal equipment sends the new URL address to the second terminal equipment. And the second terminal equipment reloads the rendering page according to the new URL address.
For example, in the embodiment of the present application, each of the first terminal device and the second terminal device may be a Personal Computer (PC), a mobile phone, a tablet computer, a wearable device, an in-vehicle device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, a super-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or other terminal devices, and the embodiment of the present application does not limit the specific type of the terminal device.
Fig. 1 is a scene schematic diagram of an application control method of a terminal according to an embodiment of the present disclosure. Referring to fig. 1, the scenario includes a first terminal device, a second terminal device, and a server.
A target APP is installed on the first terminal device and includes an interface UI. The first terminal device is also provided with a Dues server, which includes a content acquisition module, a content scheduling module, and a content preprocessing module. The content acquisition module is used to set tags for page elements in the target APP, for example, for the page elements that need to be shared and the page elements on which events are executed. After the content acquisition module sets tags for the page elements to be shared by the target APP, a plurality of marked page elements are obtained. The content scheduling module schedules the marked page elements to the content preprocessing module. The content preprocessing module packs the marked page elements in a specific format so that they can be transmitted to the following modules in a fixed data structure. The content preprocessing module can process the content of the Rview and convert it into the JSON language, and can process the content of the Light and update the behavior list according to the result of that processing.
The Dues server is also provided with a data conversion module and a Web server. The Web server provides the user with functions for selecting the shared terminal devices and setting their permissions. The shared terminal devices may include PCs, tablets, mobile phones, watches, and other devices. Each of the first four device types has a specific Html template, and the Html template sets the UI corresponding to that device type. When the user selects "other devices", the interface of the target application in the first terminal device is proportionally scaled and adapted according to the browser resolution of the other device.
The permissions of a shared terminal device include: share only, control only, and share and control. "Share only" means that the current terminal device can only browse the page shared by the target application in the first terminal device and cannot reversely control the target application; this permission is generally used for display devices with large screens. "Control only" means that the current terminal device can only provide simple control functions, for example serving as a remote controller, and cannot display the page shared by the target application. "Share and control" is the combination of the two permissions: the device can display the page shared by the target application and can also reversely control the target application; this permission is usually used for mobile phones, tablet devices, and the like.
And the data conversion module is used for performing data conversion on the data output by the content preprocessing module and sending the converted data to the Web server. For example, the data output by the content preprocessing module is a language for describing the UI resource tree ViewTree, and the data conversion module converts the language for describing the UI resource tree ViewTree into JSON language which can be understood and processed by the interface construction server.
The Web server processes the converted data according to the selected shared terminal device (that is, the second terminal device) to obtain drawing data with a preset data structure. For example, the Web server adds field information related to the type and the permission information of the second terminal device to the converted data, thereby obtaining the drawing data with the preset data structure. The Web server sends the selected shared terminal device, the device permission, and the drawing data to an interface construction server through a Socket. The drawing data may contain tag information, type information, and content.
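As a hedged illustration only (the concrete listing appears as figures in the original publication, so every field name below is an assumption), drawing data carrying tag, type, and content fields might be serialized like this:

```python
import json

# Hypothetical drawing-data structure: each entry carries the tag that links
# an Html control back to the source control in the target APP, the control
# type, and the content to render. Field names are illustrative assumptions.
drawing_data = {
    "deviceType": "tablet",              # type of the second terminal device
    "permission": "share_and_control",   # device permission
    "elements": [
        {"tag": "btn_confirm", "type": "Button",    "content": "determine"},
        {"tag": "txt_title",   "type": "TextView",  "content": "Checkout"},
        {"tag": "img_cover",   "type": "ImageView", "content": "cover.png"},
    ],
}

# Serialize for transmission over the Socket, then restore on the receiving side.
payload = json.dumps(drawing_data, ensure_ascii=False)
restored = json.loads(payload)
print(restored["elements"][0]["tag"])  # → btn_confirm
```

The tag field is what later lets an operation on the Html control be routed back to the matching control in the target APP.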
the interface construction server comprises a data analysis module, a layout drawing module, a page control drawing module, an interface synthesis module and a page rendering module.
The data analysis module is used for analyzing the drawing data to obtain layout information and page control information.
The layout drawing module is configured to adopt a uniform flex layout as the CSS (Cascading Style Sheets) style of the Html page layout, and to set corresponding interface layout parameters for the parsed layout information (including, for example, layout container attributes).
The page control drawing module is used to determine the control correspondence between the labels of the controls in the target application and the labels of the controls in Html. For example, when the first terminal device is a mobile phone, the page control drawing module determines that the label of an ImageView control in the target application corresponds to the label of an img control in Html, the label of a TextView control corresponds to the label of a p control, the label of a Button control corresponds to the label of an Input/Button control, and so on.
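The native-to-Html label correspondence above can be sketched as a simple lookup table. This is a simplified illustration, not the patent's implementation; the fallback to a generic tag is an assumption.

```python
# Hypothetical mapping from native control types (first labels) to Html
# control tags (second labels), following the mobile-phone example in the text.
CONTROL_TAG_MAP = {
    "ImageView": "img",   # images map to <img>
    "TextView": "p",      # text maps to <p>
    "Button": "button",   # buttons map to <input>/<button>; simplified here
}

def html_tag_for(native_control: str) -> str:
    """Return the Html tag corresponding to a native control type."""
    # Fall back to a generic <div> for unmapped controls (an assumption).
    return CONTROL_TAG_MAP.get(native_control, "div")

print(html_tag_for("TextView"))  # → p
```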
And the interface synthesis module is used for performing association synthesis on the corresponding relation between the interface layout parameters and the controls.
The page rendering module is used to determine the corresponding Html page according to the shared terminal device and the device permission, and to render the Html page based on the correspondence between the interface layout parameters and the controls, for example rendering the Html control labels and parsing the events corresponding to those labels.
The interface construction server is used for sending the URL address of the rendered Html page to the first terminal device.
In addition, the interface construction server also comprises a Socket monitoring module. And after the Socket monitoring module is activated, the Socket monitoring module is used for monitoring a new command and layout data sent by the first terminal equipment.
After acquiring the URL address of the rendered Html page, the first terminal device encrypts the URL address, for example using a one-way salted hash. In addition, the first terminal device generates a unique suffix for the URL address via a distributed ID; the unique suffix ensures that each URL address can be used only once. If there are multiple second terminal devices, the first terminal device generates multiple suffixes for the URL address via the distributed ID and sends URL addresses with different suffixes to different second terminal devices. The second terminal device opens the shared page according to the received URL address, and its Web browser sends a GET request to obtain the corresponding Html page, completing the mapping of the source Activity.
The reverse control process of the second terminal device over the first terminal device is as follows: the browser page of the second terminal device receives an interaction event and transmits it to the interface construction server through WebSocket. A Socket monitoring module of the interface construction server monitors all Html tags carrying interaction-event labels. When the user performs an interactive operation in the Web browser of the second terminal device, the interaction event is serialized into JSON data, which includes the data change, the event type, the event operation mode, and the WindowTitle corresponding to the first terminal device.
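The serialized interaction event might take a shape like the following. This is a hedged sketch; the field names (windowTitle, eventType, operationMode, dataChange) are assumptions derived from the fields the text names, not the patent's exact schema.

```python
import json

# Hypothetical serialized interaction event carrying the data change, the
# event type, the event operation mode, and the WindowTitle of the first
# terminal device. All field names are illustrative assumptions.
interaction_event = {
    "windowTitle": "TargetAppActivity",  # window title of the first terminal device
    "eventType": "click",                # event type
    "operationMode": "singleTap",        # event operation mode
    "dataChange": {"tag": "btn_submit", "newValue": None},  # change, keyed by tag
}

# Serialize for transmission over WebSocket, then restore on the receiving side.
wire = json.dumps(interaction_event)
received = json.loads(wire)
print(received["eventType"])  # → click
```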
For example, the interface construction server transmits the JSON data to a behavior receiving module of the Dues server through a Socket, and the behavior receiving module sends the JSON data to a behavior scheduling module. The behavior scheduling module parses and converts the JSON data, repacks it into the instruction sequence required by the target APP process, and injects the instruction sequence into the process of the target APP. The process of the target APP executes the instruction operations according to the page element root and the tags, completing the reverse control of the target APP.
Referring to fig. 2, the sending end may be the first terminal device, and the receiving end may be the interface construction server. During the life cycle of the page elements, the sending end marks the page elements and sets tags. The sending end then collects the content of each page element, packs the collected content, and sends it to the receiving end. The receiving end parses the received data to obtain the content of each page element and updates the content of the corresponding page element according to the tags.
Referring to fig. 3, the process of sharing the interface of the target application of the first terminal device into the interface of the second terminal device for display may include: after the target APP in the first terminal device starts to be cooperatively controlled, the Dues server sets tags for page elements needing to be shared and page elements executed by all events in the target APP to obtain a plurality of marked page elements, and packs the plurality of marked page elements to obtain UI data of the target APP. And the Dues server sends the UI data of the target APP to the interface construction server. The interface construction server draws the Html page according to the UI data of the target APP, generates a URL address of the Html page, and details on how to draw the Html page according to the UI data of the target APP are omitted here. And the target APP sends the acquired URL address of the Html page to the second terminal device. And the second terminal equipment opens the URL address through a Web browser, and displays an interface of the target APP matched with the second terminal equipment according to the corresponding relation between the control of the Html page and the control of the first terminal equipment.
After the interface of the target APP of the first terminal device changes, the Dues server acquires the change information of the interface of the target APP and sends it to the interface construction server. The interface construction server updates the Html page according to the change information, and the second terminal device then updates the interface displayed in the Web browser according to the updated Html page.
Referring to fig. 4, the process of the second terminal device reversely controlling the target APP may include: after the second terminal device receives an operation acting on the browser interface (for example, a click event on a control), the browser monitors the operation event, and the second terminal device sends the operation event and its tag to the interface construction server. The interface construction server receives the operation events acting on the Web browser interface of the second terminal device to obtain an operation event sequence, and converts the sequence into JSON data, which includes the data change, the event type, the event operation mode, the window title (WindowTitle) of the first terminal device, and the like. The interface construction server sends the JSON data to the Dues server, for example to the behavior receiving module of the Dues server, which forwards it to the behavior scheduling module. The behavior scheduling module parses and converts the JSON data, repacks it into the instruction sequence required by the target APP process, and injects the instruction sequence into the process of the target APP. The process of the target APP executes the instruction operations according to the page element root and the tags and updates the page elements of the target APP, thereby realizing the reverse control of the target APP.
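The behavior-scheduling step above — parsing the event JSON and repacking it into an instruction sequence addressed by tag — can be sketched as follows. The instruction format is an assumption for illustration; the patent does not specify it.

```python
import json

def to_instruction_sequence(json_text: str) -> list:
    """Hypothetical repacking of received event JSON into instructions for
    the target APP process; one instruction per affected page element."""
    event = json.loads(json_text)
    # The "target" field mirrors the text's addressing by page element root
    # and tag; the op/target instruction shape is an assumption.
    return [
        {"op": event["eventType"], "target": event["dataChange"]["tag"]},
    ]

seq = to_instruction_sequence(
    '{"eventType": "click", "dataChange": {"tag": "btn_confirm"}}'
)
print(seq)  # → [{'op': 'click', 'target': 'btn_confirm'}]
```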
Fig. 5 is a flowchart illustrating a terminal application control method according to an embodiment of the present application. Referring to fig. 5, the terminal application control method may include the steps of:
step 101, a first terminal device sets a first tag for a page element to be shared in a target application interface, and acquires content of the page element to obtain page layout information.
The page element may be a control in the target application interface; after a tag is set for a control, the text content or picture content of the control needs to be collected. For example, the target application interface includes controls such as "determine", "next", and "cancel"; if corresponding tags are set for these controls, the text content "determine", "next", "cancel", and so on of each control is collected. The page layout information includes the first tags, the tagged page elements, and the content of the page elements.
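The collection step of step 101 can be sketched as follows. This is a minimal illustration under stated assumptions: the tag format and record fields are invented for the sketch, not taken from the patent.

```python
# Hypothetical content collection (step 101): set a first tag on each
# shareable control and collect its text content into page layout info.
controls = [
    {"name": "determine", "text": "determine"},
    {"name": "next", "text": "next"},
    {"name": "cancel", "text": "cancel"},
]

page_layout_info = []
for i, ctrl in enumerate(controls):
    page_layout_info.append({
        "tag": f"share_{i}",       # the first tag set for this element (assumed format)
        "element": ctrl["name"],   # the tagged page element
        "content": ctrl["text"],   # the collected text content
    })

print([e["tag"] for e in page_layout_info])  # → ['share_0', 'share_1', 'share_2']
```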
step 102, the first terminal device acquires the type and the sharing permission of the second terminal device.
The type of the second terminal device may include a PC, a tablet, a mobile phone, a watch, and other devices, among others.
The permissions of the second terminal device may include: displaying the target application interface only; controlling the target application only; and both displaying the target application interface and controlling the target application. The permission to display the target application interface only means that the second terminal device can only display the page shared by the target application and cannot reversely control the target application; this permission is generally used for display devices with large screens. The permission to control the target application only means that the second terminal device can only provide simple control functions, for example serving as a remote controller, and cannot display the page shared by the target application; this permission is generally used for lightweight devices such as watches. The permission to both display and control means that the second terminal device can display the page shared by the target application and can also reversely control the target application; this permission is generally used for devices such as mobile phones and tablets.
step 103, the first terminal device sends the type of the second terminal device and the page layout information to an interface construction server.
The interface construction server can be provided with various page templates. Each page template corresponds to one type of the second terminal equipment, and each page template has a corresponding UI. For example, the page template may be an Html template.
For example, if the type of the second terminal device is PC, the Html template has a UI corresponding to the PC; if the type of the second terminal device is a flat panel, the Html template has a UI corresponding to the flat panel; if the type of the second terminal device is a mobile phone, the Html template has a UI corresponding to the mobile phone; if the type of the second terminal device is a watch, the Html template has a UI corresponding to the watch.
If the type of the second terminal device is "other devices", no corresponding Html template needs to be set; the interface of the target application in the first terminal device is proportionally scaled and adapted according to the browser resolution of the other device.
step 104, the interface construction server determines a target page template according to the type of the second terminal device, renders the page corresponding to the target page template based on the page layout information, and obtains the URL address of the rendered page.
The page corresponding to the target page template may be provided with a plurality of second controls, each second control corresponds to one second label, and the position of each second control on the page may be preset. For example, the page template may be a JS template, and the second controls corresponding to different page templates may differ.
For example, the rendering of the page of the target page template by the interface construction server based on the page layout information may include: acquiring a first label of a first control of a target APP in the page layout information, wherein the first control is any one of the controls of the target APP; determining a second label corresponding to the first label; determining a second control corresponding to the determined second label in the page corresponding to the target page template; and associating the content of the first control with the second control, and rendering the target page template.
In this embodiment, the target page templates corresponding to different types of the second terminal device are different. For example, the types of second terminal devices may include PCs, tablets, cell phones, watches, and other devices. The PC corresponds to a target page template 1, the tablet corresponds to a target page template 2, the mobile phone corresponds to a target page template 3, and the watch corresponds to a target page template 4.
In one scenario, the page corresponding to the target page template 1 is page 1; page 1 may be provided with a plurality of second controls, each corresponding to one second label. The correspondence between the first labels and the second labels may be set in advance. The first labels include the first labels corresponding to the N first controls, and the second labels include the second labels corresponding to the M second controls; the label correspondence between the first labels and the second labels may be preset according to the correspondence between the first controls and the second controls, and then used directly in step 104. For example, the first label corresponding to the ImageView control corresponds to the second label corresponding to the img control; the first label corresponding to the TextView control corresponds to the second label corresponding to the p control; and the first label corresponding to the Button control corresponds to the second label corresponding to the Button control.
For how to establish the above label correspondences for target page templates 2 through 4, refer to the scenario of target page template 1, which is not described herein again.
After the target page template is rendered, a rendered page is obtained; the rendered page corresponds to the interface of the target APP in the first terminal device. The interface construction server then generates the URL address of the rendered page and sends it to the first terminal device.
step 105, the first terminal device sends the URL address to the second terminal device.
In some embodiments, the first terminal device may encrypt the URL address, for example using one-way salted encryption.
In some embodiments, the first terminal device may generate a unique suffix for the URL address and append it to the end of the URL address, so that the URL address can be used only once by the second terminal device.
When there are a plurality of second terminal devices, each second terminal device corresponds to one URL address; the first terminal device may generate a unique suffix for each URL address and append it to the end of the corresponding URL address.
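A minimal sketch of the single-use suffix scheme follows. The `token` query parameter and the server-side set of used tokens are assumptions about how single use might be enforced; the patent only states that the suffix makes the URL usable once.

```python
import uuid

# Hypothetical sketch: one unique, single-use suffix per second terminal device.
def add_unique_suffix(url: str) -> str:
    """Append a unique suffix so each device gets a distinct one-time URL."""
    return f"{url}?token={uuid.uuid4().hex}"

_used_tokens = set()  # server-side record of suffixes that were already opened

def open_once(url_with_suffix: str) -> bool:
    """Return True the first time a suffixed URL is opened, False afterwards."""
    token = url_with_suffix.rsplit("token=", 1)[-1]
    if token in _used_tokens:
        return False
    _used_tokens.add(token)
    return True
```

Because each second terminal device receives its own suffix, revoking or expiring one device's URL does not affect the others.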
Step 106: the second terminal device opens the URL address and displays a target page adapted to the browser interface, where the target page includes the target application interface.
The second terminal device may open the URL address in a Web browser, a WebView, or the like, and display a page containing the target APP interface in the browser window. Moreover, the page can be adaptively adjusted according to the browser window.
According to the terminal application control method, tags are set for the page elements in the target application interface, and the content of the page elements is collected to obtain the page layout information. The server then determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier of the rendered page. The first terminal device sends the address identifier to the second terminal device, and the second terminal device opens the address identifier through the browser and displays a target page adapted to the browser interface, where the target page includes the target application interface. In this way, the interface of the target application in the first terminal device can be shared with, and displayed in, any second terminal device that supports a browser. Even if the second terminal device does not support the target application itself, it can still display the interface of the target application through the browser, and it can display that interface adaptively according to the browser interface.
Example one
The following describes an embodiment of the present application by taking a target application as "buy" APP, a first terminal device as a mobile phone a, and a second terminal device as a PC, a tablet, and a mobile phone B as examples. The number of the second terminal devices may be one or more, and the types of the plurality of second terminal devices may be different or the same.
Fig. 6 is a schematic diagram of the "buy" APP interface in the mobile phone A. Referring to fig. 6, the "buy" APP interface includes a plurality of first controls, for example, the first controls 11 to 17. The first control 11 may be an identification control of the "buy" APP, and its content may be a picture containing the word "buy". The first control 12 may be a search control for entering search content. The first control 13 may be a category control for classifying items, and may include categories such as "living goods", "electronic products", and "food". The first controls 14, 15, and 16 may be presentation controls for presenting merchandise. The first control 17 may be a display control for displaying merchandise, and there may be a plurality of first controls 17. Each of the first controls 11 to 17 corresponds to a first label.
Fig. 7 is a schematic diagram of the "buy" APP interface in a tablet computer. Referring to fig. 7, the "buy" APP interface displayed by the tablet computer through the browser includes a plurality of second controls, for example, the second controls 21 to 27. The second control 21 may be an identification control of the "buy" APP, and its content may be a picture containing the word "buy". The second control 22 may be a search control for entering search content. The second control 23 may be a category control for classifying items, and may include categories such as "living goods", "electronic products", and "food". The second controls 24, 25, and 26 may be presentation controls for presenting merchandise. The second control 27 may be a display control for displaying merchandise, and there may be a plurality of second controls 27. Each of the second controls 21 to 27 corresponds to a second label. The first labels and the second labels may have a preset label correspondence.
The process of displaying the interface of the "buy" APP in the mobile phone A on the tablet computer may include: setting a first label for each of the first controls 11 to 17 in the "buy" APP interface in the mobile phone A, collecting the content of each first control to obtain the UI data of the "buy" APP in the mobile phone A, and sending the UI data and the type of the second terminal device (tablet computer) to the interface construction server. The interface construction server determines, according to the type of the second terminal device, the target page template as the one corresponding to the tablet computer. The target page template includes a plurality of second controls, each second control corresponds to one second label, and the position of each second label in the target page template may be preset. According to the label correspondence between the first labels and the second labels, the interface construction server associates the content of the first control corresponding to each first label with the corresponding second control, and renders the target page template to obtain the target page. The interface construction server then sends the URL address of the target page to the mobile phone A; the mobile phone A performs processing such as encrypting the URL address and adding a unique suffix, and then sends the URL address to the tablet computer. The tablet computer opens the URL address through a Web browser and displays the "buy" APP interface.
The user operates the "buy" APP in the tablet computer, and the process by which the tablet computer reversely controls the "buy" APP in the mobile phone A may include: the user applies a preset operation to a certain commodity in the browser interface of the tablet computer; the browser of the tablet computer monitors the preset operation event and sends it to the interface construction server; and the interface construction server converts the preset operation event into JSON data, where the JSON data includes the data change, the event type, the event operation mode, the WindowTitle in the mobile phone, and the like. The interface construction server sends the JSON data to the Dues server of the mobile phone; the Dues server parses and converts the JSON data, repackages it into the instruction sequence required by the "buy" APP process, and injects the instruction sequence into the "buy" APP process. The "buy" APP process executes the instruction operations according to the page element root node and the tags, and updates the page elements of the "buy" APP, thereby implementing reverse control of the "buy" APP.
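The event-to-JSON conversion and the repackaging into an instruction sequence can be sketched as below. The JSON field names and the three-step instruction shape are assumptions; the patent only lists the kinds of information carried (data change, event type, event operation mode, WindowTitle).

```python
import json

# Illustrative sketch of converting a browser operation event into JSON data,
# and of repackaging that JSON data into a simple instruction sequence for the
# target APP process. Field names and instruction forms are hypothetical.
def event_to_json(data_change, event_type, operation_mode, window_title):
    return json.dumps({
        "dataChange": data_change,
        "eventType": event_type,
        "operationMode": operation_mode,
        "windowTitle": window_title,
    })

def json_to_instructions(payload: str):
    """Parse the JSON data and repackage it as (verb, ...) instruction tuples."""
    event = json.loads(payload)
    return [
        ("locate", event["windowTitle"]),                    # find the window
        ("apply", event["eventType"], event["operationMode"]),  # replay the event
        ("update", event["dataChange"]),                     # update page elements
    ]
```

In the flow above, `event_to_json` would run on the interface construction server and `json_to_instructions` on the mobile phone side before injection into the "buy" APP process.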
In some embodiments, the aspect ratio of the interface of the target page template corresponding to the tablet computer differs greatly from that of the "buy" APP interface in the mobile phone A, and therefore the arrangement of the first controls 14, 15, and 17 in the "buy" APP interface of the mobile phone A differs from the arrangement of the second controls 24, 25, and 27 in the target page template. As shown in fig. 6 and fig. 7, the first controls 14 and 15 are arranged vertically, while the second controls 24 and 25 are arranged side by side horizontally. The plurality of first controls 17 are arranged in a plurality of rows with two first controls 17 per row, while the plurality of second controls 27 are arranged in a plurality of rows with five second controls 27 per row.
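The difference in arrangement can be sketched as choosing a column count from the interface width and reflowing the display controls into rows. The 180-pixel control width and the example widths are illustrative thresholds only, chosen so that a narrow phone-like width yields two controls per row and a wide tablet-like width yields five.

```python
# Hypothetical sketch of adapting the arrangement of display controls to the
# width of the target device interface: a wider interface gets more controls
# per row. The control width is an illustrative assumption.
def columns_for(width_px: int, control_width_px: int = 180) -> int:
    """How many display controls fit side by side at this interface width."""
    return max(1, width_px // control_width_px)

def arrange(controls, width_px):
    """Split a flat list of display controls into rows for the given width."""
    cols = columns_for(width_px)
    return [controls[i:i + cols] for i in range(0, len(controls), cols)]
```

With these assumed numbers, a 360-pixel-wide phone interface places two controls per row, matching the first controls 17, while a 900-pixel-wide tablet interface places five per row, matching the second controls 27.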
In one scenario, the mobile phone A may share the "buy" APP interface with a plurality of second terminal devices, for example, a tablet, a PC, and a mobile phone B. For example, when the mobile phone A shares currently browsed clothes with the plurality of second terminal devices and the user A slides or clicks the page in the mobile phone A, the browser interface of each second terminal device changes accordingly. Similarly, the user B may slide or click the current page in the browser interface of a second terminal device, for example, to point out to the user A that a piece of clothing looks nice, and the "buy" APP interface in the mobile phone A changes accordingly.
Fig. 8 is a schematic diagram of the "buy" APP interface in a PC. Referring to fig. 8, the "buy" APP interface displayed by the PC through the browser includes a plurality of second controls, for example, the second controls 31 to 37. The second control 31 may be an identification control of the "buy" APP, and its content may be a picture containing the word "buy". The second control 32 may be a search control for entering search content. The second control 33 may be a category control for classifying items, and may include categories such as "living goods", "electronic products", and "food". The second controls 34, 35, and 36 may be presentation controls for presenting merchandise. The second control 37 may be a display control for displaying merchandise, and there may be a plurality of second controls 37. Each of the second controls 31 to 37 corresponds to a second label.
Fig. 9 is a schematic diagram of the "buy" APP interface in the mobile phone B. Referring to fig. 9, the "buy" APP interface displayed by the browser of the mobile phone B includes a plurality of second controls, for example, the second controls 41 to 47. The second control 41 may be an identification control of the "buy" APP, and its content may be a picture containing the word "buy". The second control 42 may be a search control for entering search content. The second control 43 may be a category control for classifying items, and may include categories such as "living goods", "electronic products", and "food". The second controls 44, 45, and 46 may be presentation controls for presenting merchandise. The second control 47 may be a display control for displaying merchandise, and there may be a plurality of second controls 47. Each of the second controls 41 to 47 corresponds to a second label.
Example two
The following describes embodiments of the present application by taking an example in which the target application is a video application, the first terminal device is a mobile phone, and the second terminal devices are a large-screen device (e.g., a tablet or a television) and a lightweight device (e.g., a watch).
Referring to fig. 10, when playing a video A through the video application of the mobile phone, the mobile phone sends the video information to the tablet, and the tablet displays the video. For the specific process of the tablet displaying the video through the browser, refer to Example one; details are not described herein again. The mobile phone sends the control events corresponding to the video to the watch, and the watch controls the video played in the tablet according to user operations acting on the watch interface, such as pausing the playing and resuming the playing.
In this embodiment, it is not necessary that the first terminal device and the second terminal device are in the same physical space or in the same local area network.
Example three
The following description takes as an example a case in which the target application is a collaborative game application, the first terminal device is a mobile phone, and the second terminal device is a touch screen device (e.g., a tablet). The second terminal device does not support the collaborative game application.
Referring to fig. 11(A), when the user A plays a game through the collaborative game application of the mobile phone, the mobile phone sends the game screen to the tablet, and the tablet displays the game screen through the browser. For the specific process of the tablet displaying the game screen through the browser, refer to Example one; details are not described herein again.
Referring to fig. 11(B), the user B applies an operation in the browser interface of the tablet, the tablet sends the operation to the mobile phone, and the mobile phone displays the operation in the collaborative game application. For the specific process of displaying the operation in the collaborative game application, refer to Example one; details are not described herein again.
In this embodiment, for a second terminal device that does not support a target application, the first terminal device may share an application interface with the second terminal device, and the second terminal device may control the target application in the first terminal device.
Exemplarily, fig. 12 is a schematic hardware architecture diagram of the terminal device 100 provided in the embodiment of the present application. The terminal device 100 may be the first terminal device described above, or may be the second terminal device described above.
The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the terminal device 100. In other embodiments of the present application, the terminal device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural center and a command center of the terminal device 100, among others. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface, thereby implementing the touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. And the method can also be used for connecting a headset and playing audio through the headset. The interface may also be used to connect other terminal devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 can implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The terminal device 100 can play music or conduct a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the terminal device 100 answers a call or plays voice information, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input the sound signal. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the terminal device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The headset interface 170D is used to connect a wired headset. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. Pressure sensors 180A come in many types, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. A capacitive pressure sensor may include at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the terminal device 100 determines the intensity of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the terminal device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing a short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
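The threshold dispatch described above can be sketched as follows. This is an illustrative sketch only: the class and method names, the returned instruction strings, and the 0.5f threshold are assumptions for illustration and do not come from the patent.

```java
// Illustrative sketch of intensity-based instruction dispatch for a touch
// on the short message application icon. All names and the threshold value
// are hypothetical.
final class PressureDispatch {
    static final float FIRST_PRESSURE_THRESHOLD = 0.5f;

    // Returns the instruction to execute for a given touch intensity
    // reported by pressure sensor 180A.
    static String dispatch(float touchIntensity) {
        if (touchIntensity < FIRST_PRESSURE_THRESHOLD) {
            return "VIEW_SHORT_MESSAGE";   // light press: view short messages
        }
        return "NEW_SHORT_MESSAGE";        // firm press: create a new short message
    }
}
```

The same pattern extends naturally to more than two pressure levels by adding further thresholds.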
The gyroscope sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of the terminal device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during shooting. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the terminal device 100, calculates the distance that the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used in navigation and motion-sensing game scenarios.
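The compensation step above can be sketched with a simple small-angle model: for a shake angle theta and lens focal length f, the image shift on the sensor is approximately f * tan(theta), and the lens moves the opposite way by the same amount. The patent does not specify a formula, so this model and all names are assumptions for illustration.

```java
// Hypothetical sketch of the lens-compensation calculation for optical
// image stabilization: the returned value is the lens displacement that
// counteracts a given shake angle (reverse movement, hence the minus sign).
final class AntiShake {
    // focalLengthMm: lens focal length in millimetres
    // shakeAngleRad: shake angle reported by gyroscope sensor 180B, radians
    static double compensationMm(double focalLengthMm, double shakeAngleRad) {
        return -focalLengthMm * Math.tan(shakeAngleRad);
    }
}
```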
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C to assist in positioning and navigation.
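The altitude calculation can be sketched with the commonly used international barometric formula. The patent does not specify which formula is used, so the formula, the sea-level reference pressure, and the names below are assumptions for illustration.

```java
// Hypothetical sketch of altitude estimation from the pressure value
// measured by barometric pressure sensor 180C, using the standard
// barometric approximation h = 44330 * (1 - (p / p0)^(1/5.255)).
final class Altimeter {
    static final double SEA_LEVEL_PA = 101325.0; // assumed reference pressure

    // pressurePa: measured air pressure in pascals; returns altitude in metres
    static double altitudeM(double pressurePa) {
        return 44330.0 * (1.0 - Math.pow(pressurePa / SEA_LEVEL_PA, 1.0 / 5.255));
    }
}
```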
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the terminal device 100 is a flip phone, the terminal device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected opening/closing state of the holster or the flip cover.
The acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally along three axes), and can detect the magnitude and direction of gravity when the terminal device 100 is stationary. It can also be used to recognize the attitude of the terminal device, and is applied to landscape/portrait switching, pedometers, and other applications.
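Attitude recognition for landscape/portrait switching can be sketched from the gravity components reported by the acceleration sensor while the device is roughly stationary: when gravity mostly projects onto the screen's long axis the device is upright. The axis convention and all names below are assumptions for illustration.

```java
// Hypothetical sketch of landscape/portrait detection from the gravity
// vector measured by acceleration sensor 180E.
final class OrientationDetector {
    // ax: acceleration along the screen's short (x) axis, m/s^2
    // ay: acceleration along the screen's long (y) axis, m/s^2
    static String orientation(double ax, double ay) {
        // Gravity dominating the long axis indicates the device is upright.
        return Math.abs(ay) >= Math.abs(ax) ? "PORTRAIT" : "LANDSCAPE";
    }
}
```

A production implementation would also low-pass filter the readings and add hysteresis so the screen does not flip at the 45-degree boundary.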
The distance sensor 180F is used to measure distance. The terminal device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the terminal device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100; when insufficient reflected light is detected, the terminal device 100 can determine that there is no object nearby. The terminal device 100 can use the proximity light sensor 180G to detect that the user is holding the terminal device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the terminal device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the terminal device 100 heats the battery 142 to avoid an abnormal shutdown of the terminal device 100 caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the terminal device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
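The multi-threshold policy above can be sketched as a simple dispatch over the measured temperature. The threshold values and the action names are illustrative assumptions; the patent only states that such thresholds exist.

```java
// Hypothetical sketch of the temperature processing policy driven by
// temperature sensor 180J. Threshold values are illustrative.
final class ThermalPolicy {
    static final double HIGH_C     = 45.0;  // throttle the nearby processor above this
    static final double LOW_C      = 0.0;   // heat battery 142 below this
    static final double VERY_LOW_C = -10.0; // also boost the battery output voltage

    static String action(double tempC) {
        if (tempC > HIGH_C)     return "THROTTLE_PROCESSOR";
        if (tempC < VERY_LOW_C) return "BOOST_BATTERY_VOLTAGE";
        if (tempC < LOW_C)      return "HEAT_BATTERY";
        return "NORMAL";
    }
}
```

Ordering the checks from the most extreme condition inward keeps the three thresholds from shadowing one another.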
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrated bone acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The terminal device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, which may be used to indicate a charging state, a change in battery level, or a message, a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the terminal device 100 by being inserted into or pulled out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and is also compatible with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the terminal device 100 adopts an eSIM, i.e., an embedded SIM card. The eSIM card may be embedded in the terminal device 100 and cannot be separated from the terminal device 100.
Fig. 13 is a block diagram of a software configuration of the terminal device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 13, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 13, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication function of the terminal device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction, for example a notification of download completion or a message alert. The notification manager may also present notifications in the form of a chart or scroll-bar text in the top status bar of the system, such as a notification of an application running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the terminal device vibrates, or an indicator light blinks.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Optionally, an embodiment of the present application further provides a terminal device, including: one or more processors and memory. The memory is coupled to the one or more processors and is operable to store computer program code comprising computer instructions. The computer instructions, when executed by the one or more processors, cause the terminal device to perform one or more steps of any of the methods described above.
Optionally, the present application further provides a computer-readable storage medium, which stores instructions that, when executed on a computer or a processor, cause the computer or the processor to execute one or more steps of any one of the methods described above.
Optionally, the present application further provides a computer program product containing instructions, which when run on a computer or a processor, causes the computer or the processor to perform one or more steps of any of the methods described above.
Optionally, an embodiment of the present application further provides a chip system, where the chip system may include a memory and a processor, and the processor executes a computer program stored in the memory to implement one or more steps of any of the methods described above. The chip system can be a single chip or a chip module consisting of a plurality of chips.
Optionally, an embodiment of the present application further provides a chip system, where the chip system may include a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement one or more steps of any of the above methods. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in, or transmitted through, a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A terminal application control method is characterized by comprising the following steps:
the method comprises the steps that first terminal equipment sets a first label for a page element in a target application interface and collects the content of the page element to obtain page layout information, wherein the page layout information comprises the first label and the content of the page element;
the first terminal equipment acquires the type and the sharing authority of the second terminal equipment and sends the page layout information, the type and the sharing authority of the second terminal equipment to a server;
the server determines a target page template according to the type of the second terminal device, renders a page corresponding to the target page template based on the page layout information, and generates an address identifier of the rendered page;
the first terminal equipment acquires the address identification and sends the address identification to the second terminal equipment;
and the second terminal equipment opens the address identifier through a browser and displays a target page adaptive to the interface of the browser, wherein the target page comprises the target application interface.
2. The method of claim 1, further comprising:
the second terminal device monitors an operation event acting on the target page through a browser and sends the operation event to the server, wherein the operation event comprises data, types, operation modes and window titles of the operation event;
the server sends the operation event to the first terminal equipment;
and the first terminal equipment updates the page element of the target application according to the operation event.
3. A terminal application control method is applicable to a first terminal device, and comprises the following steps:
setting a first label for a page element in a target application interface, and acquiring the content of the page element to obtain page layout information, wherein the page layout information comprises the first label and the content of the page element;
acquiring the type of second terminal equipment, and sending the page layout information and the type of the second terminal equipment to a server;
receiving an address identifier sent by a server, wherein the address identifier is obtained by the server according to a rendered page, the rendered page is obtained by the server determining a target page template according to the type of the second terminal device, and the target page template is rendered according to the page layout information;
and sending the address identifier to the second terminal device, so that the second terminal device opens the address identifier through a browser and displays a target page adapted to an interface of the browser, wherein the target page comprises the target application interface.
4. The method of claim 3, further comprising:
receiving an operation event sent by the server, wherein the operation event comprises data, type, operation mode and window title of the operation event, and the operation event is the operation event which is received by the server and acts on the target page;
generating a control instruction according to the operation event;
and executing the control instruction to update the page element of the target application.
5. The method according to claim 3, wherein there are a plurality of types of second terminal devices and a plurality of address identifiers, and each address identifier corresponds to one type of second terminal device;
the sending the address identifier to the second terminal device includes:
adding a unique identifier to each address identifier;
and sending each address identifier with the added unique identifier to the corresponding second terminal device.
6. The method according to claim 3, wherein the second terminal device comprises a second terminal device A and a second terminal device B, the sharing right of the second terminal device A is an interface for displaying the target application, and the sharing right of the second terminal device B is an interface for controlling the target application;
the method further comprises the following steps:
and sending the address identifier to a second terminal device A, and sending the control information of the target application to a second terminal device B, so that the second terminal device B can control the display of the target page by the second terminal device A.
7. A terminal application control method is applicable to a server, and the method comprises the following steps:
receiving page layout information and a type of second terminal equipment, wherein the page layout information and the type of the second terminal equipment are sent by first terminal equipment, the page layout information comprises a first label and content of a page element in a target application interface in the first terminal equipment, and the first label is a label set for the page element;
determining a target page template according to the type of the second terminal equipment, and rendering the target page template according to the page layout information;
after the target page template is rendered, generating an address identifier of the rendered page;
and sending the address identifier to the first terminal device, so that the first terminal device sends the address identifier to the second terminal device, the second terminal device opens the address identifier through a browser, and displays a target page adapted to an interface of the browser, wherein the target page comprises the target application interface.
8. The method of claim 7, wherein the page elements are first controls in the target application interface, one of the first controls corresponding to one of the first tabs; the types of the second terminal equipment are multiple, each type corresponds to at least one target page template, each target page template comprises a plurality of second controls, and each second control corresponds to one second label;
the rendering the target page template according to the page layout information includes:
determining second controls corresponding to the first controls according to a preset label corresponding relation between the first labels and the second labels;
and associating the content of the first control with the corresponding second control, and rendering the target page template.
9. The method of claim 7, further comprising:
receiving an operation event acted on a browser interface of the second terminal device, wherein the operation event comprises data, type, operation mode and window title of the operation event;
and sending the operation event to the first terminal device, so that the first terminal device updates the page element of the target application according to the operation event.
10. The method of claim 9, further comprising:
receiving the sharing authority of the second terminal equipment sent by the first terminal equipment; the sharing permission comprises an interface for displaying the target application, and the interface for displaying the target application and controlling the target application;
if the sharing permission is to display the interface of the target application, the step of receiving the operation event acted on the browser interface of the second terminal device is not executed;
and if the sharing permission is to display the interface of the target application and control the target application, executing the step of receiving the operation event acted on the browser interface of the second terminal equipment.
11. The method according to claim 7, wherein the second terminal device comprises a second terminal device A and a second terminal device B, the sharing right of the second terminal device A is an interface for displaying the target application, and the sharing right of the second terminal device B is an interface for controlling the target application;
the sending the address identifier to the first terminal device includes:
and sending the address identifier and the control information of the target application to a first terminal device, so that the first terminal device sends the address identifier to a second terminal device A, and sends the control information to a second terminal device B, so that the second terminal device B can control the second terminal device A to display the target page.
12. A terminal device, comprising: one or more processors and memory;
the memory coupled with the one or more processors, the memory to store computer program code, the computer program code comprising computer instructions;
the computer instructions, when executed by the one or more processors, cause the terminal device to perform the method of any of claims 3-6, or the method of any of claims 7-11.
13. A chip system, characterized in that the chip system comprises a processor coupled with a memory, the processor executing a computer program stored in the memory to implement the method of any of claims 3 to 6, or the method of any of claims 7 to 11.
CN202110205358.8A 2021-02-24 2021-02-24 Terminal application control method, terminal equipment and chip system Pending CN115033313A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110205358.8A CN115033313A (en) 2021-02-24 2021-02-24 Terminal application control method, terminal equipment and chip system
PCT/CN2021/140198 WO2022179275A1 (en) 2021-02-24 2021-12-21 Terminal application control method, terminal device, and chip system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110205358.8A CN115033313A (en) 2021-02-24 2021-02-24 Terminal application control method, terminal equipment and chip system

Publications (1)

Publication Number Publication Date
CN115033313A true CN115033313A (en) 2022-09-09

Family

ID=83047781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110205358.8A Pending CN115033313A (en) 2021-02-24 2021-02-24 Terminal application control method, terminal equipment and chip system

Country Status (2)

Country Link
CN (1) CN115033313A (en)
WO (1) WO2022179275A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115981588A (en) * 2023-03-16 2023-04-18 中国邮电器材集团有限公司 Multi-terminal data display method, equipment and system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115964011B (en) * 2023-03-16 2023-06-06 深圳市湘凡科技有限公司 Method and related device for displaying application interface based on multi-screen cooperation
CN117130699A (en) * 2023-04-06 2023-11-28 荣耀终端有限公司 Interface display method and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110258535A1 (en) * 2010-04-20 2011-10-20 Scribd, Inc. Integrated document viewer with automatic sharing of reading-related activities across external social networks
CN104346329B (en) * 2013-07-23 2018-07-06 腾讯科技(深圳)有限公司 Realize method, apparatus and equipment that the uniform resource locator page is shared
CN109033466B (en) * 2018-08-31 2019-12-03 掌阅科技股份有限公司 Page sharing method calculates equipment and computer storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115981588A (en) * 2023-03-16 2023-04-18 中国邮电器材集团有限公司 Multi-terminal data display method, equipment and system
CN115981588B (en) * 2023-03-16 2023-09-26 中国邮电器材集团有限公司 Multi-terminal data display method, device and system

Also Published As

Publication number Publication date
WO2022179275A1 (en) 2022-09-01

Similar Documents

Publication Publication Date Title
CN109814766B (en) Application display method and electronic equipment
CN110471639B (en) Display method and related device
WO2020259452A1 (en) Full-screen display method for mobile terminal, and apparatus
WO2021000804A1 (en) Display method and apparatus in locked state
WO2020253758A1 (en) User interface layout method and electronic device
CN111316199B (en) Information processing method and electronic equipment
CN113961157B (en) Display interaction system, display method and equipment
WO2022179275A1 (en) Terminal application control method, terminal device, and chip system
US20230418630A1 (en) Operation sequence adding method, electronic device, and system
WO2022253158A1 (en) User privacy protection method and apparatus
WO2022135157A1 (en) Page display method and apparatus, and electronic device and readable storage medium
CN115333941A (en) Method for acquiring application running condition and related equipment
WO2021190524A1 (en) Screenshot processing method, graphic user interface and terminal
CN115022982B (en) Multi-screen cooperative non-inductive access method, electronic equipment and storage medium
CN114513575B (en) Method for collection processing and related device
CN114168115B (en) Communication system, application downloading method and device
WO2023016347A1 (en) Voiceprint authentication response method and system, and electronic devices
CN115730129A (en) Message pushing method, electronic equipment and system
CN116301510A (en) Control positioning method and electronic equipment
CN114692641A (en) Method and device for acquiring characters
CN115050058A (en) Fingerprint identification method and electronic equipment
CN114518965A (en) Cut and pasted content processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination