WO2022194005A1 - Control method and system for synchronous display across devices - Google Patents

Control method and system for synchronous display across devices

Info

Publication number
WO2022194005A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
control data
remote control
interface
display content
Prior art date
Application number
PCT/CN2022/079942
Other languages
English (en)
French (fr)
Inventor
杨帆
卢曰万
张乐乐
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022194005A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 - Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/147 - Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels

Definitions

  • the present application relates to the technical field of smart devices, and in particular, to a control method and system for synchronous display across devices.
  • Screen projection technology has been widely used in people's work and life.
  • For example, the user can connect the smart terminal to a display and show the content displayed by the smart terminal on the display, to increase the viewing field.
  • However, in the process of controlling the content shown on the display, the user needs to operate on the smart terminal. During this process, the user's line of sight needs to switch back and forth between the smart terminal and the display, and the operation is not very convenient.
  • an embodiment of the present application provides a control method for synchronous display across devices, including:
  • the remote control data is received by a first device having a first display, the first device being coupled to a second device having a second display, the first display synchronously displaying first display content, wherein the first display content is displayed on the second display;
  • the remote control data or control data matching the remote control data is sent to the second device;
  • second display content is received, wherein the second display content is obtained by adjusting the first display content according to the remote control data or the control data; and
  • the second display content is displayed on the first display.
  • the first device may send the remote control data received by the first device to the second device, and the first device synchronously displays the content displayed by the second device.
  • the second device may adjust the displayed first display content according to the remote control data, generate second display content, and send the second display content to the first device.
  • the first display is further configured to display an interface focus
  • the remote control data includes operation data on the interface focus
  • setting the interface focus in the display interface allows the user to visually recognize the position of the focus in the current display interface, so that operations can be performed according to the interface focus.
  • the interface focus is set on a control of the display interface.
  • the interface focus is set on the control of the display interface, so that the interface focus can be focused on a relatively important position in the display interface, thereby improving the user's adjustment efficiency of the display interface.
  • the sending the remote control data to the second device includes:
  • the remote control data is converted into a control command matching the second device at the first device end, then the second device directly executes the control command after receiving the control command.
  • the receiving the second display content includes:
  • the interface focus is superimposed on the position of the second display content.
  • the focus of the interface and the second display content can be superimposed on the first device side.
  • an embodiment of the present application provides a control method for synchronous display across devices, including:
  • the second display of the second device displays the first display content
  • the second device receives remote control data, wherein the remote control data is for a first device having a first display, the second device is coupled to the first device, and the first display synchronously displays the first display content;
  • the second display content is sent.
  • the first device may send the remote control data received by the first device to the second device, and the first device synchronously displays the content displayed by the second device.
  • the second device may adjust the displayed first display content according to the remote control data, generate second display content, and send the second display content to the first device.
  • the remote control data includes operation data for the interface focus displayed on the first display.
  • setting the interface focus in the display interface allows the user to visually recognize the position of the focus in the current display interface, so that operations can be performed according to the interface focus.
  • the adjusting the first display content according to the remote control data includes:
  • the first display content is adjusted according to the new position to generate the second display content.
  • the second display content is determined according to the new position of the interface focus, so that the display of the interface focus can automatically meet the preset requirements.
  • the sending the second display content includes:
  • the second device may send the second display content of the superimposed interface focus to the first device after completing the superimposition of the interface focus, so that the first device may directly display the second display content.
  • the sending the second display content includes:
  • the second display content and the new position of the interface focus are sent.
  • the first device may implement the superimposition of the interface focus and the second display content.
  • the interface focus includes a control in the first display content.
  • the interface focus is set on the control of the display interface, so that the interface focus can be focused on a relatively important position in the display interface, thereby improving the user's adjustment efficiency of the display interface.
  • the remote control data includes a control instruction matching the second device converted from the original remote control data according to a preset conversion rule.
  • that is, the remote control data includes a control instruction that has been converted by the first device according to a preset conversion rule.
  • the adjusting the first display content according to the remote control data includes:
  • the first display content is adjusted by using the control instruction.
  • the second device can complete the conversion of the original remote control data by itself, and generate a matching control instruction.
  • converting the remote control data into a control instruction matching the second device according to a preset conversion rule includes:
  • the remote control data is parsed into a control instruction matching the second device according to the preset parsing rule, and the preset parsing rule is related to the identification information of the first device.
  • when first devices are diverse and the design rules of the remote control data differ, the second device may determine, according to the identification information of the first device, a preset parsing rule that matches the first device, and use the preset parsing rule to parse the remote control data.
  • embodiments of the present application provide a first device, including a first display, a first network module, a first data receiving module, and a first data sending module, wherein,
  • the first network module for coupling with a second device having a second display
  • the first display, configured to synchronously display the first display content, wherein the first display content is displayed by the second display, and further configured to display the second display content;
  • the first data receiving module is used to receive remote control data and to receive the second display content, where the second display content is obtained by the second device adjusting the first display content according to the remote control data or the control data matching the remote control data;
  • the first data sending module is configured to send the remote control data or the control data to the second device.
  • the first display is further configured to display an interface focus
  • the remote control data includes operation data on the interface focus
  • the interface focus is set on a control of the display interface.
  • the first device further includes a first data processing module
  • the first data processing module is used to convert the remote control data into a control instruction matching the second device according to a preset conversion rule
  • the first data sending module is configured to send the control instruction to the second device.
  • the first data receiving module is specifically used for:
  • the interface focus is superimposed on the position of the second display content.
  • the embodiments of the present application further provide a second device, where the second device includes a second display, a second network module, a second data receiving module, a second data processing module, and a second data sending module, wherein,
  • the second display for displaying the first display content
  • the second network module is configured to be coupled with a first device having a first display, and the first display synchronously displays the first display content;
  • the second data receiving module configured to receive remote control data for the first device or control data matching the remote control data
  • the second data processing module configured to adjust the first display content according to the remote control data or the control data, and generate a second display content
  • the second data sending module is used for sending the second display content.
  • the remote control data includes operation data for the interface focus displayed on the first display.
  • the second data processing module is specifically used for:
  • the first display content is adjusted according to the new position to generate the second display content.
  • the second data sending module is specifically used for:
  • the second data sending module is specifically used for:
  • the second display content and the new position of the interface focus are sent.
  • the interface focus includes a control in the first display content.
  • control data includes a control instruction executable by the terminal device obtained by converting the remote control data according to a preset conversion rule.
  • the second data processing module is specifically used for:
  • the first display content is adjusted by using the control instruction.
  • the second data processing module is specifically used for:
  • the remote control data is parsed into a control instruction matching the second device according to the preset parsing rule.
  • embodiments of the present application provide a terminal device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured, when executing the instructions, to implement the control method for synchronous display across devices of the first aspect or one or more of the multiple possible implementations of the first aspect.
  • an embodiment of the present application provides a control system for synchronous display across devices, including the first device in the third aspect or one of the multiple possible implementations of the third aspect, and the second device in the fourth aspect or one of the multiple possible implementations of the fourth aspect.
  • embodiments of the present application provide a computer program product, comprising computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code, wherein when the computer-readable code runs in an electronic device, the processor in the electronic device executes the control method for synchronous display across devices of the first aspect or one or more of the multiple possible implementations of the first aspect.
  • an embodiment of the present application provides a chip, where the chip includes at least one processor, where the processor is configured to run a computer program or computer instructions stored in a memory to execute any method that may be implemented in any of the foregoing aspects.
  • the chip may further include a memory for storing computer programs or computer instructions.
  • the chip may further include a communication interface for communicating with other modules other than the chip.
  • one or more chips may constitute a chip system.
  • FIG. 1 shows a schematic structural diagram of a control system for synchronous display across devices according to an embodiment of the present application.
  • FIG. 2 shows a schematic structural diagram of the modules of the first device 101 and the remote control device 103 .
  • FIG. 3 shows a working flow chart of the control system for synchronous display across devices.
  • FIG. 4 shows a schematic flowchart of an embodiment of a method for controlling synchronous display across devices.
  • FIG. 5 shows a schematic diagram of a user interface 500 having multiple controls.
  • FIG. 6 shows a schematic diagram of a user interface 600 .
  • FIG. 7 shows a schematic flowchart of another embodiment of a method for controlling synchronous display across devices.
  • FIG. 8 shows a method flowchart of an embodiment of the method for adjusting the first display interface.
  • FIG. 9 shows a schematic diagram of a user interface 900 .
  • FIG. 10 shows a schematic diagram of a user interface 1000 .
  • FIG. 11 shows a schematic diagram of a module structure of an embodiment of the first device 101 and the second device 105 .
  • FIG. 12 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
  • FIG. 13 shows a block diagram of a software structure of a terminal device according to an embodiment of the present application.
  • "/" may indicate that the objects associated before and after are in an "or" relationship; for example, A/B may indicate A or B. "and/or" may be used to describe three kinds of relationships between associated objects; for example, A and/or B can mean that A exists alone, A and B exist at the same time, or B exists alone, where A and B can be singular or plural.
  • words such as "first" and "second" may be used to distinguish technical features with the same or similar functions. Words such as "first" and "second" do not limit the quantity or the execution order, nor do they indicate a definite difference.
  • words such as "exemplary" or "for example" are used to represent examples, illustrations or explanations, and any embodiment or design solution described as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or design solutions.
  • the use of words such as “exemplary” or “such as” is intended to present the relevant concepts in a specific manner to facilitate understanding.
  • when technical features are distinguished by "first", "second", "third", "A", "B", "C", "D", etc., there is no order of precedence or magnitude among the technical features so described.
  • FIG. 1 is a schematic structural diagram of a cross-device synchronous display control system 100 provided by an embodiment of the present application.
  • the system includes a first device 101 , a remote control device 103 of the first device 101 , and a second device 105 .
  • the first device 101 has a first display 107
  • the second device 105 has a second display 109 .
  • the first device 101 may include a device with a display function such as a smart display device, a smart TV, a projection device, etc.
  • the remote control device 103 may include a remote control matched with the smart display device, the smart TV, or the projection device, such as a TV remote control, a display device remote control, a projector remote control, etc.
  • the second device 105 may include a smart terminal with a display, such as a smart phone, a tablet computer, a personal digital assistant (PDA), etc.
  • a data transmission channel is established between the first device 101 and the second device 105, for example, the first device 101 and the second device 105 access the same wireless local area network (Wireless Local Area Networks, WLAN) (such as Wireless Fidelity (Wireless Fidelity, Wi-Fi) network), through the wireless local area network, data can be transferred between the first device 101 and the second device 105.
  • the data transmission channel is not limited to the above-mentioned wireless local area network, for example, it may also include a short-range wireless communication channel, such as bluetooth, infrared, etc., and may even include a wired connection, which is not limited in this application.
  • the second device 105 can transmit the content displayed by the second display 109 to the first device 101 through the data transmission channel, and the first device 101 can display the content transmitted by the second device 105 on the first display 107 .
  • For example, the first device 101 and the second device 105 are connected to the same Wi-Fi network, and based on a mirroring protocol, the second device 105 can send screenshot images to the first device 101 at a rate of several frames per second, and the first device 101 can continuously display the screenshot images transmitted by the second device 105.
  • the mirroring protocol may include, for example, any protocol that can synchronize the display interface of the second device 105 to the first device 101, such as airplay mirroring, lelink mirroring, etc., which is not limited herein.
  • the smart phone and the smart display device are connected to the same Wi-Fi network. Through the Wi-Fi network, the smart phone can transmit the displayed data content to the smart display device and display it on the display screen to achieve The effect of synchronizing smartphones and smart display devices.
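  • As a rough illustration of such frame-based mirroring over the shared network, the following sketch pushes screenshot frames to the first device a few times per second; the captureFrame() hook and the length-prefixed framing are assumptions made for the example, and real mirroring protocols such as airplay or lelink are considerably more involved.

```kotlin
import java.io.DataOutputStream
import java.net.Socket

// Hypothetical sketch: push screenshot frames to the first device at a few frames per
// second over the shared network. captureFrame() is an assumed hook returning one
// encoded screenshot; the simple length-prefixed framing is also an assumption.
fun mirrorLoop(host: String, port: Int, captureFrame: () -> ByteArray) {
    Socket(host, port).use { socket ->
        val out = DataOutputStream(socket.getOutputStream())
        while (true) {
            val frame = captureFrame()
            out.writeInt(frame.size)   // announce the frame length
            out.write(frame)           // send the screenshot bytes
            out.flush()
            Thread.sleep(200)          // roughly 5 frames per second
        }
    }
}
```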
  • a data transmission channel may also be established between the first device 101 and the second device 105 through a screen-casting device. In particular, for a second device 105 that does not support a mirroring protocol, the screen-casting device may serve as a function expansion module of the second device 105, enabling the second device 105 to have the function of synchronous display across devices.
  • FIG. 2 shows an exemplary schematic structural diagram of establishing the data transmission channel by using the screen projection device 201.
  • As shown in FIG. 2, the screen projection device 201 may be connected to the first device 101, which may specifically include a wired connection, such as connecting the screen projection device 201 to a high-definition multimedia interface (High Definition Multimedia Interface, HDMI) of the first device 101; of course, wireless connection methods such as Bluetooth may also be included, which is not limited here.
  • the screen projection device 201 and the second device 105 can be connected to the routing device 203 at the same time, so that a data transmission channel between the screen projection device 201 and the second device 105, that is, a data transmission channel between the first device 101 and the second device 105, can be established.
  • the first device 101 and the remote control device 103 are matched with each other, and the user can send a control instruction to the first device 101 by operating the remote control device 103 .
  • the remote control device 103 may include a keyboard 301, an encoding module 303, a modulation module 305, a signal transmitting module 307, etc.
  • the first device 101 may include a signal receiving module 309 (such as a photoelectric conversion amplifier circuit), a demodulation module 311, a decoding module 313, an execution module 315, etc.
  • the encoding module 303 can encode the selected key, the modulation module 305 converts the generated code into a modulated wave, and then the signal transmitting module 307 emits the modulated wave.
  • the signal receiving module 309 can use the demodulation module 311 and the decoding module 313 to demodulate and decode the modulated wave, respectively, to generate a pulse signal with a certain frequency. Pulse signals of different frequencies correspond to different control commands respectively.
  • the execution module 315 can execute the corresponding control commands.
  • the signal transmitting module 307 may transmit the modulated wave through short-range radio waves such as infrared rays and Bluetooth, which is not limited in this application.
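  • As a rough illustration of this signal chain (key press, encoding, modulation and transmission on the remote control side; reception, demodulation, decoding and execution on the first device side), the sketch below maps key presses to codes and back; the key names and numeric codes are illustrative assumptions, not a real infrared or Bluetooth protocol.

```kotlin
// Hypothetical key-to-code tables standing in for the encoding module 303 and the
// decoding module 313; the codes are placeholders, not a real remote-control protocol.
fun encodeKey(key: String): Int = when (key) {
    "UP_KEY" -> 0x01
    "DOWN_KEY" -> 0x02
    "RETURN_KEY" -> 0x03
    else -> 0x00
}

fun decodeCode(code: Int): String = when (code) {
    0x01 -> "UP_KEY"
    0x02 -> "DOWN_KEY"
    0x03 -> "RETURN_KEY"
    else -> "UNKNOWN"
}
```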
  • the data transmission channel may include, for example, a wired channel, a wireless local area network, a short-range wireless communication channel (such as Bluetooth or infrared), etc.
  • the remote control device 103 may send remote control data to the first device 101, and the remote control data may include data such as the above-mentioned modulated wave.
  • the first device 101 may send the remote control data or the control data matching the remote control data to the second device 105.
  • control data may include control instructions executable by the second device 105 obtained by converting the remote control data according to a preset conversion rule.
  • the above parsing process may be performed on the first device 101 or may be performed on the second device 105, which is not limited in this application.
  • the second device 105 may adjust the first display content currently displayed by the second device 105 according to the control instruction parsed by using the remote control data, and generate the second display content.
  • the first display content includes a first display interface
  • the second display content may include a second display interface.
  • the second device 105 may send the second display interface to the first device 101 through the data transmission channel.
  • the first device 101 may display the second display interface.
  • Displaying the second display interface by the first device 101 may include that the first device 101 uses the first display 107 to display the same data content as the second display interface.
  • the first device 101 may, during the process of displaying the second display interface, adapt the size of the second display interface so that the adapted second display interface matches the size of the first display 107.
  • the remote control device 103 of the first device 101 can be used to control the second device 105 based on the data transmission channel between the first device 101 and the second device 105 .
  • In this way, the user only needs to use the remote control device 103 of the first device 101 to control the adjustment of the interface based on the interface displayed by the first device 101. From the perspective of the user experience, the user's line of sight does not need to switch back and forth between the first device 101 and the second device 105 to complete the adjustment of the display interface.
  • The control method for synchronous display across devices described in the present application will be described in detail below with reference to the accompanying drawings.
  • Although the present application provides method operation steps as shown in the following embodiments or drawings, more or fewer operation steps may be included in the method based on routine or non-creative effort.
  • For steps that logically have no necessary causal relationship, the execution order of these steps is not limited to the execution order provided by the embodiments of the present application.
  • During the actual control process of cross-device synchronous display, or when executed by an apparatus, the method can be executed sequentially or in parallel (e.g., in a parallel-processor or multi-threaded processing environment) according to the method shown in the embodiments or the accompanying drawings.
  • the first device 101 may send the remote control data received from the remote control device 103 to the second device 105 .
  • the remote control data needs to be converted into control data matching with the second device 105 before it can be recognized by the second device 105 .
  • the first device 101 may also convert the remote control data into control data matching the second device 105 before sending it to the second device 105 .
  • the control data includes a control instruction executable by the second device 105 obtained by converting the remote control data according to a preset conversion rule.
  • the preset conversion rule may include the correspondence between the remote control data and the control instructions identifiable by the second device 105, that is, a rule for converting the remote control data identifiable by the first device 101 into a control instruction identifiable by the second device 105.
  • Table 1 shows an example of the preset conversion rule. As shown in Table 1, in the example table of the preset conversion rule, the first column is the remote control data of the first device 101, such as "up key", "down key", "return key", etc., and a corresponding control instruction can be set for each item of remote control data.
  • the first device 101 can convert the remote control data into a control instruction of "move the user interface down" in the second device 105 according to the preset conversion rules shown in Table 1 .
  • the preset conversion rules are not limited to the examples shown in Table 1, and any conversion relationship between remote control data and control instructions can be set, which is not limited in this application.
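  • For illustration, a preset conversion rule in the spirit of Table 1 could be modelled as a simple lookup table, as sketched below; the key names and instruction names are assumptions made for the example, not taken from the patent.

```kotlin
// Illustrative preset conversion rule: remote control data recognizable by the first
// device mapped to control instructions executable by the second device.
enum class ControlInstruction { MOVE_INTERFACE_UP, MOVE_INTERFACE_DOWN, BACK }

val presetConversionRule: Map<String, ControlInstruction> = mapOf(
    "UP_KEY" to ControlInstruction.MOVE_INTERFACE_UP,
    "DOWN_KEY" to ControlInstruction.MOVE_INTERFACE_DOWN,   // cf. "move the user interface down"
    "RETURN_KEY" to ControlInstruction.BACK,
)

// Returns null when no rule matches, so the caller can fall back to forwarding the
// original remote control data to the second device instead.
fun convertRemoteData(rawKey: String): ControlInstruction? = presetConversionRule[rawKey]
```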
  • the remote control data conversion may also be completed by a third-party device, which is coupled to the first device 101.
  • the third-party device may include a cloud server or server cluster, and may also include an external device of the first device 101, such as a USB flash drive or other external storage or processing device.
  • the first device 101 converts the remote control data into control data matching the second device 105.
  • In this way, the second device 105 can directly execute the control data, reducing resource consumption on the second device 105 side.
  • Some first devices 101 may not have such processing capabilities, for example, display devices of relatively old models. In this case, the first device 101 can forward the original remote control data to the second device 105.
  • the second device 105 may parse the original remote control data into a control instruction matching the second device 105 according to a preset parsing rule.
  • the preset parsing rules may include the preset conversion rules shown in Table 1.
  • That is, the remote control data may include not only the control instructions parsed by the first device 101 as shown in Table 1, but also the original modulated waves, etc., which is not limited in this application.
  • the second device 105 converts the remote control data into a control instruction matching the second device according to a preset conversion rule, including:
  • S101 Acquire identification information of the first device 101
  • S103 Parse the remote control data into a control instruction matching the second device 105 according to a preset parsing rule, where the preset parsing rule is related to the identification information of the first device.
  • In the process of converting the remote control data, the second device 105 can first obtain the identification information of the first device 101; specifically, during the process of establishing the data transmission channel between the first device 101 and the second device 105, the second device 105 can obtain the identification information of the first device 101, and the identification information may include, for example, information such as the brand and model of the first device that can identify its identity.
  • Table 2 shows an exemplary table of the preset parsing rules. As shown in Table 2, for different first devices 101, the parsing rules between the corresponding remote control data and control instructions are different.
  • For example, for a given control instruction, the corresponding remote control data for the first device 101 identified as 0001 is the "up key", while the corresponding remote control data for the first device 101 identified as 0002 is the number key 2.
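  • A preset parsing rule selected by the first device's identification information, in the spirit of Table 2, might be sketched as a nested lookup, as shown below; the device identifiers and key codes are illustrative assumptions.

```kotlin
// Illustrative preset parsing rules keyed by the first device's identification
// information: the same instruction corresponds to different remote control data
// on different first devices (cf. devices 0001 and 0002 above).
val parsingRulesByDeviceId: Map<String, Map<String, String>> = mapOf(
    "0001" to mapOf("UP_KEY" to "MOVE_FOCUS_UP"),
    "0002" to mapOf("KEY_2" to "MOVE_FOCUS_UP"),
)

// S101: the identification information was acquired when the data channel was set up.
// S103: parse the raw remote control data with the rule matching that first device.
fun parseRemoteData(firstDeviceId: String, rawData: String): String? =
    parsingRulesByDeviceId[firstDeviceId]?.get(rawData)
```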
  • the second device 105 may adjust the first display interface currently displayed on the second device 105 according to the control command to generate a second display interface , and send the second display interface to the first device 101 .
  • the first device 101 can use the first display 107 to display the second display interface, so as to realize synchronous display with the second device 105.
  • In the process of displaying the second display interface, the first device 101 can adapt the size of the second display interface so that the adapted second display interface matches the size of the first display 107. From the user's viewing field of view, the images displayed on the first display 107 and the second display 109 are each adapted to the size of the respective display.
  • the first display 107 can not only display the display interface synchronized by the second device 105, but also display the interface focus in the display interface.
  • the interface focus includes a focused position in the display interface, and the interface focus in the display interface functions similarly to a mouse cursor.
  • the interface focus can be set at any position on the display interface, including controls, pictures, texts, and the like.
  • the interface focus may be displayed in any style that can be highlighted in the display interface, such as a cursor, a bounding box, a highlight, a mask, etc., for example, displaying a blinking cursor at the interface focus, adding a bounding box to the picture where the interface focus is located, or highlighting the control where the interface focus is located.
  • the remote control data may include manipulation of the interface focus.
  • the operations include, for example, moving the interface focus up, down, left, and right, and may also include operations on the page element where the interface focus is located, such as opening a link, picture, and the like where the interface focus is located.
  • the interface focus may be set on a control of the display interface.
  • the controls may include visual images provided in the display interface, such as buttons, file editing boxes, and the like.
  • the control may have an execution function or a function of triggering code to run and complete a response through an event.
  • FIG. 5 shows a user interface 500 having a plurality of controls. As shown in FIG. 5 , the interface focus in the user interface 500 is on control 2 .
  • the interface focus is set on the control of the display interface, so that the interface focus can be focused on a relatively important position in the display interface, thereby improving the user's adjustment efficiency of the display interface.
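  • A minimal sketch of moving the interface focus between the controls of a display interface such as the user interface 500 is given below; the index-based navigation is an assumption made for illustration, and a real interface would navigate between control positions.

```kotlin
// Illustrative focus navigation: controls of the display interface are addressed by
// index, and the focus moves to the previous or next control.
class FocusManager(private val controlCount: Int, startIndex: Int = 0) {
    var focusedIndex: Int = startIndex
        private set

    fun moveUp() {      // e.g. in response to an "up key" instruction
        focusedIndex = (focusedIndex - 1).coerceAtLeast(0)
    }

    fun moveDown() {    // e.g. in response to a "down key" instruction
        focusedIndex = (focusedIndex + 1).coerceAtMost(controlCount - 1)
    }
}
```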
  • when the remote control data includes operation data for the interface focus displayed by the first display 107, after receiving the remote control data, the second device 105 can adjust the first display interface to generate a second display interface. Specifically, in one embodiment, the second device 105 adjusting the first display interface according to the remote control data includes:
  • the new position of the interface focus may be determined according to the remote control data.
  • the new position may include, for example, the identifier of the control, the coordinates of the cursor in the display interface, and the like.
  • the control instructions obtained by parsing are all operation instructions for the interface focus. For example, the remote control data "up key" means moving the interface focus up, and in the case that the interface focus is set on a control, the remote control data "up key" corresponds to a control instruction of moving the interface focus to the previous control.
  • the first display interface may be adjusted according to the new position to generate a second display interface.
  • a specific adjustment manner may include adjusting the first display interface so that the new position of the interface focus is located in the upper half of the generated second display interface.
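  • The "keep the new focus position in the upper half" adjustment could be sketched roughly as follows, modelling the first display interface as a vertically scrollable page; the coordinate model and the names are assumptions, not taken from the patent.

```kotlin
// Illustrative adjustment: scroll the page just enough that the newly focused control
// lands in the upper half of the generated second display interface.
data class ControlBounds(val top: Int, val bottom: Int)   // in page coordinates

fun adjustScrollForFocus(focus: ControlBounds, viewportHeight: Int, scrollY: Int): Int {
    val centerInViewport = (focus.top + focus.bottom) / 2 - scrollY
    if (centerInViewport <= viewportHeight / 2) {
        return scrollY                                        // already in the upper half: keep the interface
    }
    return scrollY + (centerInViewport - viewportHeight / 2)  // move the page up
}
```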
  • An exemplary scenario is described below with reference to the application scenario diagram shown in FIG. 6, the method flowchart shown in FIG. 7, and the user interface diagrams shown in FIG. 8 and FIG. 9.
  • the first device 101 and the second device 105 are displaying the above-mentioned user interface 500 synchronously.
  • the user's line of sight can stay on the first display 107 of the first device 101.
  • the user wants to move the interface focus from the control 2 to the control 4.
  • To do this, the user can operate the remote control device 103, such as clicking the down key of the remote control device 103. From the perspective of the method execution of the second device 105, as shown in FIG. 7:
  • the second device 105 receives the remote control data “down key” from the first device 101 .
  • the second device 105 may convert the remote control data "down key” into a control instruction of "interface focus down" according to the preset parsing rule.
  • the second device 105 may determine a new position of the interface focus according to the above control instruction. For example, for the user interface 500 shown in FIG. 5 , it is determined that the new position of the interface focus is control 4 .
  • the new position of the interface focus, that is, the position of control 4, is still in the upper half of the user interface 500, so the generated user interface remains the user interface 500.
  • the second device 105 may move the first display interface upward, so that the new position of the interface focus is located in the upper half of the generated second display interface.
  • For example, the interface focus is on control 6, and after the interface focus is moved down to control 7, it is determined that control 7 is located in the lower half of the user interface 800; therefore, it is necessary to move the user interface 800 upward to display more content and generate the user interface 900 shown in FIG. 9.
  • the new position of the interface focus, that is, control 7, is located in the upper half of the user interface 900.
  • In this way, a comfortable page display can be automatically realized, so that the focus of the page is located at a conspicuous position on the display.
  • when the user chooses to execute the function of the corresponding control, the second device 105 may also jump from the first display interface to the second display interface.
  • the interface focus may be set on the first control in the new user interface, or may be set on the control at the middle position in the new user interface, which is not limited here.
  • the second display 109 of the second device 105 may display the interface focus, or may not display the interface focus.
  • when the new position of the interface focus in the second display interface is determined, the second device 105 can superimpose the interface focus on the second display interface at the new position and then send the result to the first device 101; after receiving the second display interface on which the interface focus is superimposed, the first device 101 can directly display it.
  • the manner of superimposing the interface focus on the second display interface may be performed according to a preset superimposition rule.
  • the overlay rule may include, for example, highlighting at a corresponding image position, adding a blinking cursor, adding a bounding box on a corresponding control image, and the like.
  • the superimposition rule may include any method capable of highlighting the interface focus in the second display interface, which is not limited in this application.
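  • One possible superimposition rule, drawing a bounding box at the interface focus position on the interface image, is sketched below; java.awt is used only to keep the example self-contained, and the highlight style is just one of the options listed above, not a rule prescribed by the patent.

```kotlin
import java.awt.BasicStroke
import java.awt.Color
import java.awt.image.BufferedImage

// Illustrative superimposition: draw a bounding box around the focused control on the
// interface image; the colour and stroke width are arbitrary choices.
fun superimposeFocus(frame: BufferedImage, x: Int, y: Int, width: Int, height: Int): BufferedImage {
    val g = frame.createGraphics()
    g.color = Color.YELLOW
    g.stroke = BasicStroke(4f)
    g.drawRect(x, y, width, height)   // bounding box at the interface focus position
    g.dispose()
    return frame
}
```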
  • the interface focus may also be superimposed on the second display interface by the first device 101 .
  • Specifically, the second device 105 may send the second display interface and the information of the new position to the first device 101.
  • After receiving them, the first device 101 may superimpose the interface focus at the new position of the second display interface. Since the sizes of the first display 107 and the second display 109 may not match, after receiving the new position information of the interface focus, the first device 101 may adapt the new position information to the corresponding position on the first display 107.
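  • The adaptation of the new position information to the first display might, for example, be a simple proportional scaling between the two display sizes, as sketched below; the scaling formula is an assumption, since the patent only states that the position is adapted.

```kotlin
// Illustrative adaptation of the focus position from second-display coordinates to
// first-display coordinates by proportional scaling.
data class FocusPosition(val x: Int, val y: Int)

fun adaptToFirstDisplay(
    p: FocusPosition,
    srcWidth: Int, srcHeight: Int,   // second display 109 size
    dstWidth: Int, dstHeight: Int,   // first display 107 size
): FocusPosition = FocusPosition(p.x * dstWidth / srcWidth, p.y * dstHeight / srcHeight)
```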
  • the user sends the remote control data "Down key" through the remote control 103 , trying to move the position of the interface focus from control 2 down to control 4 .
  • the second device 105 adjusts the user interface 500 to the user interface 1000 shown in FIG. 10 .
  • the position of control 4 remains in the upper half of the user interface, so user interface 1000 and user interface 500 are the same.
  • the second device 105 also determines the position of the interface focus in the user interface 1000 , that is, the position of the control 4 , and sends the position information of the control 4 to the first device 101 .
  • After receiving the position information of control 4, the first device 101 can superimpose the interface focus on the position of control 4 in the user interface 1000, generating the effect shown in FIG. 10.
  • the present application also proposes a control method for synchronous display across devices from the perspective of the first device 101 , and the control method can be applied to any first device 101 with a display function, such as smart display devices, smart TVs, and projection devices.
  • the first device 101 is coupled to a second device 105 having a second display 109 , and the first display 107 of the first device simultaneously displays the first display content of the second display 109 .
  • the first device may receive remote control data from the corresponding remote control device 103 . Based on the coupling relationship with the second device 105 , the first device 101 sends the remote control data or the control data matching the remote control data to the second device 105 .
  • the first device 101 may also receive second display content, where the second display content includes the display content after the first display content is adjusted according to the remote control data.
  • the first display is further configured to display an interface focus
  • the remote control data includes operation data on the interface focus
  • the interface focus is set on a control of the display interface.
  • control data includes a control instruction executable by the second device obtained by converting the remote control data according to a preset conversion rule.
  • the receiving the second display interface includes:
  • the interface focus is superimposed on the position of the second display interface.
  • the control method can be applied to the second device 105 with any data processing function, especially including smart phones, tablet computers and other devices.
  • the second device 105 with the second display 109 can receive remote control data or control data matching the remote control data, wherein the remote control data is for the first device 101 with the first display 107, the second device 105 is coupled with the first device 101, and the first display 107 displays the first display interface of the second display 109 synchronously.
  • the second device 105 may adjust the first display interface according to the remote control data or the control data, generate a second display interface, and finally send the second display interface to the first device 101 .
  • the remote control data includes operation data for the interface focus displayed on the first display.
  • the adjustment of the first display interface according to the remote control data includes:
  • the first display interface is adjusted according to the new position to generate a second display interface.
  • the sending the second display interface includes:
  • the sending the second display interface includes:
  • the interface focus includes a control in the first display interface.
  • control data includes a control instruction executable by the second device obtained by converting the remote control data according to a preset conversion rule.
  • the adjusting the first display interface according to the remote control data includes:
  • the first display interface is adjusted by using the control instruction.
  • converting the remote control data into a control instruction matching the second device according to a preset conversion rule includes:
  • the remote control data is parsed into a control instruction matching the second device according to the preset parsing rule.
  • FIG. 11 shows a schematic diagram of the module structure of this embodiment of the first device 101.
  • the first device 101 includes a first display 107, a first network module 1301, a first data receiving module 1303, and a first data sending module 1305, wherein,
  • the first network module 1301 is used for coupling with a second device having a second display;
  • the first display 107 is used to synchronously display the first display interface of the second display;
  • the first data receiving module 1303 is used to receive remote control data and to receive a second display interface, where the second display interface is the display interface obtained by adjusting the first display interface according to the remote control data or the control data;
  • the first data sending module 1305 is configured to send the remote control data or the control data to the second device.
  • the first display 107 is further configured to display an interface focus
  • the remote control data includes operation data on the interface focus
  • the interface focus is set on a control of the display interface.
  • the first device 101 further includes a first data processing module 1307,
  • the first data processing module 1307 is used to convert the remote control data into a control instruction matching the second device according to a preset conversion rule
  • the first data sending module 1305 is configured to send the control instruction to the second device.
  • the first data receiving module 1303 is specifically configured to:
  • the interface focus is superimposed on the position of the second display interface.
  • FIG. 11 shows a schematic diagram of a module structure of this embodiment of the second device 105.
  • the second device 105 includes a second display 109, a second network module 1301', a second data receiving module 1303', a second data processing module 1307', and a second data sending module 1305', wherein,
  • the second network module 1301' is used for coupling with the first device 101 having the first display 107, and the first display synchronously displays the first display interface of the second display;
  • the second data receiving module 1303' is configured to receive remote control data for the first device 101 or control data matching the remote control data;
  • the second data processing module 1307' is configured to adjust the first display interface according to the remote control data or the control data, and generate a second display interface;
  • the second data sending module 1305' is configured to send the second display interface.
  • the remote control data includes operation data for the interface focus displayed on the first display.
  • the second data processing module 1307' is specifically used for:
  • the first display interface is adjusted according to the new position to generate a second display interface.
  • the second data sending module 1305' is specifically used for:
  • the second data sending module 1305' is specifically used for:
  • the interface focus includes a control in the first display interface.
  • control data includes a control instruction executable by the terminal device obtained by converting the remote control data according to a preset conversion rule.
  • the second data processing module 1307' is specifically used for:
  • the first display interface is adjusted by using the control instruction.
  • the second data processing module 1307' is specifically used for:
  • the remote control data is parsed into a control instruction matching the second device according to the preset parsing rule.
  • the party that initiates cross-device transmission of data and sends the data may be referred to as the source (source) end, and the party that receives the data may be referred to as the sink (sink) end.
  • the device that is the source end in one pair of relationships may also be the sink end in another pair of relationships; that is, one terminal device may be the source end for one terminal device and the sink end for another terminal device.
  • the terminal equipment involved in this application may refer to a device with a wireless connection function.
  • the terminal device of the present application may also have the function of wired connection for communication.
  • the terminal device of the present application can be a touch screen or a non-touch screen.
  • the touch screen can control the terminal device by clicking, sliding, etc. on the display screen with a finger, a stylus pen, etc.
  • a non-touch-screen device can be connected to input devices such as a mouse, a keyboard, or a touch panel, and the terminal device can be controlled through the input devices.
  • the device without a screen can be a Bluetooth speaker without a screen.
  • the terminal device of the present application can be a smartphone, a netbook, a tablet computer, a notebook computer, a wearable electronic device (such as a smart bracelet or a smart watch), a TV, a virtual reality device, an audio system, an electronic ink device, etc.
  • FIG. 12 shows a schematic structural diagram of a terminal device according to an embodiment of the present application. Taking the terminal device as a mobile phone as an example, FIG. 12 shows a schematic structural diagram of the mobile phone 200 .
  • the mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 251, a wireless communication module 252, Audio module 270, speaker 270A, receiver 270B, microphone 270C, headphone jack 270D, sensor module 280, buttons 290, motor 291, indicator 292, camera 293, display screen 294, SIM card interface 295, etc.
  • the sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K (of course, the mobile phone 200 may also include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, etc., not shown in the figure).
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone 200 .
  • the mobile phone 200 may include more or fewer components than shown, or combine some components, or split some components, or use a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units; for example, the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 200 . The controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is cache memory.
  • the memory may hold instructions or data that have just been used or recycled by the processor 210 . If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided, and the waiting time of the processor 210 is reduced, thereby improving the efficiency of the system.
  • the processor 210 may execute the control method for synchronous display across devices provided by the embodiments of the present application.
  • the processor 210 may include different devices. For example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the control method for synchronous display across devices provided by the embodiments of the present application; for example, some algorithms in the control method are executed by the CPU, and another part of the algorithms is executed by the GPU, to obtain faster processing efficiency.
  • Display screen 294 is used to display images, videos, and the like.
  • Display screen 294 includes a display panel.
  • the display panel can be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), and so on.
  • cell phone 200 may include 1 or N display screens 294, where N is a positive integer greater than 1.
  • the display screen 294 may be used to display information entered by or provided to the user as well as various graphical user interfaces (GUIs).
  • display 294 may display photos, videos, web pages, or documents, and the like.
  • display 294 may display a graphical user interface.
  • the GUI includes a status bar, a hideable navigation bar, a time and weather widget, and an application icon, such as a browser icon.
  • the status bar includes operator name (eg China Mobile), mobile network (eg 4G), time and remaining battery.
  • the navigation bar includes a back button icon, a home button icon, and a forward button icon.
  • the status bar may further include a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like.
  • the graphical user interface may further include a Dock bar, and the Dock bar may include commonly used application icons and the like.
  • the display screen 294 may be an integrated flexible display screen, or a spliced display screen composed of two rigid screens and a flexible screen located between the two rigid screens.
  • the terminal device can establish a connection with other terminal devices through the antenna 1, the antenna 2 or the USB interface, and according to the cross-device synchronous display provided by the embodiment of the present application
  • the synchronous display control method transmits data and controls the display screen 294 to display the corresponding graphical user interface.
  • The camera 293 (a front camera or a rear camera, or a camera that can serve as both a front camera and a rear camera) is used to capture still images or video.
  • the camera 293 may include a photosensitive element such as a lens group and an image sensor, wherein the lens group includes a plurality of lenses (convex or concave) for collecting the light signal reflected by the object to be photographed, and transmitting the collected light signal to the image sensor .
  • the image sensor generates an original image of the object to be photographed according to the light signal.
  • Internal memory 221 may be used to store computer executable program code, which includes instructions.
  • the processor 210 executes various functional applications and data processing of the mobile phone 200 by executing the instructions stored in the internal memory 221 .
  • the internal memory 221 may include a storage program area and a storage data area.
  • the storage program area may store operating system, code of application programs (such as camera application, WeChat application, etc.), and the like.
  • the storage data area may store data created during the use of the mobile phone 200 (such as images and videos collected by the camera application) and the like.
  • the internal memory 221 may also store one or more computer programs 1310 corresponding to the control method for synchronous display across devices provided in the embodiments of the present application.
  • the one or more computer programs 1310 are stored in the aforementioned internal memory 221 and configured to be executed by the one or more processors 210, and the one or more computer programs 1310 include instructions that may be used to perform the control method for synchronous display across devices according to any of the foregoing embodiments.
  • the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the code of the control method for synchronous display across devices may also be stored in an external memory.
  • the processor 210 may execute the code of the control method of synchronous display across devices stored in the external memory through the external memory interface 220 .
  • the function of the sensor module 280 is described below.
  • the gyro sensor 280A can be used to determine the movement posture of the mobile phone 200 .
  • In some embodiments, the angular velocity of the mobile phone 200 about three axes (i.e., the x, y, and z axes) may be determined by using the gyro sensor 280A.
  • the gyro sensor 280A can be used to detect the current motion state of the mobile phone 200, such as shaking or still.
  • the gyro sensor 280A can be used to detect a folding or unfolding operation acting on the display screen 294 .
  • the gyroscope sensor 280A may report the detected folding operation or unfolding operation to the processor 210 as an event to determine the folding state or unfolding state of the display screen 294 .
  • The acceleration sensor 280B can detect the magnitude of the acceleration of the mobile phone 200 in various directions (generally along three axes), and can therefore also be used to detect the current motion state of the mobile phone 200, such as shaking or still. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 280B can be used to detect a folding or unfolding operation acting on the display screen 294. The acceleration sensor 280B may report the detected folding operation or unfolding operation to the processor 210 as an event to determine the folding state or unfolding state of the display screen 294.
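  • As a purely illustrative sketch of how such folding or unfolding motion might be surfaced to the processor, the Java snippet below registers a gyroscope listener through the standard Android SensorManager API and forwards each reading to a hand-written onFoldMotion() hook; the class name, the hook, and the delivery rate are assumptions made for illustration and are not part of the disclosed embodiments.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class FoldMotionMonitor implements SensorEventListener {
    private final SensorManager sensorManager;

    public FoldMotionMonitor(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    public void start() {
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        // Deliver gyroscope readings at a UI-friendly rate; deciding whether they
        // represent a fold or an unfold is left to higher-level logic, as above.
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds the angular velocity around the x, y and z axes (rad/s).
        onFoldMotion(event.values[0], event.values[1], event.values[2]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }

    private void onFoldMotion(float wx, float wy, float wz) {
        // Placeholder hook: a real implementation would accumulate these readings
        // and report a folding or unfolding event to the processor-side logic.
    }
}
```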
  • Proximity light sensor 280G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the mobile phone emits infrared light outward through light-emitting diodes.
  • The mobile phone uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone. When insufficient reflected light is detected, the mobile phone can determine that there is no object near it.
  • When the display screen in the embodiment of the present application is a foldable screen, the proximity light sensor 280G can be arranged on the first screen of the foldable display screen 294, and the proximity light sensor 280G can detect the folding angle or unfolding angle between the first screen and the second screen according to the optical path difference of the infrared signal.
  • the gyroscope sensor 280A (or the acceleration sensor 280B) may send the detected motion state information (such as angular velocity) to the processor 210 .
  • the processor 210 determines, based on the motion state information, whether the current state is the hand-held state or the tripod state (for example, when the angular velocity is not 0, it means that the mobile phone 200 is in the hand-held state).
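  • The hand-held versus tripod decision described above can be illustrated with a small helper; the noise-floor threshold and the names below are assumptions chosen for the sketch, since real gyroscope readings are never exactly zero.

```java
public final class HoldStateClassifier {
    // Assumed noise floor in rad/s; a strict comparison with 0 would never report "tripod".
    private static final float NOISE_FLOOR = 0.02f;

    public enum HoldState { HAND_HELD, TRIPOD }

    public static HoldState classify(float wx, float wy, float wz) {
        double magnitude = Math.sqrt(wx * wx + wy * wy + wz * wz);
        return magnitude > NOISE_FLOOR ? HoldState.HAND_HELD : HoldState.TRIPOD;
    }
}
```

  • In practice such a classifier would be fed by the angular-velocity readings that the gyroscope sensor 280A (or the acceleration sensor 280B) sends to the processor 210, and the threshold would be tuned per device.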
  • the fingerprint sensor 280H is used to collect fingerprints.
  • the mobile phone 200 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • The touch sensor 280K is also called a "touch panel".
  • The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, also called a "touchscreen".
  • the touch sensor 280K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 294 .
  • the touch sensor 280K may also be disposed on the surface of the mobile phone 200 , which is different from the location where the display screen 294 is located.
  • Exemplarily, the display screen 294 of the mobile phone 200 displays a main interface, and the main interface includes icons of multiple applications (such as a camera application, a WeChat application, etc.). The user taps the icon of the camera application in the main interface through the touch sensor 280K, which triggers the processor 210 to start the camera application and turn on the camera 293.
  • The display screen 294 then displays an interface of the camera application, such as a viewfinder interface.
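  • A minimal sketch of how a detected touch operation can be classified by its event type on the application-processor side is shown below; it uses the standard Android View.OnTouchListener callback, and the class name and comments are illustrative assumptions rather than part of the embodiment.

```java
import android.view.MotionEvent;
import android.view.View;

public class TouchTypeHandler implements View.OnTouchListener {
    @Override
    public boolean onTouch(View view, MotionEvent event) {
        // The touch sensor delivers the raw operation; the action type tells the
        // application processor what kind of touch event it is.
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                // Finger touched the screen, e.g. on an application icon.
                return true;
            case MotionEvent.ACTION_MOVE:
                // Finger is sliding; this could be the start of a drag operation.
                return true;
            case MotionEvent.ACTION_UP:
                // Finger lifted; a short DOWN/UP pair is normally treated as a tap.
                view.performClick();
                return true;
            default:
                return false;
        }
    }
}
```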
  • the wireless communication function of the mobile phone 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 200 may be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 251 can provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the mobile phone 200 .
  • the mobile communication module 251 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 251 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 251 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 251 may be provided in the processor 210 .
  • At least part of the functional modules of the mobile communication module 251 may be provided in the same device as at least part of the modules of the processor 210 .
  • the mobile communication module 251 may also be used for information interaction with other terminal devices.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 270A, the receiver 270B, etc.), or displays images or videos through the display screen 294 .
  • the modem processor may be a stand-alone device.
  • the modulation and demodulation processor may be independent of the processor 210, and may be provided in the same device as the mobile communication module 251 or other functional modules.
  • The wireless communication module 252 can provide wireless communication solutions applied on the mobile phone 200, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 252 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 252 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
  • the wireless communication module 252 can also receive the signal to be sent from the processor 210 , perform frequency modulation on the signal, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
  • In this embodiment of the present application, the wireless communication module 252 is used to transmit data to and from other terminal devices under the control of the processor 210. For example, when the processor 210 runs the control method for synchronous display across devices provided by the embodiments of the present application, the processor can control the wireless communication module 252 to send a judgment request to another terminal device, and can also receive a judgment result made by the other terminal device based on that request. The judgment result indicates whether the data to be transmitted can be transmitted to the other terminal device. The processor then controls the display screen 294 to display the judgment result, which provides intuitive visual feedback for the user, avoids erroneous and repeated operations, and improves operation efficiency.
  • The mobile phone 200 can implement audio functions, such as music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, the application processor, and the like.
  • the cell phone 200 may receive key 290 input and generate key signal input related to user settings and function control of the cell phone 200 .
  • the mobile phone 200 can use the motor 291 to generate vibration alerts (eg, vibration alerts for incoming calls).
  • the indicator 292 in the mobile phone 200 may be an indicator light, which may be used to indicate a charging state, a change in power, and may also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 295 in the mobile phone 200 is used to connect the SIM card. The SIM card can be contacted and separated from the mobile phone 200 by inserting into the SIM card interface 295 or pulling out from the SIM card interface 295 .
  • It should be understood that, in practical applications, the mobile phone 200 may include more or fewer components than those shown in FIG. 12, which is not limited in this embodiment of the present application.
  • The illustrated mobile phone 200 is merely an example; the mobile phone 200 may have more or fewer components than those shown, two or more components may be combined, or it may have a different component configuration.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the software system of the terminal device can adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of a terminal device.
  • FIG. 13 is a block diagram of a software structure of a terminal device according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as phone, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • The window manager may also be used to detect whether there is a cross-device transmission operation in this embodiment of the present application, such as a drag-and-drop operation.
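  • A minimal sketch of the display-size query mentioned above is given below; the helper class name is a hypothetical placeholder, and the code relies only on the long-standing WindowManager/DisplayMetrics API rather than anything specific to this application.

```java
import android.content.Context;
import android.util.DisplayMetrics;
import android.view.WindowManager;

public final class DisplayInfo {
    // Returns the current display size in pixels, which window-level logic could
    // use, for example, when adapting a projected interface to the local screen.
    public static int[] displaySizePx(Context context) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        DisplayMetrics metrics = new DisplayMetrics();
        wm.getDefaultDisplay().getMetrics(metrics);
        return new int[] {metrics.widthPixels, metrics.heightPixels};
    }
}
```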
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
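  • To make the "one or more views" idea concrete, the sketch below builds a display interface from exactly two views, one for text and one for a picture, using plain Android widgets; the activity name, the text, and the chosen system icon are illustrative assumptions only.

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageView;
import android.widget.LinearLayout;
import android.widget.TextView;

public class NotificationPreviewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // A display interface composed of two views: a text view and a picture view.
        LinearLayout root = new LinearLayout(this);
        root.setOrientation(LinearLayout.VERTICAL);

        TextView text = new TextView(this);
        text.setText("New short message");

        ImageView icon = new ImageView(this);
        icon.setImageResource(android.R.drawable.sym_action_email); // illustrative icon

        root.addView(text);
        root.addView(icon);
        setContentView(root);
    }
}
```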
  • the telephony manager is used to provide the communication function of the terminal device. For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the terminal device vibrates, and the indicator light flashes.
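  • The "download completion" notification mentioned above could be posted with a sketch like the following; the channel id, channel name, and texts are assumptions, and the code uses only the public NotificationManager and Notification.Builder APIs available since Android 8.0.

```java
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.content.Context;

public final class DownloadNotifier {
    private static final String CHANNEL_ID = "downloads"; // illustrative channel id

    public static void notifyDownloadComplete(Context context, String fileName) {
        NotificationManager nm =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
        // Channels are required on Android 8.0+; the user-visible name here is arbitrary.
        nm.createNotificationChannel(
                new NotificationChannel(CHANNEL_ID, "Downloads",
                        NotificationManager.IMPORTANCE_DEFAULT));
        Notification notification = new Notification.Builder(context, CHANNEL_ID)
                .setSmallIcon(android.R.drawable.stat_sys_download_done)
                .setContentTitle("Download complete")
                .setContentText(fileName)
                .build();
        nm.notify(1, notification);
    }
}
```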
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • An embodiment of the present application provides a control apparatus for synchronous display across devices, including: a processor and a memory for storing instructions executable by the processor; wherein the processor is configured to implement the above method when executing the instructions.
  • Embodiments of the present application provide a non-volatile computer-readable storage medium on which computer program instructions are stored, and when the computer program instructions are executed by a processor, implement the above method.
  • Embodiments of the present application provide a computer program product, including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code, where, when the computer-readable code runs in a processor of an electronic device, the processor in the electronic device executes the above method.
  • a computer-readable storage medium may be a tangible device that can hold and store instructions for use by the instruction execution device.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • More specific examples (a non-exhaustive list) of computer-readable storage media include: portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disc read-only memory (CD-ROM), digital versatile discs (DVD), memory sticks, floppy disks, mechanically encoded devices such as punch cards or raised structures in grooves on which instructions are stored, and any suitable combination of the foregoing.
  • Computer readable program instructions or code described herein may be downloaded to various computing/processing devices from a computer readable storage medium, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from a network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in each computing/processing device .
  • The computer program instructions used to perform the operations of the present application may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages.
  • The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
  • In some embodiments, electronic circuits, such as programmable logic circuits, Field-Programmable Gate Arrays (FPGA), or Programmable Logic Arrays (PLA), are personalized by utilizing state information of the computer-readable program instructions, and these electronic circuits can execute the computer-readable program instructions to implement various aspects of the present application.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium; these instructions cause a computer, a programmable data processing apparatus, and/or other devices to operate in a specific manner, so that the computer-readable medium on which the instructions are stored includes an article of manufacture comprising instructions for implementing various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices, to cause a series of operational steps to be performed on the computer, other programmable data processing apparatus, or other devices so as to produce a computer-implemented process, such that the instructions executed on the computer, other programmable data processing apparatus, or other devices implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • It should also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by hardware (e.g., circuits or Application-Specific Integrated Circuits (ASICs)) that performs the corresponding functions or actions, or by a combination of hardware and software, such as firmware.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Selective Calling Equipment (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

一种跨设备同步显示的控制方法,方法包括:具有第一显示器(107)的第一设备(101)接收遥控数据,第一设备(101)与具有第二显示器(109)的第二设备(105)相耦合,第一显示器(107)同步显示第一显示内容,其中,第一显示内容是第二显示器(109)显示的;将遥控数据或者与遥控数据相匹配的控制数据发送至第二设备(105);接收第二显示内容,第二显示内容是第二设备(105)根据遥控数据或者控制数据对第一显示内容调整所得到的;在第一显示器(107)显示第二显示内容。还公开了一种跨设备同步显示的控制系统。

Description

一种跨设备同步显示的控制方法及系统
本申请要求于2021年03月17日提交中国专利局、申请号为202110287070.X、申请名称为“一种跨设备同步显示的控制方法及系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及智能设备技术领域,尤其涉及一种跨设备同步显示的控制方法及系统。
背景技术
投屏技术已经广泛应用于人们的工作与生活中,利用显示器自带的投屏模块或者借助于外接的投屏设备,用户可以将智能终端连接显示器,并将智能终端所显示的内容投屏展示于显示器中,以增大观看视野。
相关技术中,用户在控制显示器所展示的内容的过程中,需要在智能终端上操作。在此过程中,用户的视线需要在智能终端和显示器上来回切换,操作不是很便捷。
因此,相关技术中亟需一种能够操作便捷的投屏方式。
发明内容
有鉴于此,提出了一种跨设备同步显示的控制方法及系统。
第一方面,本申请的实施例提供了一种跨设备同步显示的控制方法,包括:
具有第一显示器的第一设备接收遥控数据,所述第一设备与具有第二显示器的第二设备相耦合,所述第一显示器同步显示第一显示内容,其中,所述第一显示内容是所述第二显示器显示的;
发送所述遥控数据至所述第二设备;
接收第二显示内容,所述第二显示内容是所述第二设备根据所述遥控数据对所述第一显示内容调整所得到的;
在所述第一显示器显示所述第二显示内容。
本申请实施例中,第一设备可以将其接收到的遥控数据发送至第二设备,且所述第一设备同步显示所述第二设备显示的内容。第二设备在接收到所述遥控数据之后,可以根据所述遥控数据对显示的第一显示内容进行调整,生成第二显示内容,并将所述第二显示内容发送至第一设备。通过上述实施例的方式,用户在观看第二显示器内容时,在不需要将视线切换至第一设备的情况下,也可以实现对显示界面的调整,大大提升使用便捷度。
根据第一方面的第一种可能的实现方式,所述第一显示器还用于显示界面焦点,且所述遥控数据包括对所述界面焦点的操作数据。
本实施例中,在显示界面中设置界面焦点,可以让用户肉眼识别到当前显示界面中焦点的位置,从而可以根据所述界面焦点进行操作。
根据第一方面的第二种可能的实现方式,所述界面焦点设置于显示界面的控件上。
本申请实施例中,所述界面焦点设置于所述显示界面的控件上,可以使得所述界面焦点聚焦于所述显示界面中相对比较重要的位置处,提高用户对显示界面的调整效率。
根据第一方面的第三种可能的实现方式,所述发送所述遥控数据至所述第二设备,包括:
按照预设转换规则将所述遥控数据转换成与所述第二设备相匹配的控制指令;
发送所述控制指令至所述第二设备。
本申请实施例中,在第一设备端将所述遥控数据转换成与第二设备相匹配的控制指令,那么,第二设备在接收到所述控制指令之后,直接执行所述控制指令。
根据第一方面的第四种可能的实现方式,所述接收第二显示内容,包括:
接收第二显示内容以及所述界面焦点在所述第二显示内容中的位置;
将所述界面焦点叠加于所述第二显示内容的所述位置处。
本实施例中,可以在第一设备端实现对界面焦点与第二显示内容的叠加。
第二方面,本申请的实施例提供了一种跨设备同步显示的控制方法,包括:
第二设备的第二显示器显示第一显示内容;
所述第二设备接收遥控数据,其中,所述遥控数据针对具有第一显示器的第一设备,所述第二设备与所述第一设备相耦合,所述第一显示器同步显示所述第一显示内容;
根据所述遥控数据对所述第一显示内容进行调整,生成第二显示内容;
发送所述第二显示内容。
本申请实施例中,第一设备可以将其接收到的遥控数据发送至第二设备,且所述第一设备同步显示所述第二设备显示的内容。第二设备在接收到所述遥控数据之后,可以根据所述遥控数据对显示的第一显示内容进行调整,生成第二显示内容,并将所述第二显示内容发送至第一设备。通过上述实施例的方式,用户在观看第二显示器内容时,在不需要将视线切换至第一设备的情况下,也可以实现对显示界面的调整,大大提升使用便捷度。
根据第二方面的第一种可能的实现方式,所述遥控数据包括针对所述第一显示器所显示的界面焦点的操作数据。
本实施例中,在显示界面中设置界面焦点,可以让用户肉眼识别到当前显示界面中焦点的位置,从而可以根据所述界面焦点进行操作。
根据第二方面的第二种可能的实现方式,所述根据所述遥控数据对所述第一显示内容进行调整,包括:
根据所述遥控数据确定所述界面焦点的新位置;
根据所述新位置调整所述第一显示内容,生成第二显示内容。
本实施例中,根据所述界面焦点的新位置确定第二显示内容,可以自动化地使得所述界面焦点的显示满足预设要求。
根据第二方面的第三种可能的实现方式,所述发送所述第二显示内容,包括:
将所述界面焦点叠加于所述第二显示内容的所述新位置处;
发送叠加所述界面焦点之后的第二显示内容。
本实施例中,第二设备可以完成所述界面焦点的叠加之后,在将叠加界面焦点的第二显示内容发送至第一设备,这样,第一设备可以直接显示所述第二显示内容。
根据第二方面的第四种可能的实现方式,所述发送所述第二显示内容,包括:
发送所述第二显示内容和所述界面焦点的新位置。
本实施例中,可以由第一设备实现对所述界面焦点与所述第二显示内容的叠加。
根据第二方面的第五种可能的实现方式,所述界面焦点包括所述第一显示内容中的控件。
本申请实施例中,所述界面焦点设置于所述显示界面的控件上,可以使得所述界面焦点聚焦于所述显示界面中相对比较重要的位置处,提高用户对显示界面的调整效率。
根据第二方面的第六种可能的实现方式,所述遥控数据包括按照预设转换规则将原始遥控数据转换成的与所述第二设备相匹配的控制指令。
本申请实施例中,所述遥控数据包括第一设备已经根据预设转换规则转换之后的控制指令。
根据第二方面的第七种可能的实现方式,所述根据所述遥控数据对所述第一显示内容进行调整,包括:
按照预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令;
利用所述控制指令对所述第一显示内容进行调整。
本申请实施例中,第二设备可以自己完成对原始遥控数据的转换,生成相匹配的控制指令。
根据第二方面的第八种可能的实现方式,所述按照预设转换规则将所述遥控数据转换成与所述第二设备相匹配的控制指令,包括:
获取所述第一设备的标识信息;
按照所述预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令,所述预设解析规则与所述第一设备的标识信息相关。
本申请实施例中,在所述第一设备多样化且遥控数据的设计规则不相同的情况下,第一设备可以根据所述第一设备的标识信息,确定与所述第一设备相匹配的预设解析规则,并利用所述预设解析规则解析所述遥控数据。
第三方面,本申请的实施例提供了第一设备,包括第一显示器、第一网络模块、第一数据接收模块、第一数据发送模块,其中,
所述第一网络模块,用于与具有第二显示器的第二设备相耦合;
所述第一显示器,用于同步显示第一显示内容,其中,所述第一显示内容是所述第二显示器显示的,以及用于显示第二显示内容;
所述第一数据接收模块,用于接收遥控数据,以及,用于接收所述第二显示内容,所述第二显示内容是所述第二设备根据所述遥控数据或者与所述遥控数据相匹配的控制数据对所述第一显示内容调整所得到的;
所述第一数据发送模块,用于将所述遥控数据或者所述控制数据发送至所述第二设备。
可选的,在本申请的一个实施例中,所述第一显示器还用于显示界面焦点,且所 述遥控数据包括对所述界面焦点的操作数据。
可选的,在本申请的一个实施例中,所述界面焦点设置于显示界面的控件上。
可选的,在本申请的一个实施例中,第一设备还包括第一数据处理模块,
所述第一数据处理模块,用于按照预设转换规则将所述遥控数据转换成与所述第二设备相匹配的控制指令;
对应地,所述第一数据发送模块,用于发送所述控制指令至所述第二设备。
可选的,在本申请的一个实施例中,所述第一数据接收模块,具体用于:
接收第二显示内容以及所述界面焦点在所述第二显示内容中的位置;
将所述界面焦点叠加于所述第二显示内容的所述位置处。
第四方面,本申请的实施例提供了终端设备还提供第二设备,第二设备包括第二显示器、第二网络模块、第二数据接收模块、第二数据处理模块、第二数据发送模块,其中,
所述第二显示器,用于显示第一显示内容;
所述第二网络模块,用于与具有第一显示器的第一设备相耦合,所述第一显示器同步显示所述第一显示内容;
所述第二数据接收模块,用于接收针对所述第一设备的遥控数据或者与所述遥控数据相匹配的控制数据;
所述第二数据处理模块,用于根据所述遥控数据或者所述控制数据对所述第一显示内容进行调整,并生成第二显示内容;
所述第二数据发送模块,用于发送所述第二显示内容。
可选的,在本申请的一个实施例中,所述遥控数据包括针对所述第一显示器所显示的界面焦点的操作数据。
可选的,在本申请的一个实施例中,所述第二数据处理模块,具体用于:
根据所述遥控数据或者所述控制数据确定所述界面焦点的新位置;
根据所述新位置调整所述第一显示内容,生成第二显示内容。
可选的,在本申请的一个实施例中,所述第二数据发送模块,具体用于:
将所述界面焦点叠加于所述第二显示内容的所述新位置处;
发送叠加所述界面焦点之后的第二显示内容。
可选的,在本申请的一个实施例中,所述第二数据发送模块,具体用于:
发送所述第二显示内容和所述界面焦点的新位置。
可选的,在本申请的一个实施例中,所述界面焦点包括所述第一显示内容中的控件。
可选的,在本申请的一个实施例中,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的所述终端设备可执行的控制指令。
可选的,在本申请的一个实施例中,所述第二数据处理模块,具体用于:
按照预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令;
利用所述控制指令对所述第一显示内容进行调整。
可选的,在本申请的一个实施例中,所述第二数据处理模块,具体用于:
获取所述第一设备的标识信息;
根据所述标识信息确定与所述第一设备相匹配的预设解析规则;
按照所述预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令。
第五方面,本申请的实施例提供了一种终端设备,包括:处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为执行所述指令时实现上述第一/二方面或者第一/二方面的多种可能的实现方式中的一种或几种的跨设备同步显示的控制方法。
第六方面,本申请的实施例提供了一种跨设备同步显示的控制系统,包括上述第三方面或者第三方面的多种可能的实现方式中的第一设备和上述第四方面或者第四方面的多种可能的实现方式中的第二设备。
第七方面,本申请的实施例提供了一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器执行上述第一方面或者第一方面的多种可能的实现方式中的一种或几种的跨设备同步显示的控制方法。
第八方面,本申请实施例提供一种芯片,该芯片包括至少一个处理器,该处理器用于运行存储器中存储的计算机程序或计算机指令,以执行上述各方面任一项可能实现的方法。
可选的,该芯片还可以包括存储器,该存储器用于存储计算机程序或计算机指令。
可选的,该芯片还可以包括通信接口,用于与芯片以外的其他模块进行通信。
可选的,一个或多个芯片可以构成芯片系统。
本申请的这些和其他方面在以下(多个)实施例的描述中会更加简明易懂。
附图说明
包含在说明书中并且构成说明书的一部分的附图与说明书一起示出了本申请的示例性实施例、特征和方面,并且用于解释本申请的原理。
图1示出根据本申请一实施例的一种跨设备同步显示的控制系统的结构示意图。
图2示出第一设备101和遥控设备103的模块结构示意图。
图3示出所述跨设备同步显示的控制系统的工作流程图。
图4示出跨设备同步显示的控制方法的一种实施例的方法流程示意图。
图5展示了具有多个控件的用户界面500的示意图。
图6展示了用户界面600的示意图。
图7示出跨设备同步显示的控制方法的另一种实施例的方法流程示意图。
图8示出调整所述第一显示界面方法的一种实施例的方法流程图。
图9展示了用户界面900的示意图。
图10展示了用户界面1000的示意图。
图11展示了第一设备101和第二设备105的一种实施例的模块结构示意图。
图12示出根据本申请一实施例的终端设备的结构示意图。
图13示出根据本申请一实施例的终端设备的软件结构框图。
具体实施方式
以下将参考附图详细说明本申请的各种示例性实施例、特征和方面。附图中相同的附图标记表示功能相同或相似的元件。尽管在附图中示出了实施例的各种方面,但是除非特别指出,不必按比例绘制附图。
在这里专用的词“示例性”意为“用作例子、实施例或说明性”。这里作为“示例性”所说明的任何实施例不必解释为优于或好于其它实施例。
另外,为了更好的说明本申请,在下文的具体实施方式中给出了众多的具体细节。本领域技术人员应当理解,没有某些具体细节,本申请同样可以实施。在一些实例中,对于本领域技术人员熟知的方法、手段、元件和电路未作详细描述,以便于凸显本申请的主旨。
在本申请实施例中,“/”可以表示前后关联的对象是一种“或”的关系,例如,A/B可以表示A或B;“和/或”可以用于描述关联对象存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,其中A,B可以是单数或者复数。为了便于描述本申请实施例的技术方案,在本申请实施例中,可以采用“第一”、“第二”等字样对功能相同或相似的技术特征进行区分。该“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。在本申请实施例中,“示例性的”或者“例如”等词用于表示例子、例证或说明,被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念,便于理解。
在本申请实施例中,对于一种技术特征,通过“第一”、“第二”、“第三”、“A”、“B”、“C”和“D”等区分该种技术特征中的技术特征,该“第一”、“第二”、“第三”、“A”、“B”、“C”和“D”描述的技术特征间无先后顺序或者大小顺序。
另外,为了更好的说明本申请,在下文的具体实施方式中给出了众多的具体细节。本领域技术人员应当理解,没有某些具体细节,本申请同样可以实施。在一些实例中,对于本领域技术人员熟知的方法、手段、元件和电路未作详细描述,以便于凸显本申请的主旨。
为了便于理解本申请实施例,下面先对本申请实施例提供的其中一种跨设备同步显示的控制系统的结构进行描述。请参见图1,图1是本申请实施例提供的一种跨设备同步显示的控制系统100的结构示意图,该系统包括第一设备101、第一设备101的遥控设备103、第二设备105。其中,第一设备101具有第一显示器107,第二设备105具有第二显示器109。在一些示例中,第一设备101可以包括智能显示设备、智能电视、投影设备等具有显示功能的设备,遥控设备103可以包括与所述智能显示设备、所述智能电视、所述投影设备配套的遥控器,如电视遥控器、显示设备遥控器、投影仪遥控器等。第二设备105可以包括带有显示器的智能终端,如智能手机、平板电脑、个人数字助理(PDA)等,第一显示器107的尺寸一般大于第二显示器109的尺寸。
第一设备101和第二设备105之间建立有数据传输通道,例如第一设备101和第二设备105接入同一个无线局域网(Wireless Local Area Networks,WLAN)(如无线保真(Wireless Fidelity,Wi-Fi)网络),通过所述无线局域网,第一设备101和第二 设备105之间可以传递数据。当然,所述数据传输通道不限于上述无线局域网,例如还可以包括近距离无线通信通道,如蓝牙、红外线等,甚至还可以包括有线连接,本申请在此不做限制。第二设备105可以通过所述数据传输通道将第二显示器109所显示的内容传输至第一设备101,第一设备101可以在第一显示器107上显示第二设备105所传输的内容,这样,可以形成第一显示器107和第二显示器109同步显示的效果。在一个具体的示例中,第一设备101和第二设备105接入同一个Wi-Fi网络,基于镜像协议,第二设备105可以以每秒若干帧的速率发送截屏图像给第一设备101,第一设备101可以不断展示第二设备105传输过来的截屏图像。所述镜像协议例如可以包括airplay镜像、lelink镜像等任何可以能够将第二设备105的显示界面同步给第一设备101的协议,在此不做限制。在一个具体的场景中,智能手机与智能显示设备接入同一个Wi-Fi网络,通过Wi-Fi网络,智能手机能够将显示的数据内容传输给智能显示设备,并在显示屏上展示,实现将智能手机和智能显示设备同步展示的效果。
当然,在其他实施例中,第一设备101和第二设备105之间还可以通过投屏设备建立数据传输通道,尤其对于不具有镜像协议的第二设备105,可以将所述投屏设备作为第二设备105的功能扩展模块,使得第二设备105具备跨设备同步显示的功能。图2展示了利用投屏设备201建立所述数据传输通道的示例性结构示意图。如图2所示,投屏设备201可以与第一设备101连接,具体可以包括有线连接,如将投屏设备201与第一设备101的高清多媒体接口(High Definition Multimedia Interface,HDMI)相连接,当然,也可以包括蓝牙等无线连接方式,在此不做限制。投屏设备201和第二设备105可以同时连接路由设备203,这样,可以建立投屏设备201到第二设备105之间的数据传输通道,也就是建立第一设备101和第二设备105之间的数据传输通道。
第一设备101和遥控设备103之间相互匹配,用户可以通过对遥控设备103进行操作,向第一设备101发送控制指令。在一个具体的示例中,如图3所示,遥控设备103可以包括键盘301、编码模块303、调制模块305、信号发射模块307等,第一设备101可以包括信号接收模块309(如光电转换放大电路)、解调模块311、解码模块313、执行模块315等。具体来说,用户在选择所述键盘301中的按键之后,编码模块303可以对选择的所述按键进行编码,并通过调制模块305将生成的编码转化成调制波,再通过信号发射模块307将所述调制波发射出去。在第一设备101一侧,信号接收模块309接收到所述调制波之后,可以分别利用解调模块311和解码模块313对所述调制波解调、解码之后,生成具有一定频率的脉冲信号。不同频率的脉冲信号分别对应不同的控制指令,基于此,第一设备101在识别出脉冲信号所对应的控制指令之后,可以由执行模型315执行对应的控制指令。需要说明的是,在本申请实施例中,所述信号发射模块307可以通过红外线、蓝牙等近距离无线电波传输所述调制波,本申请在此不做限制。
下面结合附图4所示的交互流程图说明所述跨设备同步显示的控制系统的工作流程。
如图4所示,在本申请的实施例中,在步骤1之前,第一设备101和第二设备105之间可以建立数据传输通道,例如有线通道、无线局域网、近距离无线通信通道(如蓝牙、红外线)等。步骤1中,在用户的操作下,遥控设备103可以向第一设备101 发送遥控数据,遥控数据可以包括上述调制波等数据。步骤2中,第一设备101可以将该遥控数据或者与该遥控数据相匹配的控制数据发送至第二设备105。在此,控制数据可以包括将遥控数据按照预设转换规则转换得到的第二设备105可执行的控制指令。当然,上述解析的过程可以在第一设备101上执行,也可以在第二设备105上执行,本申请在此不做限制。步骤3中,第二设备105可以根据利用所述遥控数据所解析得到的控制指令,对第二设备105当前显示的第一显示内容进行调整,并生成第二显示内容。第一显示内容包括第一显示界面,第二显示内容可以包括第二显示界面。步骤4中,第二设备105可以通过所述数据传输通道将第二显示界面发送至第一设备101。步骤5中,第一设备101可以显示第二显示界面。第一设备101显示第二显示界面可以包括第一设备101利用第一显示器107展示与第二显示界面相同的数据内容。示例性地,在第一设备101的第一显示器107与第二设备105的第二显示器105的尺寸不相匹配的情况下,第一设备101在显示第二显示界面的过程中,可以对该第二显示界面进行尺寸适配,使得适配后的第二显示界面与第一显示器107的尺寸相匹配。
利用本申请实施例所提供的跨设备同步显示的控制方法,可以基于第一设备101和第二设备105之间的数据传输通道,利用第一设备101的遥控设备103对第二设备105进行控制。通过该方式,用户在利用第一设备101同步显示第二设备105的显示界面的过程中,只需要基于第一设备101所显示的界面,利用第一设备101的遥控设备103控制界面的调整。从用户的使用体验感的角度来说,用户的视线不需要在第一设备101和第二设备105之间来回切换,即可完成对显示界面的调整。
下面结合附图对本申请所述的跨设备同步显示的控制方法进行详细的说明。虽然本申请提供了如下述实施例或附图所示的方法操作步骤,但基于常规或者无需创造性的劳动在所述方法中可以包括更多或者更少的操作步骤。在逻辑性上不存在必要因果关系的步骤中,这些步骤的执行顺序不限于本申请实施例提供的执行顺序。所述方法在实际中的跨设备同步显示的控制过程中或者装置执行时,可以按照实施例或者附图所示的方法顺序执行或者并行执行(例如并行处理器或者多线程处理的环境)。
下面对图4所示的所述跨设备同步显示的控制方法的实施例进行详细说明。
本申请实施例中,如图4所示,第一设备101可以将从遥控设备103接收到的遥控数据发送至第二设备105。在实际应用中,第一设备101和遥控设备103之间具有特定的编码解码协议,对于遥控设备103所发送的遥控数据,一般只有第一设备101能够解析出其对应的含义。因此,需要将该遥控数据转换成与第二设备105相匹配的控制数据,才可以被第二设备105所识别。基于此,在本申请的一个实施例中,第一设备101还可以将该遥控数据转换成与第二设备105相匹配的控制数据之后,再发送给第二设备105。
本申请实施例中,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的第二设备105可执行的控制指令。其中,所述预设转换规则可以包括所述遥控数据与第二设备105可识别的控制指令之间的对应关系,也就是将第一设备101可识别的遥控数据转换成第二设备105可识别的控制指令。表1展示了预设转换规则的一种示例,如表1所示,在预设转换规则示例表格中,第一栏为第一设备101的遥控数据,如“向上键”、“向下键”、“返回键”等等,对应的可以设置遥控数据分别对应的 控制指令。例如,第一设备101在接收到“向下键”的遥控数据之后,可以按照表1所示的预设转换规则将遥控数据转换成第二设备105中“用户界面往下移”的控制指令。当然,所述预设转换规则不限于表1所示的示例,可以设置任何的遥控数据到控制指令的转换关系,本申请在此不做限制。
表1 预设转换规则示例表
Figure PCTCN2022079942-appb-000001
当然,在其他实施例中,还可以由第三方设备完成对遥控数据转换,第三方设备与第一设备101相耦合,例如,第三方设备可以包括云端服务器或者服务器集群,还可以包括第一设备101的外接设备,如U盘等外接存储处理器。
在以上各个实施例中,第一设备101将遥控数据转换成与第二设备105相匹配的控制数据,第二设备105在接收到控制数据之后,可以直接执行该控制数据,减少第二设备105一侧的资源消耗。
但是,在实际应用环境下,很多第一设备101不具备处理能力,例如,款式比较陈旧的显示设备等。基于此,第一设备101可以将原始的遥控数据转发至第二设备105。第二设备105在接收到遥控数据之后,可以按照预设解析规则将原始的遥控数据解析成与第二设备105相匹配的控制指令。具体来说,预设解析规则可以包括表1所示的预设转换规则,当然,遥控数据不仅可以包括表1所示的第一设备101解析后的控制指令,还可以包括原始的调制波等,本申请在此不做限制。
在实际应用中,不同品牌、不同型号的显示设备所对应的遥控数据的编码方式是不相同的。对于红外线发射的遥控方式,对于同一个控制指令,不同的显示设备可能使用的调制波的频率是不相同的。基于此,在本申请的一个实施例中,第二设备105按照预设转换规则将遥控数据转换成与第二设备相匹配的控制指令,包括:
S101:获取第一设备101的标识信息;
S103:按照预设解析规则将遥控数据解析成与第二设备105相匹配的控制指令,所述预设解析规则与所述第一设备的标识信息相关。
本申请实施例中,在转换所述遥控数据的过程中,首先,第二设备105可以获取到第一设备101的标识信息,具体地,在第一设备101与第二设备105之间建立数据 通道的过程中,第二设备105即可获取到第一设备101的标识信息,所述标识信息例如可以包括第一设备的品牌、型号等能够标识其身份的信息。表2展示了所述预设解析规则的一种示例性的表格。如表2所示,对于不同的第一设备101,其对应的遥控数据与控制指令之间的解析规则是不相同的。例如,对于相同的控制指令“用户界面往上移”,标识为0001的第一设备101对应的遥控数据为“向上键”,标识为0002的第二设备101对应的遥控数据为数字键2,本申请在此不做限制。
表2 预设解析规则示例表
Figure PCTCN2022079942-appb-000002
本申请实施例中,第二设备105在接收或者解析到所述遥控数据对应的可执行控制指令之后,可以根据所述控制指令对其当前显示的第一显示界面进行调整,生成第二显示界面,并将所述第二显示界面发送至第一设备101。第一设备101在接收到所述第二显示界面之后,可以利用第一显示器107显示所述第二界面,实现与第一设备101的同步显示。当然,在第一设备101的第一显示器107与第二设备105的第二显示器105的尺寸不相匹配的情况下,第一设备101在显示所述第二显示界面的过程中, 可以对所述第二显示界面进行尺寸适配,使得适配后的第二显示界面与第一显示器107的尺寸相匹配。从用户的观看视野来说,第一显示器107和第二显示器109所显示的画面是与显示器的尺寸相适配的。
在本申请的一个实施例中,第一显示器107不仅可以显示第二设备105同步过来的显示界面,还可以显示所述显示界面中的界面焦点。其中,所述界面焦点包括所述显示界面中聚焦的位置,显示界面中的界面焦点的作用类似于鼠标光标。所述界面焦点可以设置于显示界面任意位置处,包括控件、图片、文字等等。本申请实施例中,可以利用光标、边界框、高亮、蒙层等任何能够在所述显示界面中突出显示的样式显示所述界面焦点,例如,利用闪烁的光标在文字输入框中显示所述界面焦点,或者在所述界面焦点所在的图片上增加边界框,或者高亮所述界面焦点所在的控件。利用所述第一显示器107显示所述界面焦点,可以让用户肉眼识别到当前显示界面中焦点的位置,从而可以根据所述界面焦点进行操作。基于此,所述遥控数据可以包括对所述界面焦点的操作。所述操作例如包括上移、下移、左移、右移所述界面焦点,还可以包括对所述界面焦点所在的页面元素的操作,例如打开所述界面焦点所在的链接、图片等等。
在本申请的一个实施例中,所述界面焦点可以设置于所述显示界面的控件上。所述控件可以包括设置于所述显示界面中的可视化图像,如按钮、文件编辑框等。所述控件可以具有执行功能或通过事件引发代码运行并完成响应的功能。图5展示了具有多个控件的用户界面500,如图5所示,用户界面500中的界面焦点位于控件2上。本申请实施例中,所述界面焦点设置于所述显示界面的控件上,可以使得所述界面焦点聚焦于所述显示界面中相对比较重要的位置处,提高用户对显示界面的调整效率。
本申请的一个实施例中,在所述遥控数据包括针对第一显示器107所显示的界面焦点的操作数据的情况下,第二设备105在接收到所述遥控数据之后,可以根据所述遥控数据对所述第一显示界面进行调整,并生成第二显示界面。具体地,在一个实施例中,第二设备105根据所述遥控数据对所述第一显示界面进行调整,包括:
S201:根据所述遥控数据确定所述界面焦点的新位置;
S203:根据所述新位置调整所述第一显示界面,生成第二显示界面。
本申请实施例中,可以根据所述遥控数据确定所述界面焦点的新位置。所述新位置例如可以包括控件的标识、光标在显示界面中的坐标等等。在所述遥控数据包括针对所述界面焦点的操作数据的情况下,解析得到的控制指令均是针对所述界面焦点的操作指令。例如,对于遥控数据“向上键”,表示将所述界面焦点上移,在所述界面焦点设置于控件上的情况下,遥控数据“向上键”对应于将所述界面焦点转换至上一个控件的控制指令。在确定所述新位置之后,可以根据所述新位置调整所述第一显示界面,生成第二显示界面。具体调整的方式可以包括调整所述第一显示界面,使得所述界面焦点的新位置位于生成的第二显示界面的上半部。
下面结合图6所示的应用场景图、图7所示的方法流程图和图8、图9所示的用户界面图说明一个示例性的场景。如图6所示,第一设备101和第二设备105正在同步显示上述用户界面500。用户的视线可以停留在第一设备101的第一显示器107上,此时,用户想要将界面焦点从控件2上下移至控件4上,基于此,用户可以对遥控设 备103进行操作,如点击遥控设备103的向下键。从第二设备105的方法执行角度来说,如图7所示,步骤701中,第二设备105从第一设备101接收到遥控数据“向下键”。步骤702中,第二设备105可以根据所述预设解析规则将所述遥控数据“向下键”转换成“界面焦点下移”的控制指令。然后,步骤703中,第二设备105可以根据上述控制指令确定所述界面焦点的新位置。例如,对于图5所示的用户界面500,确定所述界面焦点的新位置为控件4。在步骤704中,可以判断所述界面焦点的新位置是否在第一显示界面的上部。在判断结果为是的情况下,步骤705中,可以将生成的第二显示界面保持为第一显示界面。例如,在图5所示的用户界面500中,所述界面焦点的新位置,即控件4的位置,依然在用户界面500的上半部,因此,生成的用户界面依然保持为用户界面500。在判断结果为否的情况下,步骤707中,第二设备105可以将所述第一显示界面向上移,使得所述界面焦点的新位置位于生成的第二显示界面的上半部分。如图8所示,在用户界面800中,所述界面焦点位于控件6上,在将所述界面焦点下移至控件7上之后,确定控件7位于用户界面800的下半部分。因此,需要将用户界面800往上移,展示更多的内容,生成图9所示的用户界面900。如图9所示,所述界面焦点的新位置,即控件7位于用户界面900的上半部。
通过上述各个实施例中调整所述第一显示界面的方式,可以自动化地实现页面展示的舒适感,使得所述页面焦点位于显示器中显眼的位置处。
在其他实施例中,在用户选择执行对应的控件的功能的情况下,还可以从所述第一显示界面跳转至第二显示界面。例如在用户界面800中,用户选择新的页签之后,响应于对应的控制指令,第二设备105可以从当前页面跳转至新的用户界面。另外,可选的,所述界面焦点可以设置于所述新的用户界面中的第一个控件上,也可以设置于所述新的用户界面中中间位置处的控件上,在此不做限制。
在本申请的一个实施例中,第二设备105的第二显示器109可以显示所述界面焦点,也可以不显示所述界面焦点。在第二显示器109不显示界面焦点的情况下,第二设备105在确定界面焦点在所述第二显示界面中的新位置的情况下,可以将所述界面焦点在所述第二显示界面的新位置处叠加之后发送至第一设备101,第一设备101在接收到叠加所述界面焦点的第二显示界面之后,可以直接展示所述第二显示界面。具体来说,本申请实施例中,将所述界面焦点叠加于所述第二显示界面的方式可以根据预先设置的叠加规则执行。在所述第二显示界面包括图像的情况下,叠加规则例如可以包括在对应的图像位置处高亮显示、添加闪烁的光标、在对应的控件图像上增加边界框等。当然,叠加规则可以包括任何能够在第二显示界面中突出显示界面焦点的方式,本申请在此不做限制。
当然,在本申请的另一个实施例中,还可以由第一设备101将所述界面焦点叠加于第二显示界面中。基于此,第二设备105在确定所述第二显示界面和所述界面焦点在所述第二显示界面中的新位置之后,可以将所述第二显示界面与所述新位置的信息发送至第一设备101。
第一设备101在接收到所述第二显示界面和所述界面焦点的新位置信息之后,可以在所述界面焦点叠加于所述第二显示界面的所述新位置处。由于第一显示器107和第二显示器109尺寸可能不匹配,因此,第一设备101在接收到所述界面焦点的新位 置信息之后,可以将新位置信息适配为在第一显示器107中的位置。
在一个示例中,对于图5所示的用户界面500,用户通过遥控器103发送遥控数据“向下键”,试图将所述界面焦点的位置从控件2下转至控件4。基于此,第二设备105在接收到所述遥控数据或者所述遥控数据对应的控制数据之后,将用户界面500调整至图10所示的用户界面1000。根据用户界面调整的一些规则,例如,控件4的位置保持在用户界面中的上半部分,因此,用户界面1000和用户界面500相同。第二设备105还确定界面焦点在用户界面1000中的位置,即控件4的位置,并将控件4的位置信息发送至第一设备101。第一设备101在接收到用户界面1000和所述界面焦点在用户界面1000的位置,即控件4的位置之后,可以将所述界面焦点叠加于用户界面1000中控件4的位置处,生成图10所示的效果。
本申请还从第一设备101的角度提出一种跨设备同步显示的控制方法,该控制方法可以应用于智能显示设备、智能电视、投影设备等任何具有显示功能的第一设备101。该第一设备101与具有第二显示器109的第二设备105相耦合,且第一设备的第一显示器107同步显示第二显示器109的第一显示内容。第一设备可以从对应的遥控设备103接收遥控数据。基于与第二设备105之间的耦合关系,第一设备101将所述遥控数据或者与所述遥控数据相匹配的控制数据发送至第二设备105。第一设备101还可以接收第二显示内容,所述第二显示内容包括根据所述遥控数据对所述第一显示内容调整后的显示内容。
可选的,在本申请的一个实施例中,所述第一显示器还用于显示界面焦点,且所述遥控数据包括对所述界面焦点的操作数据。
可选的,在本申请的一个实施例中,所述界面焦点设置于显示界面的控件上。
可选的,在本申请的一个实施例中,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的所述第二设备可执行的控制指令。
可选的,在本申请的一个实施例中,所述接收第二显示界面,包括:
接收第二显示界面以及所述界面焦点在所述第二显示界面中的位置;
将所述界面焦点叠加于所述第二显示界面的所述位置处。
下面先从第二设备105的角度说明所述跨设备同步显示的控制方法的一种实施例,该控制方法可以应用于任何数据处理功能的第二设备105,尤其包括智能手机、平板电脑等设备。具有第二显示器109的第二设备105可以接收遥控数据或者与所述遥控数据相匹配的控制数据,其中,所述遥控数据针对具有第一显示器107的第一设备101,所述第二设备105与所述第一设备101相耦合,所述第一显示器107同步显示所述第二显示器109的第一显示界面。第二设备105可以根据所述遥控数据或者所述控制数据对所述第一显示界面进行调整,并生成第二显示界面,最后发送所述第二显示界面至第一设备101。
可选的,在本申请的一个实施例中,所述遥控数据包括针对所述第一显示器所显示的界面焦点的操作数据。
可选的,在本申请的一个实施例中,所述根据所述遥控数据对所述第一显示界面 进行调整,包括:
根据所述遥控数据确定所述界面焦点的新位置;
根据所述新位置调整所述第一显示界面,生成第二显示界面。
可选的,在本申请的一个实施例中,所述发送所述第二显示界面,包括:
将所述界面焦点叠加于所述第二显示界面的所述新位置处;
发送叠加所述界面焦点之后的第二显示界面。
可选的,在本申请的一个实施例中,所述发送所述第二显示界面,包括:
发送所述第二显示界面和所述界面焦点的新位置。
可选的,在本申请的一个实施例中,所述界面焦点包括所述第一显示界面中的控件。
可选的,在本申请的一个实施例中,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的所述第二设备可执行的控制指令。
可选的,在本申请的一个实施例中,所述根据所述遥控数据对所述第一显示界面进行调整,包括:
按照预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令;
利用所述控制指令对所述第一显示界面进行调整。
可选的,在本申请的一个实施例中,所述按照预设转换规则将所述遥控数据转换成与所述第二设备相匹配的控制指令,包括:
获取所述第一设备的标识信息;
根据所述标识信息确定与所述第一设备相匹配的预设解析规则;
按照所述预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令。
本申请另一方面还提供第一设备101的一种实施例,图11展示了第一设备101的该实施例的模块结构示意图,如图11所示,第一设备101包括第一显示器107、第一网络模块1301、第一数据接收模块1303、第一数据发送模块1305,其中,
所述第一网络模块1301,用于与具有第二显示器的第二设备相耦合;
所述第一显示器107,用于同步显示所述第二显示器的第一显示界面;
所述第一数据接收模块1303,用于接收遥控数据,以及,用于接收第二显示界面,所述第二显示界面包括根据所述遥控数据或者与所述遥控数据相匹配的控制数据对所述第一显示界面调整后的显示界面;
所述第一数据发送模块1305,用于将所述遥控数据或者所述控制数据发送至所述第二设备。
可选的,在本申请的一个实施例中,所述第一显示器107还用于显示界面焦点,且所述遥控数据包括对所述界面焦点的操作数据。
可选的,在本申请的一个实施例中,所述界面焦点设置于显示界面的控件上。
可选的,在本申请的一个实施例中,第一设备101还包括第一数据处理模块1307,
所述第一数据处理模块1307,用于按照预设转换规则将所述遥控数据转换成与所述第二设备相匹配的控制指令;
对应地,所述第一数据发送模块1305,用于发送所述控制指令至所述第二设备。
可选的,在本申请的一个实施例中,所述第一数据接收模块1303,具体用于:
接收第二显示界面以及所述界面焦点在所述第二显示界面中的位置;
将所述界面焦点叠加于所述第二显示界面的所述位置处。
本申请另一方面还提供第二设备105的一种实施例,图11展示了第二设备105的该实施例的模块结构示意图,如图11所示,第二设备105包括第二显示器109、第二网络模块1301’、第二数据接收模块1303’、第二数据处理模块1307’、第二数据发送模块1305’,其中,
所述第二网络模块1301’,用于与具有第一显示器107的第一设备101相耦合,所述第一显示器同步显示所述第二显示器的第一显示界面;
所述第二数据接收模块1303’,用于接收针对所述第一设备101的遥控数据或者与所述遥控数据相匹配的控制数据;
所述第二数据处理模块1307’,用于根据所述遥控数据或者所述控制数据对所述第一显示界面进行调整,并生成第二显示界面;
所述第二数据发送模块1305’,用于发送所述第二显示界面。
可选的,在本申请的一个实施例中,所述遥控数据包括针对所述第一显示器所显示的界面焦点的操作数据。
可选的,在本申请的一个实施例中,所述第二数据处理模块1307’,具体用于:
根据所述遥控数据或者所述控制数据确定所述界面焦点的新位置;
根据所述新位置调整所述第一显示界面,生成第二显示界面。
可选的,在本申请的一个实施例中,所述第二数据发送模块1305’,具体用于:
将所述界面焦点叠加于所述第二显示界面的所述新位置处;
发送叠加所述界面焦点之后的第二显示界面。
可选的,在本申请的一个实施例中,所述第二数据发送模块1305’,具体用于:
发送所述第二显示界面和所述界面焦点的新位置。
可选的,在本申请的一个实施例中,所述界面焦点包括所述第一显示界面中的控件。
可选的,在本申请的一个实施例中,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的所述终端设备可执行的控制指令。
可选的,在本申请的一个实施例中,所述第二数据处理模块1307’,具体用于:
按照预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令;
利用所述控制指令对所述第一显示界面进行调整。
可选的,在本申请的一个实施例中,所述第二数据处理模块1307’,具体用于:
获取所述第一设备的标识信息;
根据所述标识信息确定与所述第一设备相匹配的预设解析规则;
按照所述预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令。
在本申请的实施方式中,可以将发起跨设备传输数据并发送数据的一方称作源(source)端、接收数据的一方称作接收(sink)端。需要说明的是,在一对关系中作 为源端的设备,在另一对关系中也可能为接收端,也就是说,对于一个终端设备来说,其既可能是作为另一个终端设备的源端,也可能是另一个终端设备的接收端。
本申请涉及的终端设备(包括上文所述的源端的设备和接收端的设备)可以是指具有无线连接功能的设备,无线连接的功能是指可以通过wifi、蓝牙等无线连接方式与其他终端设备进行连接,本申请的终端设备也可以具有有线连接进行通信的功能。本申请的终端设备可以是触屏的、也可以是非触屏的,触屏的可以通过手指、触控笔等在显示屏幕上点击、滑动等方式对终端设备进行控制,非触屏的设备可以连接鼠标、键盘、触控面板等输入设备,通过输入设备对终端设备进行控制,没有屏幕的设备比如说可以是没有屏幕的蓝牙音箱等。
举例来说,本申请的终端设备可以是智能手机、上网本、平板电脑、笔记本电脑、可穿戴电子设备(如智能手环、智能手表等)、TV、虚拟现实设备、音响、电子墨水,等等。
图12示出根据本申请一实施例的终端设备的结构示意图。以终端设备是手机为例,图12示出了手机200的结构示意图。
手机200可以包括处理器210,外部存储器接口220,内部存储器221,USB接口230,充电管理模块240,电源管理模块241,电池242,天线1,天线2,移动通信模块251,无线通信模块252,音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,传感器模块280,按键290,马达291,指示器292,摄像头293,显示屏294,以及SIM卡接口295等。其中传感器模块280可以包括陀螺仪传感器280A,加速度传感器280B,接近光传感器280G、指纹传感器280H,触摸传感器280K(当然,手机200还可以包括其它传感器,比如温度传感器,压力传感器、距离传感器、磁传感器、环境光传感器、气压传感器、骨传导传感器等,图中未示出)。
可以理解的是,本申请实施例示意的结构并不构成对手机200的具体限定。在本申请另一些实施例中,手机200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器210可以包括一个或多个处理单元,例如:处理器210可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(Neural-network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是手机200的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器210中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器210中的存储器为高速缓冲存储器。该存储器可以保存处理器210刚用过或循环使用的指令或数据。如果处理器210需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器210的等待时间,因而提高了系统的效率。
处理器210可以运行本申请实施例提供的跨设备同步显示的控制法。处理器210 可以包括不同的器件,比如集成CPU和GPU时,CPU和GPU可以配合执行本申请实施例提供的跨设备同步显示的控制方法,比如跨设备同步显示的控制方法中部分算法由CPU执行,另一部分算法由GPU执行,以得到较快的处理效率。
显示屏294用于显示图像,视频等。显示屏294包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机200可以包括1个或N个显示屏294,N为大于1的正整数。显示屏294可用于显示由用户输入的信息或提供给用户的信息以及各种图形用户界面(graphical user interface,GUI)。例如,显示器294可以显示照片、视频、网页、或者文件等。再例如,显示器294可以显示图形用户界面。其中,图形用户界面上包括状态栏、可隐藏的导航栏、时间和天气小组件(widget)、以及应用的图标,例如浏览器图标等。状态栏中包括运营商名称(例如中国移动)、移动网络(例如4G)、时间和剩余电量。导航栏中包括后退(back)键图标、主屏幕(home)键图标和前进键图标。此外,可以理解的是,在一些实施例中,状态栏中还可以包括蓝牙图标、Wi-Fi图标、外接设备图标等。还可以理解的是,在另一些实施例中,图形用户界面中还可以包括Dock栏,Dock栏中可以包括常用的应用图标等。当处理器210检测到用户的手指(或触控笔等)针对某一应用图标的触摸事件后,响应于该触摸事件,打开与该应用图标对应的应用的用户界面,并在显示器294上显示该应用的用户界面。
在本申请实施例中,显示屏294可以是一个一体的柔性显示屏,也可以采用两个刚性屏以及位于两个刚性屏之间的一个柔性屏组成的拼接显示屏。
当处理器210运行本申请实施例提供的跨设备同步显示的控制方法后,终端设备可以通过天线1、天线2或者USB接口与其他的终端设备建立连接,并根据本申请实施例提供的跨设备同步显示的控制方法传输数据以及控制显示屏294显示相应的图形用户界面。
摄像头293(前置摄像头或者后置摄像头,或者一个摄像头既可作为前置摄像头,也可作为后置摄像头)用于捕获静态图像或视频。通常,摄像头293可以包括感光元件比如镜头组和图像传感器,其中,镜头组包括多个透镜(凸透镜或凹透镜),用于采集待拍摄物体反射的光信号,并将采集的光信号传递给图像传感器。图像传感器根据所述光信号生成待拍摄物体的原始图像。
内部存储器221可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器210通过运行存储在内部存储器221的指令,从而执行手机200的各种功能应用以及数据处理。内部存储器221可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,应用程序(比如相机应用,微信应用等)的代码等。存储数据区可存储手机200使用过程中所创建的数据(比如相机应用采集的图像、视频等)等。
内部存储器221还可以存储本申请实施例提供的跨设备同步显示的控制方法对应的一个或多个计算机程序1310。该一个或多个计算机程序1304被存储在上述存储器 221中并被配置为被该一个或多个处理器210执行,该一个或多个计算机程序1310包括指令,上述指令可以用于执行上述任一实施例所述的跨设备同步显示的控制方法。
此外,内部存储器221可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
当然,本申请实施例提供的跨设备同步显示的控制方法的代码还可以存储在外部存储器中。这种情况下,处理器210可以通过外部存储器接口220运行存储在外部存储器中的跨设备同步显示的控制方法的代码。
下面介绍传感器模块280的功能。
陀螺仪传感器280A,可以用于确定手机200的运动姿态。在一些实施例中,可以通过陀螺仪传感器280A确定手机200围绕三个轴(即,x,y和z轴)的角速度。即陀螺仪传感器280A可以用于检测手机200当前的运动状态,比如抖动还是静止。
当本申请实施例中的显示屏为可折叠屏时,陀螺仪传感器280A可用于检测作用于显示屏294上的折叠或者展开操作。陀螺仪传感器280A可以将检测到的折叠操作或者展开操作作为事件上报给处理器210,以确定显示屏294的折叠状态或展开状态。
加速度传感器280B可检测手机200在各个方向上(一般为三轴)加速度的大小。即陀螺仪传感器280A可以用于检测手机200当前的运动状态,比如抖动还是静止。当本申请实施例中的显示屏为可折叠屏时,加速度传感器280B可用于检测作用于显示屏294上的折叠或者展开操作。加速度传感器280B可以将检测到的折叠操作或者展开操作作为事件上报给处理器210,以确定显示屏294的折叠状态或展开状态。
接近光传感器280G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。手机通过发光二极管向外发射红外光。手机使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定手机附近有物体。当检测到不充分的反射光时,手机可以确定手机附近没有物体。当本申请实施例中的显示屏为可折叠屏时,接近光传感器280G可以设置在可折叠的显示屏294的第一屏上,接近光传感器280G可根据红外信号的光程差来检测第一屏与第二屏的折叠角度或者展开角度的大小。
陀螺仪传感器280A(或加速度传感器280B)可以将检测到的运动状态信息(比如角速度)发送给处理器210。处理器210基于运动状态信息确定当前是手持状态还是脚架状态(比如,角速度不为0时,说明手机200处于手持状态)。
指纹传感器280H用于采集指纹。手机200可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
触摸传感器280K,也称“触控面板”。触摸传感器280K可以设置于显示屏294,由触摸传感器280K与显示屏294组成触摸屏,也称“触控屏”。触摸传感器280K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏294提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器280K也可以设置于手机200的表面,与显示屏294所处的位置不同。
示例性的,手机200的显示屏294显示主界面,主界面中包括多个应用(比如相 机应用、微信应用等)的图标。用户通过触摸传感器280K点击主界面中相机应用的图标,触发处理器210启动相机应用,打开摄像头293。显示屏294显示相机应用的界面,例如取景界面。
手机200的无线通信功能可以通过天线1,天线2,移动通信模块251,无线通信模块252,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块251可以提供应用在手机200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块251可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块251可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块251还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块251的至少部分功能模块可以被设置于处理器210中。在一些实施例中,移动通信模块251的至少部分功能模块可以与处理器210的至少部分模块被设置在同一个器件中。在本申请实施例中,移动通信模块251还可以用于与其它终端设备进行信息交互。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器270A,受话器270B等)输出声音信号,或通过显示屏294显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器210,与移动通信模块251或其他功能模块设置在同一个器件中。
无线通信模块252可以提供应用在手机200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块252可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块252经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器210。无线通信模块252还可以从处理器210接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。本申请实施例中,无线通信模块252,用于在处理器210的控制下与其他终端设备之间传输数据,比如,处理器210运行本申请实施例提供的跨设备同步显示的控制方法时,处理器可以控制无线通信模块252向其他终端设备发送判断请求,还可以接收其他终端设备基于上述判断请求做出的判断结果,判断结果表示要传输的数据能否传输给其他终端设备,然后控制显示屏294显示判断结果,为用户提供直观的视觉反馈,避免错误操作和反复操作,提高操作效率
另外,手机200可以通过音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,以及应用处理器等实现音频功能。例如音乐播放,录音等。手机200可以接收按键290输入,产生与手机200的用户设置以及功能控制有关的键信号输入。手机200可以利用马达291产生振动提示(比如来电振动提示)。手机200中的指示器292可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。手机200中的SIM卡接口295用于连接SIM卡。SIM卡可以通过插入SIM卡接口295,或从SIM卡接口295拔出,实现和手机200的接触和分离。
应理解,在实际应用中,手机200可以包括比图12所示的更多或更少的部件,本申请实施例不作限定。图示手机200仅是一个范例,并且手机200可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
终端设备的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明终端设备的软件结构。
图13是本申请实施例的终端设备的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图13所示,应用程序包可以包括电话、相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图13所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。窗口管理器还可以用于检测是否存在本申请实施例的扩设备传输操作,例如拖拽操作。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供终端设备的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的 消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,终端设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
本申请的实施例提供了一种跨设备同步显示的控制装置,包括:处理器以及用于存储处理器可执行指令的存储器;其中,所述处理器被配置为执行所述指令时实现上述方法。
本申请的实施例提供了一种非易失性计算机可读存储介质,其上存储有计算机程序指令,所述计算机程序指令被处理器执行时实现上述方法。
本申请的实施例提供了一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备的处理器中运行时,所述电子设备中的处理器执行上述方法。
计算机可读存储介质可以是可以保持和存储由指令执行设备使用的指令的有形设备。计算机可读存储介质例如可以是――但不限于――电存储设备、磁存储设备、光存储设备、电磁存储设备、半导体存储设备或者上述的任意合适的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:便携式计算机盘、硬盘、随机存取存储器(Random Access Memory,RAM)、只读存储器(Read Only Memory,ROM)、可擦式可编程只读存储器(Electrically Programmable Read-Only-Memory,EPROM或闪存)、静态随机存取存储器(Static Random-Access Memory,SRAM)、便携式压缩盘只读存储器(Compact Disc Read-Only Memory,CD-ROM)、数字多功能盘(Digital  Video Disc,DVD)、记忆棒、软盘、机械编码设备、例如其上存储有指令的打孔卡或凹槽内凸起结构、以及上述的任意合适的组合。
这里所描述的计算机可读程序指令或代码可以从计算机可读存储介质下载到各个计算/处理设备,或者通过网络、例如因特网、局域网、广域网和/或无线网下载到外部计算机或外部存储设备。网络可以包括铜传输电缆、光纤传输、无线传输、路由器、防火墙、交换机、网关计算机和/或边缘服务器。每个计算/处理设备中的网络适配卡或者网络接口从网络接收计算机可读程序指令,并转发该计算机可读程序指令,以供存储在各个计算/处理设备中的计算机可读存储介质中。
用于执行本申请操作的计算机程序指令可以是汇编指令、指令集架构(Instruction Set Architecture,ISA)指令、机器指令、机器相关指令、微代码、固件指令、状态设置数据、或者以一种或多种编程语言的任意组合编写的源代码或目标代码,所述编程语言包括面向对象的编程语言—诸如Smalltalk、C++等,以及常规的过程式编程语言—诸如“C”语言或类似的编程语言。计算机可读程序指令可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络—包括局域网(Local Area Network,LAN)或广域网(Wide Area Network,WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。在一些实施例中,通过利用计算机可读程序指令的状态信息来个性化定制电子电路,例如可编程逻辑电路、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或可编程逻辑阵列(Programmable Logic Array,PLA),该电子电路可以执行计算机可读程序指令,从而实现本申请的各个方面。
这里参照根据本申请实施例的方法、装置(系统)和计算机程序产品的流程图和/或框图描述了本申请的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其它可编程数据处理装置的处理器,从而生产出一种机器,使得这些指令在通过计算机或其它可编程数据处理装置的处理器执行时,产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中,这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作,从而,存储有指令的计算机可读介质则包括一个制造品,其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
也可以把计算机可读程序指令加载到计算机、其它可编程数据处理装置、或其它设备上,使得在计算机、其它可编程数据处理装置或其它设备上执行一系列操作步骤,以产生计算机实现的过程,从而使得在计算机、其它可编程数据处理装置、或其它设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本申请的多个实施例的装置、系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,所述模块、程序段或指令的一部 分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。
也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行相应的功能或动作的硬件(例如电路或ASIC(Application Specific Integrated Circuit,专用集成电路))来实现,或者可以用硬件和软件的组合,如固件等来实现。
尽管在此结合各实施例对本发明进行了描述,然而,在实施所要求保护的本发明过程中,本领域技术人员通过查看所述附图、公开内容、以及所附权利要求书,可理解并实现所述公开实施例的其它变化。在权利要求中,“包括”(comprising)一词不排除其他组成部分或步骤,“一”或“一个”不排除多个的情况。单个处理器或其它单元可以实现权利要求中列举的若干项功能。相互不同的从属权利要求中记载了某些措施,但这并不表示这些措施不能组合起来产生良好的效果。
以上已经描述了本申请的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实施例的原理、实际应用或对市场中的技术的改进,或者使本技术领域的其它普通技术人员能理解本文披露的各实施例。

Claims (19)

  1. 一种跨设备同步显示的控制方法,其特征在于,包括:
    具有第一显示器的第一设备接收遥控数据,所述第一设备与具有第二显示器的第二设备相耦合,所述第一显示器同步显示第一显示内容,其中,所述第一显示内容是所述第二显示器显示的;
    将所述遥控数据或者与所述遥控数据相匹配的控制数据发送至所述第二设备;
    接收第二显示内容,所述第二显示内容是所述第二设备根据所述遥控数据或者所述控制数据对所述第一显示内容调整所得到的;
    在所述第一显示器显示所述第二显示内容。
  2. 根据权利要求1所述的方法,其特征在于,所述第一显示器还用于显示界面焦点,且所述遥控数据包括对所述界面焦点的操作数据。
  3. 根据权利要求2所述的方法,其特征在于,所述界面焦点设置于显示界面的控件上。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的所述第二设备可执行的控制指令。
  5. 根据权利要求2或3所述的方法,其特征在于,所述接收第二显示内容,包括:
    接收第二显示内容以及所述界面焦点在所述第二显示内容中的位置;
    将所述界面焦点叠加于所述第二显示内容的所述位置处。
  6. 一种跨设备同步显示的控制方法,其特征在于,包括:
    第二设备的第二显示器显示第一显示内容;
    所述第二设备接收遥控数据或者与所述遥控数据相匹配的控制数据,其中,所述遥控数据针对具有第一显示器的第一设备,所述第二设备与所述第一设备相耦合,所述第一显示器同步显示所述第一显示内容;
    根据所述遥控数据或者所述控制数据对所述第一显示内容进行调整,生成第二显示内容;
    发送所述第二显示内容。
  7. 根据权利要求6所述的方法,其特征在于,所述遥控数据包括针对所述第一显示器所显示的界面焦点的操作数据。
  8. 根据权利要求7所述的方法,其特征在于,所述根据所述遥控数据对所述第一显示内容进行调整,包括:
    根据所述遥控数据确定所述界面焦点的新位置;
    根据所述新位置调整所述第一显示内容,生成第二显示内容。
  9. 根据权利要求8所述的方法,其特征在于,所述发送所述第二显示内容,包括:
    将所述界面焦点叠加于所述第二显示内容的所述新位置处;
    发送叠加所述界面焦点之后的第二显示内容。
  10. 根据权利要求8所述的方法,其特征在于,所述发送所述第二显示内容,包括:
    发送所述第二显示内容和所述界面焦点的新位置。
  11. 根据权利要求7-10任一项所述的方法,其特征在于,所述界面焦点包括所述第一显示内容中的控件。
  12. 根据权利要求6-11任一项所述的方法,其特征在于,所述控制数据包括将所述遥控数据按照预设转换规则转换得到的所述第二设备可执行的控制指令。
  13. 根据权利要求6-11任一项所述的方法,其特征在于,所述根据所述遥控数据对所述第一显示内容进行调整,包括:
    按照预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令;
    利用所述控制指令对所述第一显示内容进行调整。
  14. 根据权利要求13所述的方法,其特征在于,所述按照预设转换规则将所述遥控数据转换成与所述第二设备相匹配的控制指令,包括:
    获取所述第一设备的标识信息;
    按照预设解析规则将所述遥控数据解析成与所述第二设备相匹配的控制指令,所述预设解析规则与所述第一设备的标识信息相关。
  15. 一种终端设备,其特征在于,包括:
    处理器;
    用于存储处理器可执行指令的存储器;
    其中,所述处理器被配置为执行所述指令时实现权利要求1-5任意一项所述的方法。
  16. 一种终端设备,其特征在于,包括:
    处理器;
    用于存储处理器可执行指令的存储器;
    其中,所述处理器被配置为执行所述指令时实现权利要求6-14任意一项所述的方法。
  17. 一种跨设备同步显示的控制系统,其特征在于,包括权利要求15所述的终端 设备和权利要求16所述的终端设备。
  18. 一种计算机可读存储介质,其上存储有计算机程序指令,其特征在于,所述计算机程序指令被处理器执行时实现权利要求1-5任意一项所述的方法,或者实现权利要求6-14任意一项所述的方法。
  19. 一种计算机程序产品,其特征在于,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备的处理器中运行时,所述电子设备中的处理器执行上述权利要求1-5任意一项所述的方法,或者实现权利要求6-14任意一项所述的方法。
PCT/CN2022/079942 2021-03-17 2022-03-09 一种跨设备同步显示的控制方法及系统 WO2022194005A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110287070.X 2021-03-17
CN202110287070.XA CN115113832A (zh) 2021-03-17 2021-03-17 一种跨设备同步显示的控制方法及系统

Publications (1)

Publication Number Publication Date
WO2022194005A1 true WO2022194005A1 (zh) 2022-09-22

Family

ID=83321569

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/079942 WO2022194005A1 (zh) 2021-03-17 2022-03-09 一种跨设备同步显示的控制方法及系统

Country Status (2)

Country Link
CN (1) CN115113832A (zh)
WO (1) WO2022194005A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116261006A (zh) * 2023-02-27 2023-06-13 泓凯电子科技(东莞)有限公司 一种无线触摸蓝牙反控系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104333789A (zh) * 2014-10-30 2015-02-04 向火平 同屏互动系统及其控制方法
US20150147961A1 (en) * 2013-07-19 2015-05-28 Google Inc. Content Retrieval via Remote Control
CN106502604A (zh) * 2016-09-28 2017-03-15 北京小米移动软件有限公司 投屏切换方法及装置
CN108762702A (zh) * 2012-07-06 2018-11-06 Lg 电子株式会社 移动终端、图像显示装置及使用其的用户接口提供方法
CN109076257A (zh) * 2016-03-16 2018-12-21 Lg电子株式会社 显示装置及其操作方法
CN111880870A (zh) * 2020-06-19 2020-11-03 维沃移动通信有限公司 控制电子设备的方法、装置和电子设备
CN114071207A (zh) * 2020-07-30 2022-02-18 华为技术有限公司 控制大屏设备显示的方法、装置、大屏设备和存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108762702A (zh) * 2012-07-06 2018-11-06 Lg 电子株式会社 移动终端、图像显示装置及使用其的用户接口提供方法
US20150147961A1 (en) * 2013-07-19 2015-05-28 Google Inc. Content Retrieval via Remote Control
CN104333789A (zh) * 2014-10-30 2015-02-04 向火平 同屏互动系统及其控制方法
CN109076257A (zh) * 2016-03-16 2018-12-21 Lg电子株式会社 显示装置及其操作方法
CN106502604A (zh) * 2016-09-28 2017-03-15 北京小米移动软件有限公司 投屏切换方法及装置
CN111880870A (zh) * 2020-06-19 2020-11-03 维沃移动通信有限公司 控制电子设备的方法、装置和电子设备
CN114071207A (zh) * 2020-07-30 2022-02-18 华为技术有限公司 控制大屏设备显示的方法、装置、大屏设备和存储介质

Also Published As

Publication number Publication date
CN115113832A (zh) 2022-09-27

Similar Documents

Publication Publication Date Title
US20220342850A1 (en) Data transmission method and related device
US20230099824A1 (en) Interface layout method, apparatus, and system
US11797249B2 (en) Method and apparatus for providing lock-screen
WO2022100237A1 (zh) 投屏显示方法及相关产品
CN111666055B (zh) 数据的传输方法及装置
WO2022100239A1 (zh) 设备协作方法、装置、系统、电子设备和存储介质
CN112558825A (zh) 一种信息处理方法及电子设备
US11914850B2 (en) User profile picture generation method and electronic device
WO2022105759A1 (zh) 视频处理方法、装置及存储介质
WO2022033342A1 (zh) 数据传输方法和设备
WO2022028494A1 (zh) 一种多设备数据协作的方法及电子设备
WO2022134691A1 (zh) 一种终端设备中啸叫处理方法及装置、终端
WO2022194005A1 (zh) 一种跨设备同步显示的控制方法及系统
WO2022105716A1 (zh) 基于分布式控制的相机控制方法及终端设备
WO2022105793A1 (zh) 图像处理方法及其设备
US20240125603A1 (en) Road Recognition Method and Apparatus
WO2022121751A1 (zh) 相机控制方法、装置和存储介质
WO2022166614A1 (zh) 针对控件操作的执行方法、装置、存储介质和控件
WO2022105755A1 (zh) 字库同步方法、装置和存储介质
WO2022089276A1 (zh) 一种收藏处理的方法及相关装置
WO2022068628A1 (zh) 一种界面的分布式显示方法、电子设备及通信系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22770360

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22770360

Country of ref document: EP

Kind code of ref document: A1