WO2022179371A1 - Splicing display method, electronic device and system

Splicing display method, electronic device and system

Info

Publication number
WO2022179371A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
mobile phone
interface
display
application
Prior art date
Application number
PCT/CN2022/073727
Other languages
English (en)
Chinese (zh)
Inventor
王依伦
周星辰
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022179371A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 — controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1446 — display composed of modules, e.g. video walls
    • G06F 3/1454 — involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • The embodiments of the present application relate to the field of terminals, and more particularly, to a splicing display method, electronic device and system.
  • The present application provides a splicing display method, electronic device and system, which can display multiple applications of one device across multiple devices without requiring the user to switch back and forth between applications, thereby helping to improve the user experience.
  • In a first aspect, a system includes a first electronic device and a second electronic device. The first electronic device includes a first application and a second application and communicates with the second electronic device. The first electronic device is configured to display a first interface, which is the display interface of the first application; while displaying the first interface, it detects a first input of the user, which is an operation of starting the second application; in response to the first input, the first electronic device sends to the second electronic device image information corresponding to the display interface of the second application; and in response to receiving that image information, the second electronic device displays the display interface of the second application.
  • In this way, when the first electronic device detects the user's operation of opening the second application while displaying the display interface of the first application, it can send the display interface of the second application to the second electronic device for display. Multiple applications of one device can thus be displayed across multiple devices without requiring the user to switch applications on a single device, which helps to improve the user experience.
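The interaction above can be sketched as a minimal message exchange. All class and method names below are illustrative assumptions, and the "image information" is modeled as a plain string rather than encoded frames:

```python
class SinkDevice:
    """Second electronic device: displays whatever image info it receives."""
    def __init__(self):
        self.shown = None

    def receive_image_info(self, image_info):
        self.shown = image_info  # display the second application's interface


class SourceDevice:
    """First electronic device: owns both applications."""
    def __init__(self, sink):
        self.sink = sink
        self.shown = None

    def display(self, app):
        self.shown = f"interface:{app}"

    def on_user_input(self, start_app):
        # First input: an operation that starts the second application.
        # Instead of replacing the local interface, route the newly
        # started application's rendered frame to the sink device.
        self.sink.receive_image_info(f"interface:{start_app}")


sink = SinkDevice()
source = SourceDevice(sink)
source.display("first_app")          # first interface shown locally
source.on_user_input("second_app")   # dual application mode
print(source.shown, sink.shown)
```

Note that the first interface never leaves the first device; only the second application's frames cross the connection.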
  • The present application also provides a system including a first electronic device and a second electronic device, where the first electronic device includes a first application and a second application and communicates with the second electronic device through a short-range wireless connection. The first electronic device is configured to display a first interface, which is the display interface of the first application, and to acquire a message of the second application while the first interface is displayed. The first electronic device receives indication information sent by the second electronic device, where the indication information is used to instruct the first electronic device and the second electronic device to perform splicing display. In response to receiving the indication information, the first electronic device displays the first interface and sends to the second electronic device image information corresponding to the display interface of the second application; and the second electronic device displays the display interface of the second application in response to receiving that image information.
  • The message of the second application may be a message from a server of the second application, or a pop-up message within the second application.
  • A time period between when the first electronic device acquires the message of the second application and when the first electronic device receives the indication information is less than or equal to a preset time period.
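This timing condition amounts to a simple elapsed-time check. The function name and the 5-second threshold below are illustrative assumptions, since the patent leaves the preset period unspecified:

```python
PRESET_PERIOD_S = 5.0  # assumed threshold; the patent does not fix a value

def within_preset_period(message_time_s, indication_time_s,
                         preset_s=PRESET_PERIOD_S):
    """True if the indication information arrived no later than the preset
    period after the second application's message was acquired."""
    return 0 <= indication_time_s - message_time_s <= preset_s

print(within_preset_period(10.0, 13.2))  # indication 3.2 s after the message
print(within_preset_period(10.0, 16.0))  # 6 s after the message: too late
```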
  • In some implementations, while displaying the first interface, the first electronic device receives a notification message sent by a server corresponding to the second application; in response to receiving the notification message, it displays a message prompt box that prompts the user about the notification message. The first input is the user's input on the message prompt box.
  • That is, the first electronic device may display a message prompt box after receiving the notification message from the server, and when it detects the user's input on the prompt box, the first electronic device and the second electronic device display in dual application mode.
  • The first input is an operation of the user clicking the message prompt box, or an operation of dragging the message prompt box in a first direction.
  • When the first electronic device is located in a first orientation and the second electronic device is located in a second orientation, the first electronic device may send image information corresponding to the display interface of the second application to the second electronic device.
  • The first interface includes a first interface element associated with a pop-up window, and the pop-up window is used to prompt the user to open the second application.
  • The first electronic device is further configured to: detect a second input of the user on the first interface element of the first interface; and display the pop-up window in response to the second input. The first input is the input on the pop-up window.
  • That is, the first electronic device and the second electronic device may display in dual application mode when the user's input on the pop-up window in the first application is detected.
  • In some implementations, before detecting the first input, the first electronic device receives first indication information sent by the second electronic device, where the first indication information is used to instruct the first electronic device and the second electronic device to perform splicing display; in response to receiving the first indication information, the first electronic device sends image information corresponding to the desktop of the first electronic device to the second electronic device; and the second electronic device displays the desktop of the first electronic device in response to receiving that image information.
  • In some implementations, the first electronic device further includes a third application.
  • In response to detecting a third input of the user, the second electronic device sends second indication information to the first electronic device, where the second indication information indicates that the second electronic device has detected the third input, and the third input is an input for the third application. In response to receiving the second indication information, the first electronic device sends image information corresponding to the display interface of the third application to the second electronic device, and the second electronic device displays the display interface of the third application in response to receiving that image information. The first electronic device displays the first interface in response to the user's operation of opening the first application.
  • In this way, the first electronic device can use the start times of the first application and the third application to determine which application to replace. For example, if the first application was started earlier, the first electronic device may display the display interface of the second application and send image information corresponding to the display interface of the third application to the second electronic device; if the third application was started earlier, the first electronic device may display the display interface of the first application and send image information corresponding to the display interface of the second application to the second electronic device.
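The earlier-started-application policy described above can be sketched as follows; the application names and the timestamp bookkeeping are illustrative assumptions:

```python
def choose_replacement(app_start_times, candidates=("first_app", "third_app")):
    """Return the application whose interface should be replaced:
    the earlier-started of the two currently displayed applications."""
    return min(candidates, key=lambda app: app_start_times[app])

# first_app was started at t=100 s, third_app at t=250 s,
# so first_app is the one whose interface gets replaced.
starts = {"first_app": 100.0, "third_app": 250.0}
print(choose_replacement(starts))  # first_app started earlier → replaced
```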
  • In some implementations, the first electronic device further includes a third application.
  • In response to detecting a third input of the user, the second electronic device sends second indication information to the first electronic device, where the second indication information indicates that the second electronic device has detected the third input, and the third input is an input for the third application. In response to receiving the second indication information, the first electronic device sends image information corresponding to the display interface of the third application to the second electronic device, and the second electronic device displays the display interface of the third application in response to receiving that image information.
  • Before sending the image information corresponding to the display interface of the second application to the second electronic device, the first electronic device determines that the user's input on the first interface has been detected within a preset time period and that no third indication information has been received from the second electronic device, where the third indication information indicates that the second electronic device has detected the user's input on the display interface of the third application.
  • In this way, the first electronic device can determine which application the user is focusing on from the user's operations on the display interfaces of the first application and the third application. If the first electronic device determines that the user is focusing on the first application, it may send image information corresponding to the display interface of the second application to the second electronic device, so that the second electronic device replaces the display interface of the third application with that of the second application. This avoids replacing the application the user is focusing on, which helps to improve the user's experience.
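One way to realize the focus test described above is to compare the timestamps of the latest user input on each interface against a recency window. The names and the 3-second window below are illustrative assumptions:

```python
def focused_app(now_s, last_input, window_s=3.0):
    """Return the app with user input inside the recent window, or None.
    last_input maps app name -> timestamp of its latest user input."""
    recent = {app: t for app, t in last_input.items()
              if now_s - t <= window_s}
    if not recent:
        return None
    return max(recent, key=recent.get)  # latest input wins

def app_to_replace(now_s, last_input):
    """Replace the interface the user is NOT focusing on."""
    focus = focused_app(now_s, last_input)
    if focus == "first_app":
        return "third_app"   # sink replaces third app with second app
    if focus == "third_app":
        return "first_app"
    return None              # no recent input: fall back to another policy

# first_app received input 0.5 s ago, third_app 8 s ago,
# so the third application's interface is the one to replace.
print(app_to_replace(10.0, {"first_app": 9.5, "third_app": 2.0}))
```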
  • In some implementations, the first electronic device may also turn on its camera to collect the user's iris information in order to determine which electronic device the user is focusing on. For example, if the first electronic device determines that the user is focusing on the second electronic device, it may display the display interface of the second application and send image information corresponding to the display interface of the third application to the second electronic device; if the user is focusing on the first electronic device, the first electronic device may display the display interface of the first application and send image information corresponding to the display interface of the second application to the second electronic device.
  • In some implementations, the first electronic device is further configured to display a second interface before displaying the first interface. In response to a fourth input of the user, the second electronic device sends fourth indication information to the first electronic device, where the fourth indication information is used to instruct the first electronic device and the second electronic device to perform splicing display. In response to receiving the fourth indication information, the first electronic device expands the image information corresponding to the second interface and divides the expanded image information into a first part and a second part, where the first part of the image information is the image information displayed as the first interface and the second part is sent to the second electronic device for display.
  • When the first electronic device and the second electronic device are displaying in full-screen mode and the first input of the user is detected, the two devices switch to dual application mode. This eliminates the need for the user to switch back and forth between applications: the first application and the second application can be viewed on the first electronic device and the second electronic device respectively, which helps to improve the user's experience.
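The expand-and-divide step described above can be sketched on a toy frame. Nearest-neighbour scaling is an illustrative assumption, as the patent does not specify how the image information is expanded:

```python
def expand_and_split(frame, target_width):
    """Scale a frame (list of pixel rows) horizontally to target_width
    by nearest-neighbour sampling, then split it into left and right
    halves for the first and second electronic devices."""
    src_w = len(frame[0])
    expanded = [[row[col * src_w // target_width] for col in range(target_width)]
                for row in frame]
    half = target_width // 2
    left = [row[:half] for row in expanded]    # first part: stays local
    right = [row[half:] for row in expanded]   # second part: sent to sink
    return left, right

frame = [[1, 2, 3, 4]]           # one row, four pixels
left, right = expand_and_split(frame, 8)
print(left, right)               # [[1, 1, 2, 2]] [[3, 3, 4, 4]]
```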
  • In some implementations, in response to the first input, the first electronic device displays the second interface and sends to the second electronic device the image information corresponding to the display interface of the second application.
  • In some implementations, in response to the first input, the first electronic device may instead display the display interface of the second application and send image information corresponding to the first interface to the second electronic device; the second electronic device then displays the first interface in response to receiving that image information.
  • A method for splicing display is provided.
  • The method is applied to a first electronic device that includes a first application and a second application and communicates with a second electronic device through a short-range wireless connection. The method includes: the first electronic device displays a first interface, which is the display interface of the first application; while displaying the first interface, the first electronic device detects a first input of the user, which is an operation of starting the second application; and, in response to the first input, the first electronic device sends to the second electronic device image information corresponding to the display interface of the second application, so that the second electronic device displays the display interface of the second application.
  • Before the first electronic device detects the user's first input, the method further includes: receiving, while displaying the first interface, the notification message sent by the server corresponding to the second application; and, in response to receiving the notification message, displaying a message prompt box that prompts the user about the notification message. The first input is the user's input on the message prompt box.
  • The first input is an operation of the user clicking the message prompt box, or an operation of dragging the message prompt box in a first direction.
  • The first interface includes a first interface element associated with a pop-up window, and the pop-up window is used to prompt the user to open the second application.
  • The method further includes: detecting the user's second input on the first interface element of the first interface; and displaying the pop-up window in response to the second input. The first input is the input on the pop-up window.
  • Before the first electronic device detects the user's first input, the method further includes: the first electronic device receives first indication information sent by the second electronic device, where the first indication information is used to instruct the first electronic device and the second electronic device to perform splicing display; and, in response to receiving the first indication information, the first electronic device sends image information corresponding to the desktop of the first electronic device to the second electronic device, so that the second electronic device displays the desktop of the first electronic device.
  • In some implementations, the first electronic device further includes a third application.
  • The method further includes: the first electronic device receives second indication information sent by the second electronic device, where the second indication information indicates that the second electronic device has detected a third input, and the third input is an input for the third application; in response to receiving the second indication information, the first electronic device sends image information corresponding to the display interface of the third application to the second electronic device, so that the second electronic device displays the display interface of the third application; and the first electronic device displays the first interface in response to the user's operation of opening the first application.
  • In some implementations, the first electronic device further includes a third application.
  • The method further includes: the first electronic device receives second indication information sent by the second electronic device, where the second indication information indicates that the second electronic device has detected a third input, and the third input is an input for the third application; in response to receiving the second indication information, the first electronic device sends image information corresponding to the display interface of the third application to the second electronic device, so that the second electronic device displays the display interface of the third application; and, before sending the image information corresponding to the display interface of the second application to the second electronic device, the first electronic device determines that the user's input on the first interface has been detected within a preset time period and that no third indication information has been received from the second electronic device, where the third indication information indicates that the second electronic device has detected the user's input on the display interface of the third application.
  • Before the first electronic device displays the first interface, the method further includes: the first electronic device displays a second interface; the first electronic device receives fourth indication information sent by the second electronic device, where the fourth indication information is used to instruct the first electronic device and the second electronic device to perform splicing display; in response to receiving the fourth indication information, the first electronic device expands the image information corresponding to the second interface and divides the expanded image information into a first part and a second part, where the first part of the image information is the image information displayed as the first interface; and the first electronic device displays the first interface and sends the second part of the image information to the second electronic device, so that the second electronic device displays the second part of the image information.
  • In some implementations, the method further includes: in response to the first input, the first electronic device displays the second interface and sends to the second electronic device the image information corresponding to the display interface of the second application.
  • An apparatus is provided, comprising: a display unit configured to display a first interface, where the first interface is the display interface of the first application; a detection unit configured to detect a first input of the user while the display unit displays the first interface, where the first input is an operation of starting the second application; and a sending unit configured to, in response to the first input, send image information corresponding to the display interface of the second application to the second electronic device, so that the second electronic device displays the display interface of the second application.
  • An electronic device is provided, comprising: one or more processors; a memory; and one or more computer programs. The one or more computer programs are stored in the memory and comprise instructions which, when executed by the electronic device, cause the electronic device to perform the method in any one of the possible implementations of the second aspect above.
  • A computer program product comprising instructions is provided which, when the computer program product is run on a first electronic device, cause the electronic device to perform the method of the second aspect above.
  • A computer-readable storage medium comprising instructions is provided which, when the instructions are executed on a first electronic device, cause the electronic device to perform the method of the second aspect above.
  • A chip is provided for executing instructions; when the chip runs, it performs the method described in the second aspect above.
  • FIG. 1 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a block diagram of a software structure provided by an embodiment of the present application.
  • FIG. 3 is a set of graphical user interfaces provided by an embodiment of the present application.
  • FIG. 4 is another set of graphical user interfaces provided by an embodiment of the present application.
  • FIG. 5 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 6 is another set of graphical user interfaces provided by an embodiment of the present application.
  • FIG. 7 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 8 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 9 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 10 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 11 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 12 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 13 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 14 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 15 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 16 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 17 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 18 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 19 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 20 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 21 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 22 is another set of graphical user interfaces provided by the embodiments of the present application.
  • FIG. 23 is a schematic structural diagram of a source end device and a sink end device provided by an embodiment of the present application.
  • FIG. 24 is a schematic flowchart of a method for displaying a source-end device and a sink-end device in a full-screen mode provided by an embodiment of the present application.
  • FIG. 25 is a process in which the source-end device according to an embodiment of the present application performs cropping by splitting and displays the cropped canvas on the virtual screen.
  • FIG. 26 is a process in which the source-end device according to an embodiment of the present application performs cropping by masking and displays the cropped canvas on the virtual screen.
  • FIG. 27 is another process in which the source-end device according to an embodiment of the present application performs cropping by splitting and displays the cropped canvas on the virtual screen.
  • FIG. 28 is another process in which the source-end device according to an embodiment of the present application performs cropping by masking and displays the cropped canvas on the virtual screen.
  • FIG. 29 is a schematic flowchart of a method for displaying a source-end device and a sink-end device in a paging mode according to an embodiment of the present application.
  • FIG. 30 is a schematic diagram showing that the desktop of the source end is displayed in a paging mode according to an embodiment of the present application.
  • FIG. 31 is a schematic flowchart of a method for displaying a source-end device and a sink-end device in a parallel mode according to an embodiment of the present application.
  • FIG. 32 is a process of opening a certain active page in the application layer sequence in the parallel mode provided by the embodiment of the present application.
  • FIG. 33 is a schematic flowchart of a method for displaying a source-end device and a sink-end device in a dual application mode according to an embodiment of the present application.
  • FIG. 34 is a schematic flowchart of a method for switching display in multiple modes provided by an embodiment of the present application.
  • FIG. 35 is another schematic flowchart of a method for switching display in multiple modes provided by an embodiment of the present application.
  • FIG. 36 is another schematic flowchart of a method for switching display in multiple modes provided by an embodiment of the present application.
  • FIG. 37 is a schematic flowchart of a method for splicing display provided by an embodiment of the present application.
  • FIG. 38 is a schematic structural diagram of an apparatus provided by an embodiment of the present application.
  • FIG. 39 is another schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and should not be construed as indicating or implying relative importance or implicitly indicating the number of the indicated technical features.
  • A feature defined as "first" or "second" may expressly or implicitly include one or more of that feature.
  • "Plural" means two or more.
  • The methods provided in the embodiments of the present application can be applied to mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA) and other electronic devices; the embodiments of the present application do not impose any restriction on the specific type of electronic device.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
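  • the cache behavior described above (reuse recently used instructions or data instead of fetching them from memory again) can be sketched as a tiny least-recently-used store. This is purely illustrative; `TinyCache` and its API are hypothetical and not part of the application:

```python
from collections import OrderedDict

class TinyCache:
    """Minimal LRU cache sketch: holds recently used items so repeated
    accesses avoid the slower load from main memory."""
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def fetch(self, address, load_from_memory):
        if address in self.store:
            self.hits += 1
            self.store.move_to_end(address)   # mark as most recently used
            return self.store[address]
        self.misses += 1
        value = load_from_memory(address)
        self.store[address] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)    # evict least recently used
        return value

cache = TinyCache(capacity=2)
load = lambda addr: addr * 10                 # stand-in for a slow memory access
cache.fetch(1, load)                          # first access: miss, loads from "memory"
cache.fetch(1, load)                          # second access: hit, served from cache
print(cache.hits, cache.misses)               # prints: 1 1
```

The second access is served directly from the cache, which is exactly the "repeated accesses are avoided" benefit the paragraph describes.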
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel forms.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when shooting, the shutter is opened and light is transmitted through the lens to the camera photosensitive element, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
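  • the YUV-to-RGB step of such a format conversion can be illustrated as follows. The application does not specify which conversion matrix the DSP uses; the BT.601 full-range equations below are an illustrative assumption:

```python
def yuv_to_rgb(y, u, v):
    """Convert one YUV pixel (0-255 per channel, chroma centered at 128)
    to RGB using the BT.601 full-range equations. The coefficients are an
    illustrative choice; the DSP's actual matrix may differ."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))  # keep channels in 0-255
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # neutral gray -> (128, 128, 128)
print(yuv_to_rgb(255, 128, 128))  # full luma, no chroma -> (255, 255, 255)
```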
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may consist of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
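  • the intensity-dependent dispatch described above can be sketched as follows. The threshold value and function name are hypothetical; the application only refers to a "first pressure threshold" without specifying its value:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized threshold

def handle_message_icon_touch(intensity):
    """Dispatch a touch on the short-message application icon by its
    pressure intensity, following the behavior described above: a press
    below the first pressure threshold views messages, while a press at
    or above the threshold creates a new message."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_short_message"

print(handle_message_icon_touch(0.2))  # light touch -> view_short_message
print(handle_message_icon_touch(0.8))  # firm touch -> create_short_message
```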
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
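  • the pressure-to-altitude computation can be illustrated with the standard international barometric formula. This is an assumption for illustration; the application does not specify which algorithm the device uses:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude in metres from measured air pressure using the
    international barometric formula (an illustrative model, not the
    device's actual algorithm)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(altitude_from_pressure(1013.25)))  # standard sea-level pressure -> 0 m
print(round(altitude_from_pressure(899.0)))    # lower pressure -> roughly 1000 m
```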
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D; further, features such as automatic unlocking upon opening the flip cover can be set according to the detected open or closed state of the holster or flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
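  • the portrait/landscape switching use case above can be sketched by comparing the gravity components along the device axes. This is a deliberately simplified sketch (two axes, hypothetical function name), not the device's actual posture-recognition algorithm:

```python
def screen_orientation(ax, ay):
    """Infer portrait vs. landscape from the gravity components measured
    along the device's x axis (short edge) and y axis (long edge), in
    m/s^2. If gravity pulls mostly along the long edge, the device is
    being held upright (portrait); otherwise it is on its side."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(screen_orientation(0.1, 9.7))  # gravity along the long edge -> portrait
print(screen_orientation(9.7, 0.3))  # gravity along the short edge -> landscape
```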
  • the distance sensor 180F is used to measure distance. The electronic device 100 can measure distance through infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
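  • the reflected-light decision described above reduces to a threshold test. The threshold value and function names below are hypothetical, chosen only to illustrate the ear-detection scenario:

```python
NEAR_THRESHOLD = 0.6  # hypothetical normalized reflected-light level

def object_nearby(reflected_light):
    """Decide whether an object is near the device: sufficient reflected
    infrared light means an object is close, insufficient means none."""
    return reflected_light >= NEAR_THRESHOLD

def screen_should_stay_on(reflected_light):
    # turn the screen off while the phone is held to the ear, to save power
    return not object_nearby(reflected_light)

print(screen_should_stay_on(0.9))  # phone at the ear -> False (screen off)
print(screen_should_stay_on(0.1))  # nothing nearby -> True (screen stays on)
```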
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
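  • the adaptive brightness adjustment described above can be sketched as a mapping from measured illuminance to a backlight level. The logarithmic mapping and all constants below are illustrative assumptions, not values from the application:

```python
import math

def backlight_level(lux, min_level=10, max_level=255):
    """Map ambient brightness in lux to a backlight level on a logarithmic
    scale, roughly matching how human brightness perception works:
    1 lux (dark room) maps to the minimum, 10000 lux (direct sunlight)
    maps to the maximum."""
    if lux <= 1:
        return min_level
    frac = min(1.0, math.log10(lux) / 4.0)  # 4 decades: 1 -> 10000 lux
    return round(min_level + frac * (max_level - min_level))

print(backlight_level(1))      # dark room -> 10 (minimum backlight)
print(backlight_level(10000))  # bright sunlight -> 255 (maximum backlight)
```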
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • in other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by the low temperature.
  • in still other embodiments, when the temperature is lower than yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
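  • the temperature processing strategy above amounts to a threshold dispatch. The concrete threshold values are hypothetical; the application only states that thresholds exist without giving numbers:

```python
HIGH_TEMP = 45.0      # hypothetical thresholds in degrees Celsius
LOW_TEMP = 0.0
VERY_LOW_TEMP = -10.0

def thermal_policy(temp_c):
    """Pick a thermal action from the reported temperature, following the
    strategy described above: throttle the processor when hot, heat the
    battery when cold, and boost the battery output voltage when very
    cold; otherwise operate normally."""
    if temp_c > HIGH_TEMP:
        return "reduce_processor_performance"
    if temp_c < VERY_LOW_TEMP:
        return "boost_battery_output_voltage"
    if temp_c < LOW_TEMP:
        return "heat_battery"
    return "normal_operation"

print(thermal_policy(50))   # hot -> reduce_processor_performance
print(thermal_policy(-5))   # cold -> heat_battery
print(thermal_policy(-20))  # very cold -> boost_battery_output_voltage
```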
  • Touch sensor 180K also called “touch panel”.
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • a SIM card can be connected to or separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 100 adopts an embedded SIM (embedded-SIM, eSIM) card, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the telephone cards in the embodiments of the present application include but are not limited to SIM cards, eSIM cards, universal subscriber identity modules (USIM), universal integrated circuit cards (UICC), and the like.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, the Android runtime and system libraries, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 100, for example, the management of call status (connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (media library), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • FIG. 3 is a set of graphical user interfaces (graphical user interface, GUI) provided by the embodiments of the present application.
  • the mobile phone A displays a playback interface of the video application, and the video playback screen 301 is displayed on the playback interface, and the mobile phone B displays the desktop of the mobile phone B at this time.
  • the networking methods include but are not limited to access point (access point, AP) networking and peer-to-peer (peer-to-peer, P2P) networking.
  • the AP networking refers to devices under the same AP (for example, a home Wi-Fi router), which can communicate with each other through the AP devices, thereby forming a many-to-many networking.
  • mobile phone A and mobile phone B may be located under the same home router.
  • mobile phone A may use the received signal strength indication (RSSI) technology to calculate the distance between mobile phone A and mobile phone B according to the strength of the received signal. When the distance is less than or equal to a preset distance, mobile phone A and mobile phone B can perform AP networking.
  • Wi-Fi direct Also known as Wi-Fi peer to peer (Wi-Fi P2P), it is a point-to-point connection method. It enables multiple Wi-Fi devices to communicate with each other in a peer-to-peer network (P2P network) without an access point (AP).
  • One of the stations (station, STA) can act as an AP in the traditional sense and is called a group owner (GO); the other STA, called a group client (GC), can connect to the GO just as it would connect to an AP.
  • one STA can play the role of GO (ie, act as AP), and other STAs can play the role of GC.
  • When one device is close to another device, the device on the left may be the GO by default, and the device on the right may be the GC.
  • phone A can be a GO and phone B can be a GC.
  • When a user's swipe to the right is detected on one device, that device can act as a GC and select another device on its left as the GO; or, when a device detects a user's swipe to the left, that device can act as a GC and select another device on its right as the GO.
  • The devices first discover each other in the discovery phase; after the discovery, the establishment of the P2P connection can be triggered.
  • When mobile phone B is close to mobile phone A, mobile phone A can use the RSSI technology to calculate the distance between mobile phone A and mobile phone B according to the RSSI. When the distance is less than or equal to the preset distance, mobile phone A and mobile phone B can perform P2P networking.
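The embodiments only state that the distance is derived from the RSSI; one common way to do this (an assumption here, not specified by the application) is the log-distance path-loss model. The sketch below, with hypothetical calibration constants `tx_power_dbm` and `path_loss_exponent`, illustrates the idea:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exponent=2.0):
    """Estimate distance (meters) from RSSI via the log-distance path-loss
    model; tx_power_dbm is the calibrated RSSI at 1 meter."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def should_network(rssi_dbm, preset_distance_m=0.3):
    """Trigger AP/P2P networking when the estimated distance is within
    the preset distance, as described in the embodiment."""
    return rssi_to_distance(rssi_dbm) <= preset_distance_m
```

Stronger signals map to shorter estimated distances, so moving the phones together eventually makes `should_network` return true and networking can be triggered.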
  • Wi-Fi P2P technology is the foundation of upper-layer services.
  • P2P applications built on P2P mainly include Miracast applications and WLAN direct connection applications.
  • In a Miracast application scenario, an electronic device that supports P2P can scan for, discover, and connect to a large-screen device that supports P2P, and then directly send the videos, pictures and other resources of the electronic device to the large-screen device for display.
  • With P2P technology, the experience of Wi-Fi technology is greatly enriched.
  • After mobile phone B detects that the user slides right on the desktop, mobile phone B sends indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full-screen mode.
  • mobile phone A can double the size of the canvas displayed in the current playback interface.
  • Mobile phone A can crop the enlarged canvas to obtain two areas (area 302 and area 303) of the same size.
  • Mobile phone A can display the canvas shown in area 302 through the display screen and project the canvas of area 303 to mobile phone B, so that mobile phone B can display the canvas of area 303 through the display screen.
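The enlarge-and-crop step can be sketched as follows; the function name and the assumption that both phones have identical portrait screens placed side by side are illustrative, not taken from the embodiment:

```python
def split_canvas(width, height, num_devices=2):
    """Enlarge the source canvas to num_devices times the screen width and
    return one equally sized crop rectangle (x, y, w, h) per device."""
    return [(i * width, 0, width, height) for i in range(num_devices)]

# area 302 stays on mobile phone A, area 303 is projected to mobile phone B
regions = split_canvas(1080, 2340)
# regions == [(0, 0, 1080, 2340), (1080, 0, 1080, 2340)]
```

Each rectangle is rendered from the same doubled canvas, so the two screens together show one continuous picture.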
  • the display interface of the mobile phone B further includes an exit control.
  • When mobile phone B detects that the user clicks the exit control, mobile phone B can send indication information to mobile phone A, where the indication information is used to instruct mobile phone A to exit the full-screen mode.
  • the exit control may be drawn by mobile phone B.
  • the exit control can be drawn and displayed on the display screen of mobile phone B.
  • When mobile phone C is close to mobile phone A, mobile phone A, mobile phone B and mobile phone C can perform AP networking or P2P networking.
  • the mobile phone C may send indication information to the mobile phone A, where the indication information is used to indicate that the mobile phone C wishes to enter the full screen mode.
  • When mobile phone C is close to mobile phone B, mobile phone B can determine that the distance between mobile phone C and mobile phone B is less than or equal to the preset distance by means of RSSI ranging, and mobile phone A, mobile phone B and mobile phone C can perform AP networking, or mobile phone A, mobile phone B and mobile phone C can perform P2P networking.
  • mobile phone A and mobile phone B can determine the distance between mobile phone A and mobile phone C and the distance between mobile phone B and mobile phone C respectively by means of RSSI ranging.
  • mobile phone A, mobile phone B and mobile phone C can perform AP Networking, or, mobile phone A, mobile phone B, and mobile phone C can perform P2P networking.
  • mobile phone A can enlarge the canvas displayed in the current playback interface by 2 times (to three times its original size).
  • Mobile phone A can crop the canvas enlarged by 2 times to obtain three areas (area 304 , area 305 and area 306 ) of the same size.
  • Mobile phone A can display the canvas shown in area 304 through the display screen, and project the canvas shown in area 305 to mobile phone B and the canvas shown in area 306 to mobile phone C, so that mobile phone B displays the canvas shown in area 305 through its display screen, and mobile phone C displays the canvas shown in area 306 through its display screen.
  • the display interface of the mobile phone C further includes an exit control.
  • When mobile phone C detects that the user clicks the exit control, mobile phone C can send indication information to mobile phone A, where the indication information is used to instruct mobile phone A to exit the full-screen mode.
  • the exit control may be drawn by mobile phone C.
  • mobile phone C can draw the exit control and display it on the display screen of mobile phone C.
  • both the mobile phone B and the mobile phone C may include an exit control.
  • Through the near-field wireless connection, multiple devices can be spliced into a larger screen without adding additional hardware devices.
  • The source device can dynamically modify the size of the display canvas and distribute it to each sink device for display, which helps to improve the user experience.
  • Mobile phone A displays the playing interface of the video application, the video playing screen is displayed on the playing interface, and mobile phone B displays the desktop of mobile phone B at this time.
  • mobile phone A and mobile phone B can form a network through a near-field wireless connection. After mobile phone B detects that the user slides right on the desktop, mobile phone B sends indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full screen mode.
  • mobile phone A can double the size of the canvas displayed in the current playback interface. According to the distance between mobile phone A and mobile phone B, mobile phone A can crop the doubled canvas to obtain the areas shown as area 401 , area 402 and area 403 . Mobile phone A can display the canvas shown in area 401 through the display screen, and project the canvas shown in area 403 to mobile phone B, so that mobile phone B can display the canvas shown in area 403 through the display screen.
  • the pixel value of the area 402 is determined according to the distance between the mobile phone A and the mobile phone B.
  • A possible implementation is as follows: suppose the physical width of the screen of mobile phone A is X (for example, 6cm) and the screen resolution is M(1080)*N(2340). The physical width of area 402 may be the distance between mobile phone A and mobile phone B, measured by mobile phone A according to RSSI; or it may be the sum of the physical bezel widths of mobile phone A and mobile phone B when they are fitted together.
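With the example numbers above (a 6cm-wide screen at 1080 pixels), the pixel width of area 402 follows from the physical gap by simple proportion. A minimal sketch, assuming the gap has already been measured in centimeters (names are illustrative):

```python
def gap_width_pixels(gap_cm, screen_width_cm=6.0, screen_width_px=1080):
    """Convert the physical gap between the two phones into the pixel width
    of the discarded strip (area 402), using mobile phone A's pixel density."""
    px_per_cm = screen_width_px / screen_width_cm  # 180 px/cm in this example
    return round(gap_cm * px_per_cm)
```

For instance, a 0.5cm gap corresponds to `gap_width_pixels(0.5)` = 90 pixels of canvas that are cropped out rather than displayed, so the picture appears continuous across the physical gap.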
  • If the distance between mobile phone A and mobile phone B is greater than a preset distance (for example, 3cm), mobile phone A can use the segmentation method to crop the enlarged canvas; if the distance between mobile phone A and mobile phone B is less than or equal to the preset distance, mobile phone A can use a mask method to crop the enlarged canvas.
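The choice between the two cropping methods reduces to a threshold comparison; this small helper (names are illustrative) captures the rule stated in the embodiment:

```python
def crop_method(distance_cm, preset_cm=3.0):
    """Pick how to crop the enlarged canvas: segmentation when the phones
    are clearly apart, mask when they are fitted together."""
    return "segmentation" if distance_cm > preset_cm else "mask"
```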
  • When mobile phone C is close to mobile phone B, mobile phone A, mobile phone B, and mobile phone C can perform AP networking or P2P networking.
  • When mobile phone C detects an operation of the user swiping right, mobile phone C may send indication information to mobile phone A, where the indication information is used to indicate that mobile phone C wishes to enter the full-screen mode.
  • the mobile phone C can also indicate the distance between the mobile phone B and the mobile phone C to the mobile phone A.
  • mobile phone A can enlarge the canvas displayed in the current playback interface by 2 times.
  • Mobile phone A can crop the canvas enlarged by 2 times according to the distance between mobile phone A and mobile phone B and the distance between mobile phone B and mobile phone C.
  • Mobile phone A can crop the enlarged canvas into area 404 , area 405 , area 406 , area 407 , and area 408 .
  • the pixel values of the area 405 and the area 407 can be determined by the distance between the mobile phone A and the mobile phone B and the distance between the mobile phone B and the mobile phone C.
  • Mobile phone A can display the canvas shown in area 404 through the display screen, and project the canvas shown in area 406 to mobile phone B and the canvas shown in area 408 to mobile phone C, so that mobile phone B displays the canvas shown in area 406 through its display screen, and mobile phone C displays the canvas shown in area 408 through its display screen.
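The three-device layout alternates display regions and discarded gap strips. The following sketch (function and variable names are illustrative) computes the horizontal offsets of areas 404 to 408 once the two gap widths are known in pixels:

```python
def split_with_gaps(screen_w, screen_h, gaps_px):
    """Return (kind, x, y, w, h) tuples left to right: one 'display' region
    per device, with a cropped-out 'gap' strip between neighbouring devices."""
    regions, x = [], 0
    num_devices = len(gaps_px) + 1
    for i in range(num_devices):
        regions.append(("display", x, 0, screen_w, screen_h))
        x += screen_w
        if i < len(gaps_px):
            regions.append(("gap", x, 0, gaps_px[i], screen_h))
            x += gaps_px[i]
    return regions

# areas 404..408 for phones A, B, C, with hypothetical gaps of 90 and 60 px
layout = split_with_gaps(1080, 2340, [90, 60])
```

Only the "display" rectangles are shown or projected; the "gap" strips are dropped so the content lines up across the physical seams.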
  • FIG. 5 shows another set of GUIs provided by the embodiments of the present application.
  • In response to mobile phone C detecting that the user clicks the exit control, mobile phone C can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone C exits the full-screen mode.
  • Mobile phone A can adjust the canvas from being enlarged by 2 times to being enlarged by 1 time (that is, doubled).
  • Mobile phone A can crop the enlarged canvas to obtain two areas of the same size (for example, the area 302 and the area 303 shown in (b) in FIG. 3 ).
  • Mobile phone A can display the canvas shown in area 302 on the display screen and project the canvas shown in area 303 to mobile phone B, so that mobile phone B can display the canvas in area 303 through the display screen.
  • the mobile phone C exits the full-screen mode, and the desktop of the mobile phone C is displayed.
  • the GUI shown in (c) of FIG. 5 can be displayed.
  • In response to mobile phone B detecting that the user clicks the exit control, mobile phone B can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B exits the full-screen mode.
  • Mobile phone A can adjust the canvas from being doubled back to its original, unenlarged size. Therefore, mobile phone A can display the video playback interface, and mobile phone B exits the full-screen mode and displays the desktop of mobile phone B.
  • If mobile phone A processed the canvas enlarged by 2 times using the segmentation method, then when mobile phone A receives the indication that mobile phone C exits the full-screen mode, it can continue to process the doubled canvas using the segmentation method.
  • If mobile phone A processed the canvas enlarged by 2 times using the mask method, then when mobile phone A receives the indication that mobile phone C exits the full-screen mode, it can continue to process the doubled canvas using the mask method.
  • FIG. 6 shows another set of GUIs provided by the embodiments of the present application.
  • the distance between mobile phone A and mobile phone B is greater than the preset distance, mobile phone A displays the desktop of mobile phone A, and mobile phone B displays the desktop of mobile phone B.
  • When mobile phone A and mobile phone B are close and the distance between mobile phone A and mobile phone B is less than or equal to the preset distance, mobile phone A and mobile phone B display the GUI as shown in (b) of FIG. 6 .
  • control 601 is displayed on the desktop of mobile phone A
  • control 602 is displayed on the desktop of mobile phone B.
  • a GUI as shown in (c) of FIG. 6 is displayed.
  • In response to mobile phone B detecting the operation of the user dragging the control 602 to the right, mobile phone B sends indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full-screen mode.
  • Mobile phone A can double the canvas of the current interface (the desktop of mobile phone A).
  • Phone A can crop the enlarged canvas to obtain two areas of the same size (the area on the left and the area on the right).
  • Mobile phone A can display the canvas shown in the area on the left through the display screen (including the network standard, signal strength and Wi-Fi connection status in the status bar, the time (08:08) and date information (August 6, Thursday), and the icons of App1, App2, App5, App6, App9 and App10), and project the canvas shown in the area on the right (including the battery level information and time information (08:08) in the status bar, the weather information (35°C), the location information (Xi'an), and the icons of App3, App4, App7, App8, App11 and App12) onto mobile phone B, so that mobile phone B displays the canvas of the area on the right through its display screen. Meanwhile, the display of mobile phone B may also include an icon 603 of the parallel mode, an icon 604 of the dual application mode, an icon 605 of the paging mode, and an exit control 606 .
  • the icon 603 of the parallel mode, the icon 604 of the dual application mode, the icon 605 of the paging mode, and the exit control 606 may be drawn by the mobile phone B, and the mobile phone B may, after receiving the screen projection content of the mobile phone A, display the content in the projection mode. Add the above icons and controls to the screen content.
  • the icon 603 of the parallel mode, the icon 604 of the dual application mode, the icon 605 of the paging mode, and the exit control 606 may also be drawn by mobile phone A and projected onto mobile phone B for display.
  • Alternatively, the icon 603 of the parallel mode, the icon 604 of the dual application mode, and the icon 605 of the paging mode may be drawn by mobile phone A and displayed on mobile phone A, and the exit control 606 may be drawn by mobile phone B and displayed on mobile phone B.
  • App2 may be a video App.
  • When mobile phone B detects the operation of the user clicking the control 607, mobile phone B can send a touch event and coordinate point information to mobile phone A, where the touch event is used to indicate that mobile phone B has detected the user's click operation, and the coordinate point information is used to indicate the coordinate point detected by mobile phone B when the user clicks.
  • Mobile phone A can determine that the user has clicked the full-screen viewing control on mobile phone B according to the touch event and the coordinate point information, so that the video window is displayed on mobile phone A and mobile phone B in full screen. It should be understood that, for the process of full-screen display of the video window on the mobile phone A and the mobile phone B, reference may be made to the descriptions in the foregoing embodiments, which are not repeated here for brevity.
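Before mobile phone A can hit-test the click, the coordinate reported by mobile phone B must be translated into the enlarged canvas's coordinate space. A minimal sketch of that translation follows; the offset bookkeeping is an assumption about one possible implementation, not a detail given by the embodiment:

```python
def map_remote_touch(local_x, local_y, device_index, screen_w, gaps_px=()):
    """Translate a sink device's local touch point into the source device's
    enlarged-canvas coordinates by adding the device's horizontal offset."""
    offset = device_index * screen_w + sum(gaps_px[:device_index])
    return (local_x + offset, local_y)

# a tap at (100, 200) on mobile phone B (index 1, no gap) maps to (1180, 200)
point = map_remote_touch(100, 200, 1, 1080)
```

After the translation, mobile phone A can dispatch the event to whatever control occupies that canvas coordinate, as if the tap had happened locally.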
  • Mobile phone A shown in FIG. 6 can crop the doubled canvas using the segmentation method, or mobile phone A can also use the mask method to crop the doubled canvas. For the specific implementation process, reference may be made to the descriptions in the foregoing embodiments; for brevity, details are not repeated here.
  • FIG. 7 shows another set of GUIs provided by the embodiments of the present application.
  • the GUI shows the process of switching from full screen mode to dual application mode.
  • cell phone A and cell phone B are in full screen mode.
  • the GUI shown in (b) of FIG. 7 is displayed.
  • In response to mobile phone B detecting that the user has clicked the icon of the dual application mode, mobile phone B can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B has detected that the user clicked the icon of the dual application mode.
  • mobile phone A displays the desktop of mobile phone A, and mobile phone A sends the canvas corresponding to the desktop of mobile phone A to mobile phone B, so that mobile phone B also displays the desktop of mobile phone A.
  • the GUI shown in (c) of FIG. 7 is displayed.
  • App3 may be a video App
  • App4 may be a photo App
  • In response to mobile phone B detecting that the user clicks the icon of App4, mobile phone B can send a touch event and coordinate point information to mobile phone A, where the touch event is used to indicate that mobile phone B has detected the user's click operation, and the coordinate point information is used to indicate the coordinate point detected by mobile phone B when the user clicks.
  • Mobile phone A can determine that the user has clicked the icon of App4 on mobile phone B according to the coordinate point information. Therefore, the mobile phone A can start the App4 in the background, and project the display interface of the App4 to the mobile phone B, so that the mobile phone B displays the interface of the App4.
  • When mobile phone A and mobile phone B are in the dual application mode, the user can start different applications of mobile phone A on mobile phone A and mobile phone B respectively, which helps to improve the user experience.
  • For example, the user can use mobile phone A and mobile phone B to open shopping App1 and shopping App2 respectively, so as to compare the prices of products; or, the user can chat through a social App on mobile phone A while watching a video through a video App on mobile phone B.
  • FIG. 8 shows another set of GUIs provided by the embodiments of the present application.
  • the GUI shows the process of switching from full screen mode to paging mode.
  • cell phone A and cell phone B are in full screen mode.
  • the GUI shown in (b) of FIG. 8 is displayed.
  • In response to mobile phone B detecting that the user has clicked the icon of the paging mode, mobile phone B can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B has detected that the user clicked the icon of the paging mode.
  • Mobile phone A can switch from the full-screen mode to the paging mode according to the instruction information.
  • the mobile phone A displays the desktop of the mobile phone A, wherein the desktop of the mobile phone A includes three desktop pages, and the mobile phone A currently displays the first desktop page.
  • Mobile phone A projects the canvas corresponding to the second desktop page of mobile phone A to mobile phone B, so that mobile phone B displays the second desktop page of mobile phone A.
  • the GUI shown in (c) of FIG. 8 is displayed.
  • In response to mobile phone A detecting that the user clicks the icon of App5, mobile phone A can display the display interface of App5, where the display interface is the display interface of document 1, and document 1 includes 8 pages.
  • Mobile phone A can display the content of page 1 of document 1, and mobile phone A can project the content of page 2 of document 1 to mobile phone B, so that mobile phone B can display the content of page 2 of document 1.
  • When mobile phone A detects the user's sliding operation on the display screen (for example, swiping up), mobile phone A can display the content of page 3 of document 1, and at the same time mobile phone A can project the content of page 4 of document 1 to mobile phone B, so that mobile phone B displays the content of page 4 of document 1.
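In paging mode, each swipe advances the document by one page per spliced device. The helper below (illustrative, using 1-based page numbers) reproduces the page assignment described above:

```python
def pages_shown(start_page, num_devices, num_swipes, total_pages):
    """Pages displayed across the spliced devices after num_swipes swipes:
    each swipe advances the window by num_devices pages, clamped at the end."""
    first = start_page + num_swipes * num_devices
    return [min(first + i, total_pages) for i in range(num_devices)]

# document 1 has 8 pages: initially mobile phone A shows page 1 and mobile
# phone B shows page 2; after one upward swipe, A shows page 3 and B page 4
```

Advancing two pages per swipe is what makes browsing faster than on a single phone, since the user sees twice as much content per gesture.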
  • When mobile phone A and mobile phone B are in the paging mode, the efficiency with which the user browses the interface content is improved, thereby helping to improve the user's experience.
  • FIG. 9 shows another set of GUIs provided by the embodiments of the present application.
  • the GUI shows the process of switching from full screen mode to parallel mode.
  • mobile phone A and mobile phone B are in full screen mode.
  • the GUI shown in (b) of FIG. 9 is displayed.
  • In response to mobile phone B detecting that the user clicks the icon of the parallel mode, mobile phone B can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B has detected that the user clicked the icon of the parallel mode.
  • the mobile phone A can determine to switch from the full-screen mode to the parallel mode according to the indication information.
  • Mobile phone A displays the desktop of mobile phone A, and mobile phone B displays the desktop of mobile phone B.
  • the GUI shown in (c) of FIG. 9 is displayed.
  • Mobile phone B can display the desktop of mobile phone B; or, when mobile phone A and mobile phone B enter the parallel mode, mobile phone A can project the canvas corresponding to the desktop of mobile phone A to mobile phone B, so that mobile phone B can display the desktop of mobile phone A.
  • App6 is a social application App.
  • Mobile phone A displays a chat list between the user and multiple contacts, and mobile phone B displays the desktop of mobile phone B at this time.
  • a GUI as shown in (d) of FIG. 9 is displayed.
  • Mobile phone A displays the chat list of the user and multiple contacts, and mobile phone A can project the chat interface between the user and the contact's father onto mobile phone B, so that mobile phone B displays the chat interface between the user and the contact's father.
  • the chat interface includes a text input box, a sending control, a control for sending a local photo, a control for sending a photo taken in real time, and a video calling control.
  • the GUI shown in (e) of FIG. 9 is displayed.
  • In response to mobile phone B detecting the operation of the user clicking the video call control, mobile phone B sends a touch event and coordinate point information to mobile phone A, where the touch event is used to indicate that mobile phone B has detected the user's click operation, and the coordinate point information is used to indicate the coordinate point detected by mobile phone B when the user clicks.
  • Mobile phone A may determine that the user has clicked the video call control on mobile phone B according to the coordinate point information.
  • Mobile phone A may display an interface for requesting the contact father to answer the video call.
  • Mobile phone A can display the video call interface between the user and the contact's father, and mobile phone B can display the chat interface between the user and the contact's father.
  • When mobile phone B detects the text content input by the user, mobile phone B can send the text content to mobile phone A, so that the reply to the message is actually completed on mobile phone A.
  • Mobile phone A can then project the updated chat interface between the user and the contact's father onto mobile phone B, so that mobile phone B displays the updated chat interface between the user and the contact's father.
  • the user can send a text message (eg, "show you the picture I took last night") as well as a picture to the contact dad through the chat interface.
  • the embodiment of the present application provides a method for displaying the upper and lower activity pages of an application program in parallel, and the application program can adjust to display two activity pages at the same time in this mode, thereby bringing a better user experience when splicing multiple devices.
  • FIG. 10 shows another set of GUIs provided by the embodiments of the present application.
  • control 1001 is displayed on the desktop of mobile phone A, and control 1002 is displayed on the desktop of mobile phone B.
  • control 1001 is displayed on the desktop of mobile phone A
  • control 1002 is displayed on the desktop of mobile phone B.
  • a GUI as shown in (b) of FIG. 10 is displayed.
  • In response to mobile phone A detecting the operation of the user dragging the control 1001 to the left, mobile phone A sends indication information to mobile phone B, where the indication information is used to indicate that mobile phone A wishes to enter the full-screen mode.
  • Mobile phone B can double the canvas of the current interface (the desktop of mobile phone B).
  • Mobile phone B can crop the enlarged canvas to obtain two areas of the same size (the area on the left and the area on the right).
  • Mobile phone B can display the canvas shown in the area on the right through the display screen (including the battery level information, the time (08:08) and date information (August 6, Thursday), the location information (High-tech Zone), and the icons of App15, App16, App19, App20, App23 and App24), and project the canvas shown in the area on the left (including the network standard and Wi-Fi connection status in the status bar, the weather information (35°C), and the icons of App13, App14, App17, App18, App21 and App22) onto mobile phone A, so that mobile phone A displays the canvas shown in the area on the left through its display screen. Meanwhile, the display of mobile phone A may also include an icon 1003 of the parallel mode, an icon 1004 of the dual application mode, an icon 1005 of the paging mode, and an exit control 1006 .
  • the icon 1003 in the parallel mode, the icon 1004 in the dual application mode, the icon 1005 in the paging mode, and the exit control 1006 may be drawn by the mobile phone A and displayed on the display screen of the mobile phone A.
  • the icon 1003 for the parallel mode, the icon 1004 for the dual application mode, the icon 1005 for the paging mode, and the exit control 1006 may also be drawn by mobile phone B and sent by mobile phone B to mobile phone A for display.
  • the icon 1003 for the parallel mode, the icon 1004 for the dual application mode, and the icon 1005 for the paging mode may be displayed on mobile phone B, and the exit control 1006 may be displayed on mobile phone A.
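The canvas expansion and cropping step described above can be sketched in code. The following is a minimal illustrative Python sketch of the geometry only; the function name, pixel sizes, and return format are assumptions for illustration, not part of the claimed embodiment:

```python
# Illustrative sketch of the full-screen splicing geometry described above:
# the source device expands the current canvas to twice its width, crops it
# into two areas of the same size, keeps one area for its own display, and
# projects the other area onto the peer device. All names are hypothetical.

def split_for_full_screen(width, height):
    """Return the crop rectangles (x, y, w, h) of the left and right areas
    of a canvas expanded to twice its original width."""
    expanded_width = width * 2      # canvas expanded to twice its size
    half = expanded_width // 2      # two areas of the same size
    left_area = (0, 0, half, height)
    right_area = (half, 0, half, height)
    return left_area, right_area

# Example: a 1080 x 2340 desktop canvas
left, right = split_for_full_screen(1080, 2340)
print(left)   # (0, 0, 1080, 2340)   -> shown on the left-hand device
print(right)  # (1080, 0, 1080, 2340) -> projected to the right-hand device
```

Each area has the same size as the original screen, which is why the spliced result fills both displays without scaling.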
  • FIG. 11 shows another set of GUIs in the embodiment of the present application.
  • control 1101 is displayed on the desktop of mobile phone A
  • control 1102 is displayed on the desktop of mobile phone B.
  • the mobile phone B detects the user's operation of dragging the control 1102 to the right
  • a GUI as shown in (b) of FIG. 11 is displayed.
  • in response to mobile phone B detecting the user's operation of dragging the control 1102 to the right, mobile phone B can display the icon of the full-screen mode, the icon of the dual application mode, the icon of the parallel mode, and the icon of the paging mode on the display screen. If the user wants mobile phone A and mobile phone B to enter the full-screen mode, the user can drag the control 1102 to the upper right.
  • when mobile phone B detects that the user drags the control 1102 until it partially or completely coincides with the icon corresponding to the full-screen mode, mobile phone B can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full-screen mode.
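The "partially or completely coincides" check described above can be illustrated with a standard rectangle-intersection test. This is a hypothetical sketch (the function name and coordinates are made up), not the patent's actual implementation:

```python
def rects_overlap(a, b):
    """Return True if rectangles a and b, each given as (x, y, w, h),
    partially or completely overlap."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    # They overlap unless one rectangle lies entirely to one side of the other.
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

# Dragged control vs. the full-screen mode icon (coordinates are illustrative)
control = (90, 40, 60, 60)
full_screen_icon = (120, 60, 48, 48)
print(rects_overlap(control, full_screen_icon))  # True -> send indication information
```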
  • After receiving the indication information, mobile phone A can expand the canvas of the current interface (the desktop of mobile phone A) to twice its original size. Mobile phone A can crop the enlarged canvas to obtain two areas of the same size (the area on the left and the area on the right).
  • Mobile phone A can display the canvas shown in the area on the left through its display screen (including the network standard in the status bar, the Wi-Fi connection status, the time (08:08) and date information (August 6, Thursday), and the icons of App1, App2, App5, App6, App9 and App10), and project the canvas shown in the area on the right (including the battery level information, the weather information (35°C), and the icons of App3, App4, App7, App8, App11 and App12) onto mobile phone B, so that mobile phone B displays the canvas of the area on the right through its display screen.
  • the icon of the parallel mode, the icon of the dual application mode, the icon of the paging mode, and the exit control can also be displayed on the mobile phone B.
  • when mobile phone B detects that the user drags the control 1102 to an area that partially overlaps with the icon of the full-screen mode, mobile phone B sends the indication information to mobile phone A.
  • the mobile phone A shown in FIG. 11 can crop the doubled canvas by splitting it, or mobile phone A can also crop the doubled canvas by masking.
  • FIG. 12 shows another set of GUIs in the embodiment of the present application.
  • control 1201 is displayed on the desktop of mobile phone A
  • control 1202 is displayed on the desktop of mobile phone B.
  • a GUI as shown in (b) of FIG. 12 is displayed.
  • in response to mobile phone A detecting the user's operation of dragging the control 1201 to the left, mobile phone B can display the icon of the full-screen mode, the icon of the dual application mode, the icon of the parallel mode, and the icon of the paging mode on the display screen. If the user wants mobile phone A and mobile phone B to enter the full-screen mode, the user can drag the control 1201 to the upper left.
  • when mobile phone A detects that the user drags the control 1201 until it partially or completely overlaps with the icon of the full-screen mode, mobile phone A sends indication information to mobile phone B, where the indication information is used to indicate that mobile phone A wishes to enter the full-screen mode.
  • After receiving the indication information, mobile phone B can expand the canvas of the current interface (the desktop of mobile phone B) to twice its original size. Mobile phone B can crop the enlarged canvas to obtain two areas of the same size (the area on the left and the area on the right).
  • Mobile phone B can display the canvas shown in the area on the right through its display screen (including the battery level information, the time (08:08) and date information (August 6, Thursday), the location information (High-tech Zone), and the icons of App15, App16, App19, App20, App23 and App24), and project the canvas shown in the area on the left (including the network standard in the status bar, the Wi-Fi connection status, the weather information (35°C), and the icons of App13, App14, App17, App18, App21 and App22) onto mobile phone A, so that mobile phone A displays the canvas shown in the area on the left through its display screen.
  • the icon of the parallel mode, the icon of the dual application mode, the icon of the paging mode, and the exit control can also be displayed on the mobile phone A.
  • when mobile phone A detects that the user drags the control 1201 to an area that partially overlaps with the icon of the full-screen mode, mobile phone A can send the indication information to mobile phone B.
  • FIG. 13 and FIG. 14 show the process in which mobile phone A and mobile phone B automatically enter the full-screen mode, and the process in which mobile phone A and mobile phone B automatically exit the full-screen mode.
  • when the distance between mobile phone A and mobile phone B is less than or equal to the preset distance, mobile phone B sends indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full-screen mode.
  • the mobile phone A can expand the canvas of the current interface (the desktop of mobile phone A) to twice its original size.
  • Mobile phone A can crop the doubled canvas, so that mobile phone A and mobile phone B can perform the full-screen display.
  • mobile phone A and mobile phone B may, by default, use the leftmost device as the source device and the device on the right as the sink device.
  • mobile phone A and mobile phone B can automatically exit the full-screen mode, or mobile phone A can stop sending the cropped canvas to mobile phone B.
  • Mobile phone A can display the desktop of mobile phone A, and mobile phone B can display the desktop of mobile phone B.
  • when mobile phone A detects that the distance between mobile phone A and mobile phone B is greater than the preset distance, mobile phone A and mobile phone B can automatically exit the full-screen mode, and mobile phone A can also disconnect the wireless connection with mobile phone B.
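The distance-triggered behaviour of FIG. 13 and FIG. 14 amounts to comparing the measured distance with a preset threshold and acting on state transitions. A minimal sketch follows; the threshold value, function name, and action strings are assumptions for illustration:

```python
def update_full_screen_state(distance, in_full_screen, preset_distance=0.2):
    """Return (new_state, action) for the distance-based mode switch.
    distance: measured distance between phone A and phone B, in metres
    (a hypothetical unit; the embodiment only requires a preset distance)."""
    if distance <= preset_distance and not in_full_screen:
        # phones brought together -> send indication information, enter full screen
        return True, "enter_full_screen"
    if distance > preset_distance and in_full_screen:
        # phones separated -> stop sending the cropped canvas, exit full screen
        return False, "exit_full_screen"
    # no threshold crossing -> keep the current state
    return in_full_screen, None

print(update_full_screen_state(0.1, False))  # (True, 'enter_full_screen')
print(update_full_screen_state(0.5, True))   # (False, 'exit_full_screen')
```

Only threshold crossings trigger an action, so small distance fluctuations while spliced do not repeatedly re-enter the mode.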
  • FIG. 15 shows another set of GUIs in the embodiment of the present application.
  • control 1501 is displayed on the desktop of mobile phone A
  • control 1502 is displayed on the desktop of mobile phone B.
  • Mobile phone A displays the playback interface of the video application, and mobile phone A receives a notification message from App4 (the contact who sent the notification message is Li Hua, and the content of the message is "There is a meeting at 9:00 am"); at this time, mobile phone B displays the desktop of mobile phone B.
  • in response to detecting that the user drags the control 1502 to the right, mobile phone B can send indication information to mobile phone A, where the indication information is used to instruct mobile phone A and mobile phone B to perform the splicing display. Since mobile phone A is currently displaying the video playback interface and mobile phone A has received a notification message from App4, mobile phone A can determine that mobile phone A and mobile phone B enter the dual application mode. Mobile phone A displays the video playback interface, and mobile phone A sends the canvas corresponding to the chat interface with the contact Li Hua in App4 to mobile phone B, so that mobile phone B displays the chat interface with the contact Li Hua. In this way, the user can continue to watch the video on mobile phone A and can reply to the message through mobile phone B.
  • the mobile phone B can also display a parallel mode 1503 , a full-screen mode 1504 , a paging mode 1505 , and an exit control 1506 .
  • the display process after the mobile phone B detects that the user clicks on the parallel mode 1503 , the full screen mode 1504 , the paging mode 1505 , and the exit control 1506 may refer to the description in the above embodiment, and will not be repeated here.
  • mobile phone A can also determine that mobile phone A and mobile phone B enter the dual application mode. Mobile phone A displays the video playback interface, and mobile phone A sends the canvas corresponding to the chat interface of contact Li Hua in App4 to mobile phone B, so that mobile phone B displays the chat interface between the user and contact Li Hua.
  • mobile phone B when mobile phone B detects a user's click operation in the text input box 1507 , mobile phone B can send the click event and corresponding coordinate information to mobile phone A. In response to receiving the click event and the corresponding coordinate information, the mobile phone A can determine that the mobile phone B detects that the user has clicked the text input box. Therefore, mobile phone A can pull up the input method on the chat interface, and send the canvas corresponding to the chat interface after pulling up the input method to mobile phone B.
  • the mobile phone B in response to receiving the canvas sent by the mobile phone A, the mobile phone B can display the chat interface after pulling up the input method.
  • when mobile phone B detects the user's click operation on the screen, mobile phone B can send the click event and the corresponding coordinate information to mobile phone A, so that mobile phone A can determine that mobile phone B has detected that the user clicked a certain key of the input method (e.g., mobile phone A can determine that mobile phone B detected that the user clicked the key "o").
  • the mobile phone A can display the text content "o" in the text input box 1507 and, at the same time, send the canvas corresponding to the chat interface in which the text input box 1507 displays the text content "o" to mobile phone B.
  • After receiving the canvas, mobile phone B can display the chat interface, wherein the text input box of the chat interface includes the text content "o".
  • the user clicks the send control on mobile phone B.
  • the mobile phone B can send the click event and the corresponding coordinate information to mobile phone A, so that mobile phone A can determine that mobile phone B has detected that the user clicked a certain control through the input interface (for example, mobile phone A can determine that mobile phone B detected that the user clicked the send control).
  • the mobile phone A can reply the text content (for example, "ok") in the text input box to the user Li Hua.
  • the mobile phone A thus completes the actual reply to the message.
  • After mobile phone A completes the actual reply to the message, mobile phone A can also send the canvas corresponding to the chat interface after the reply to mobile phone B, so that mobile phone B displays the chat interface after the reply, wherein the chat interface includes the message content "There is a meeting at 9 am" sent by Li Hua and the user's reply "ok".
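The reverse-control flow above (phone B forwards click events with coordinates, phone A applies them to the chat interface and returns an updated canvas) can be sketched as follows. The class and method names are illustrative assumptions; coordinate-to-widget resolution is simplified into pre-resolved event types:

```python
class SourcePhone:
    """Minimal model of mobile phone A handling input events forwarded by
    mobile phone B. Event names are hypothetical."""

    def __init__(self):
        self.input_box = ""       # text shown in the text input box 1507
        self.sent_messages = []   # messages actually sent by phone A

    def on_remote_event(self, event, payload=None):
        # phone B sends the event type plus (simplified here) the key or
        # control that it resolved from the click coordinates
        if event == "key":        # user typed a key via the input method
            self.input_box += payload
        elif event == "send":     # user clicked the send control
            self.sent_messages.append(self.input_box)
            self.input_box = ""
        # phone A would now render the updated chat interface and send the
        # corresponding canvas back to phone B; we return its state instead
        return {"input_box": self.input_box, "messages": self.sent_messages}

phone_a = SourcePhone()
phone_a.on_remote_event("key", "o")
phone_a.on_remote_event("key", "k")
canvas_state = phone_a.on_remote_event("send")
print(canvas_state)  # {'input_box': '', 'messages': ['ok']}
```

The key point the sketch shows is that all application state lives on the source device; the sink device only forwards raw input and displays whatever canvas comes back.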
  • the mobile phone B when the mobile phone B detects that the user slides up from the bottom of the screen, the mobile phone B can send the touch event to the mobile phone A.
  • In response to receiving the touch event sent by mobile phone B, mobile phone A can determine that mobile phone B has detected the user's swipe-up operation from the bottom of the screen, and mobile phone A can determine that the user wants to exit the display interface of App4 on mobile phone B.
  • the mobile phone A can stop sending the image information corresponding to the chat interface between the user and the contact Li Hua in the App4 to the mobile phone B.
  • the mobile phone A can display the display interface of the video App and the mobile phone B can display the desktop of the mobile phone B.
  • when mobile phone A stops sending the image information corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B, mobile phone A can also determine that the currently displayed interface of the video app is suitable for the splicing display in the full-screen mode; mobile phone A can then expand the display interface of the video app to twice its size and divide it, so as to display one part of the divided image information and send the other part of the divided image information to mobile phone B.
  • In response to receiving the other part of the divided image information, mobile phone B can display it. Therefore, mobile phone A and mobile phone B can perform the splicing display in the full-screen mode.
  • FIG. 16 shows another set of GUIs in the embodiment of the present application.
  • mobile phone A and mobile phone B are currently in the dual application mode: mobile phone A displays the display interface of App3, and mobile phone A also sends the canvas corresponding to the display interface of the photo application to mobile phone B, so that mobile phone B displays the display interface of the photo application.
  • a message prompt box 1601 is displayed, wherein the message prompt box 1601 includes the contact who sent the notification message (for example, Li Hua) and the content of the message (e.g., "There is a meeting at 9 am").
  • a GUI as shown in (c) of FIG. 16 can be displayed.
  • in response to mobile phone A detecting that the user clicks the notification message prompt box 1601, mobile phone A can send the canvas corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B, so that mobile phone B displays the chat interface with the contact Li Hua.
  • mobile phone A in response to mobile phone A detecting that the user clicks on the notification message prompt box 1601, mobile phone A can also send the canvas corresponding to the home page of App4 to mobile phone B, so that mobile phone B displays the home page of App4.
  • mobile phone A may choose to replace the application that was opened earlier, according to the opening time sequence of App3 and the photo application.
  • if mobile phone A determines that the opening time of App3 is earlier than that of the photo application, then when mobile phone A detects the user's operation of clicking the message prompt box 1601, mobile phone A can replace the previously displayed display interface of App3 with the chat interface between the user and the contact Li Hua in App4, and mobile phone A can continue to send the display interface of the photo application to mobile phone B, so that mobile phone B continues to display the display interface of the photo application.
  • if mobile phone A determines that the opening time of the photo application is earlier than that of App3, then when mobile phone A detects the user's operation of clicking the message prompt box 1601, mobile phone A can continue to display the display interface of App3, and mobile phone A can send the image information corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B. In response to receiving the image information, mobile phone B can display the chat interface between the user and the contact Li Hua in App4.
  • mobile phone A may determine which application to replace according to the application the user is focused on. Exemplarily, if, within a preset time period before detecting that the user clicks the message prompt box 1601, mobile phone A received touch events sent by mobile phone B but did not detect any touch operation of the user on mobile phone A, then mobile phone A can determine that the application the user is focused on is the photo application displayed on mobile phone B. Thus, mobile phone A can replace the previously displayed display interface of App3 with the chat interface between the user and the contact Li Hua in App4, and mobile phone A can continue to send the display interface of the photo application to mobile phone B, so that mobile phone B continues to display the display interface of the photo application.
  • Conversely, if mobile phone A detected the user's touch operation on mobile phone A within the preset time period, mobile phone A can determine that the application the user is focused on is App3 displayed on mobile phone A. Therefore, mobile phone A can continue to display the display interface of App3, and mobile phone A can send the image information corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B. In response to receiving the image information, mobile phone B can display the chat interface between the user and the contact Li Hua in App4.
  • That is, mobile phone A can determine which application the user is focused on by detecting whether the user performs a touch operation on the display interface of App3 within the preset time period, or by receiving indication information from mobile phone B, where the indication information is used to indicate that mobile phone B has detected user input on the display interface of the photo application.
  • the mobile phone A can replace the display interface of the application the user is not focused on with the chat interface between the user and the contact Li Hua.
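The focus-based replacement policy above can be sketched as a simple rule over recent touch activity. This is an illustrative sketch; the time window, function name, and return labels are assumptions:

```python
def interface_to_replace(local_touch_age, remote_touch_age, window=5.0):
    """Decide which displayed interface the incoming chat interface should
    replace. Ages are seconds since the last touch on each device
    (hypothetical unit; the embodiment only requires a preset time period)."""
    if remote_touch_age <= window and local_touch_age > window:
        # user recently touched phone B only -> focused on the photo
        # application there, so replace the locally displayed App3 interface
        return "local"
    # otherwise treat the user as focused on App3 on phone A, so the chat
    # interface is sent to phone B and replaces the photo application there
    return "remote"

print(interface_to_replace(local_touch_age=10.0, remote_touch_age=2.0))  # local
print(interface_to_replace(local_touch_age=1.0, remote_touch_age=30.0))  # remote
```

The same decision could instead be driven by the iris-based focus detection mentioned below; only the source of the focus signal changes, not the replacement rule.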
  • mobile phone A may also turn on the camera and collect the user's iris information when detecting the user's input on the message prompt box 1601, thereby determining whether the electronic device the user is focused on is mobile phone A or mobile phone B.
  • mobile phone A determines that the electronic device focused by the user is mobile phone A
  • mobile phone A can continue to display the display interface of App3, and mobile phone A can send the image information corresponding to the chat interface between the user and the contact Li Hua to mobile phone B.
  • the mobile phone B can display the chat interface between the user and the contact Li Hua in App4.
  • mobile phone A determines that the electronic device focused by the user is mobile phone B
  • mobile phone A can replace the previously displayed display interface of App3 with the chat interface between the user and the contact Li Hua in App4, and mobile phone A can continue to send the display interface of the photo application to mobile phone B, so that mobile phone B continues to display the display interface of the photo application.
  • FIG. 17 shows another set of GUIs in the embodiment of the present application.
  • mobile phone A and mobile phone B are currently in full-screen mode, and mobile phone A and mobile phone B are displaying a video playback interface.
  • the display interface of the mobile phone B further includes a parallel mode, a dual application mode, a paging mode, and an exit control.
  • a message prompt box 1701 can be displayed on the video playback interface across mobile phone A and mobile phone B, wherein the message prompt box 1701 includes the contact who sent the notification message (for example, "Li Hua") and the content of the message (for example, "There is a meeting at 9 am").
  • mobile phone A can determine that mobile phone A and mobile phone B switch from splicing display in full screen mode to splicing display in dual application mode.
  • mobile phone A can display the video playback interface, and mobile phone A can send the canvas corresponding to the chat interface between the user and the contact Li Hua in App4 on mobile phone A to mobile phone B, so that mobile phone B displays the chat interface with the contact Li Hua.
  • when mobile phone B detects that the user clicks the message prompt box 1701, mobile phone B can send the click event and the corresponding coordinate information to mobile phone A. After mobile phone A determines, from the click event and the corresponding coordinate information, that the user clicked the message prompt box 1701 on mobile phone B, mobile phone A can determine to switch mobile phone A and mobile phone B from the splicing display in the full-screen mode to the splicing display in the dual application mode.
  • the display interface of the mobile phone B further includes a parallel mode, a full-screen mode, a paging mode, and an exit control.
  • FIG. 18 shows another set of GUIs in the embodiment of the present application.
  • mobile phone A and mobile phone B are currently in full-screen mode, and mobile phone A and mobile phone B are displaying a video playback interface.
  • a message prompt box 1801 can be displayed on the video playback interface across mobile phone A and mobile phone B, wherein the message prompt box 1801 includes the contact who sent the notification message (for example, "Li Hua") and the content of the message (for example, "There is a meeting at 9:00 am").
  • the GUI shown in (b) of FIG. 18 can be displayed.
  • the position of the message prompt box 1801 displayed on mobile phone A and mobile phone B may move (the position of the message prompt box 1801 moves to the lower left).
  • the mobile phone A and the mobile phone B can display the GUI as shown in (c) of FIG. 18 .
  • mobile phone A may determine to switch mobile phone A and mobile phone B from the splicing display in the full-screen mode to the splicing display in the dual application mode.
  • mobile phone A can determine, according to the direction in which the user drags the message prompt box 1801, to display the chat interface between the user and the contact Li Hua in App4 on mobile phone A, and to send the canvas corresponding to the display interface of the video application to mobile phone B.
  • Mobile phone A may display a chat interface between the user and contact Li Hua.
  • the mobile phone B can display the video playback interface.
  • mobile phone A can determine to send the canvas corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B, so that mobile phone B displays the chat interface between the user and the contact Li Hua in App4. Mobile phone A may also determine to display the video playback interface on mobile phone A.
  • FIG. 19 shows another set of GUIs in the embodiment of the present application.
  • mobile phone A and mobile phone B are currently in full-screen mode, and mobile phone A and mobile phone B are displaying a video playback interface.
  • a message prompt box 1901 can be displayed on the video playback interface across mobile phone A and mobile phone B, wherein the message prompt box 1901 includes the contact who sent the notification message (for example, "Li Hua") and the content of the message (for example, "There is a meeting at 9:00 am").
  • the GUI shown in (b) of FIG. 19 can be displayed.
  • in response to mobile phone A detecting that the user long-presses the message prompt box 1901 and drags it downward, the position of the message prompt box 1901 displayed on mobile phone A and mobile phone B may move (the position of the message prompt box 1901 moves straight down).
  • the mobile phone A and the mobile phone B can display the GUI as shown in (c) of FIG. 19 .
  • mobile phone A and mobile phone B can display the chat interface between the user and contact Li Hua in App4 in full screen mode.
  • when mobile phone A and mobile phone B display the first application program in the full-screen mode, if mobile phone A obtains a message of the second application program, mobile phone A can determine, according to the user's operation, whether to switch to the dual application mode. If mobile phone A determines to switch to the dual application mode, mobile phone A can also determine the display manner of the first application program and the second application program according to the user's operation.
  • when mobile phone A detects that the user drags the message prompt box 1801 to the lower left, mobile phone A can determine that the user wishes to display the chat interface with the contact Li Hua in App4 on the device on the left (i.e., mobile phone A). Then mobile phone A and mobile phone B can switch from the splicing display in the full-screen mode to the splicing display in the dual application mode. As shown in (c) of Figure 18, mobile phone A can display the chat interface between the user and the contact Li Hua in App4; mobile phone A can also send the image information corresponding to the video playback interface to mobile phone B, so that mobile phone B displays the video playback interface.
  • when mobile phone A detects that the user drags the message prompt box 1801 to the lower right, mobile phone A can display the video playback interface; mobile phone A can also send the image information corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B, so that mobile phone B displays the chat interface between the user and the contact Li Hua in App4.
  • when mobile phone A detects that the user drags the message prompt box 1901 straight down, mobile phone A can determine that the user wishes to display the chat interface between the user and the contact Li Hua in App4 in the full-screen mode. As shown in (c) of Figure 19, mobile phone A and mobile phone B can display the chat interface between the user and the contact Li Hua in App4 in the full-screen mode.
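The drag-direction handling of FIG. 18 and FIG. 19 maps three gestures to three layouts. A hypothetical sketch of that mapping (the direction labels and layout dictionary are illustrative, not the embodiment's data structures):

```python
def layout_for_drag(direction):
    """Map the drag direction of the message prompt box to the resulting
    display layout of the chat interface and the video playback interface."""
    if direction == "lower_left":
        # FIG. 18 (c): chat interface shown on the left device
        return {"phone_a": "chat", "phone_b": "video"}
    if direction == "lower_right":
        # chat interface shown on the right device
        return {"phone_a": "video", "phone_b": "chat"}
    if direction == "straight_down":
        # FIG. 19 (c): chat interface spliced across both devices
        return {"phone_a": "chat", "phone_b": "chat"}
    raise ValueError(f"unrecognized drag direction: {direction}")

print(layout_for_drag("lower_left"))  # {'phone_a': 'chat', 'phone_b': 'video'}
```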
  • FIG. 20 shows another set of GUIs in the embodiment of the present application.
  • the mobile phone A and the mobile phone B can display the home page of the social application and the chat interface between the user and the contact father in the social application in a parallel mode, respectively.
  • when mobile phone A receives a notification message from App7 (for example, "The price of commodity 1 you are concerned about has been reduced!") and mobile phone A detects that the user clicks the message prompt box 2001, mobile phone A and mobile phone B can display the GUI shown in (b) of Figure 20.
  • in response to detecting that the user clicks the message prompt box 2001, mobile phone A can determine to switch mobile phone A and mobile phone B from the splicing display in the parallel mode to the splicing display in the dual application mode.
  • Mobile phone A can determine to display the home page of the social application, and send the canvas corresponding to the display interface of the product 1 in App7 to mobile phone B.
  • the mobile phone B In response to receiving the canvas, the mobile phone B can display the display interface of the commodity 1 .
  • in response to detecting that the user clicks the message prompt box 2001, mobile phone A can also display the display interface of the commodity 1 in App7, and mobile phone A continues to send the image information corresponding to the chat interface between the user and the contact father in the social application to mobile phone B, so that mobile phone B continues to display the chat interface between the user and the contact father in the social application.
  • in response to detecting that the user clicks the message prompt box 2001, mobile phone A can display the chat interface between the user and the contact father in the social application, and mobile phone A can send the image information corresponding to the display interface of the commodity 1 in App7 to mobile phone B, so that mobile phone B displays the display interface of the commodity 1.
  • when mobile phone A and mobile phone B are in the parallel mode, if mobile phone A receives a message and mobile phone A detects the user's input on the message prompt box, mobile phone A can determine to switch mobile phone A and mobile phone B from the splicing display in the parallel mode to the splicing display in the dual application mode, so that the user can view two applications on mobile phone A and mobile phone B without exiting the current application, which helps to improve the user experience.
  • FIG. 21 shows another set of GUIs in the embodiment of the present application.
  • mobile phone A displays the display interface of content 1 in App4, and the display interface of content 1 includes a link 2101, wherein the link 2101 is associated with a pop-up window, and the pop-up window is used to prompt the user to view content 1 in App5.
  • Mobile phone B displays the desktop of mobile phone B. When the mobile phone A detects that the user clicks on the link 2101, the mobile phone A can display the GUI as shown in (b) of FIG. 21 .
  • in response to the user's operation of clicking the link 2101, mobile phone A can display a pop-up window 2102, wherein the pop-up window 2102 includes the prompt information "whether to open App5 to view the content 1", a cancel control 2103, and a confirm control 2104.
  • When mobile phone A detects the user's operation of clicking the control 2104, mobile phone A and mobile phone B display the GUI shown in (c) of FIG. 21.
  • in response to detecting that the user clicks the control 2104, mobile phone A can continue to display the display interface of content 1 in App4, and mobile phone A can send the image information corresponding to the display interface of content 1 in App5 to mobile phone B. In response to receiving the image information, mobile phone B can display the display interface of content 1 in App5.
  • FIG. 22 shows another set of GUIs in the embodiment of the present application.
  • the mobile phone A displays the display interface of App6, and the mobile phone B displays the desktop of the mobile phone B.
  • the mobile phone A detects the operation of the user sliding from the right side to the left side of the screen, the mobile phone A can display the GUI as shown in (b) of FIG. 22 .
  • mobile phone A may display a list 2201 of applications, wherein the list 2201 of applications may include App1, App2, App3, and App4.
  • App1, App2, App3, and App4 may be several application programs launched by mobile phone A before App6 was launched, or App1, App2, App3, and App4 may also be the several application programs most frequently used by the user within a preset time period.
  • When mobile phone A detects that the user clicks the icon of App3, mobile phone A can continue to display the display interface of App6, and mobile phone A can send the image information corresponding to the display interface of App3 to mobile phone B.
  • the mobile phone A when mobile phone A detects that the user drags the icon of App3 from the list 2201 to the outside of the list 2201, the mobile phone A can continue to display the display interface of App6 and The mobile phone A may send the image information corresponding to the display interface of the App3 to the mobile phone B.
  • the mobile phone B in response to receiving the image information, can display the display interface of the App3.
  • In response to detecting that the user clicks the icon of App3, or detecting that the user drags the icon of App3 from the list 2201 to the outside of the list 2201, mobile phone A can also display the display interface of App3 and send the image information corresponding to the display interface of App6 to mobile phone B, so that mobile phone B displays the display interface of App6.
  • When mobile phone A detects that the user drags the icon of App3 from left to right, mobile phone A can continue to display the display interface of App6 and send the image information corresponding to the display interface of App3 to mobile phone B, so that mobile phone B displays the display interface of App3; when mobile phone A detects that the user drags the icon of App3 from right to left, mobile phone A can display the display interface of App3 and send the image information corresponding to the display interface of App6 to mobile phone B, so that mobile phone B displays the display interface of App6.
  • When mobile phone A detects that the user clicks the icon of App3, mobile phone A can display a floating window while displaying the display interface of App6, and the display interface of App3 can be displayed in the floating window.
  • When mobile phone B detects that the user slides from the left side of the screen to the right, mobile phone B can send indication information to mobile phone A, where the indication information is used to instruct mobile phone A and mobile phone B to perform spliced display.
  • In response to receiving the indication information, mobile phone A can display the display interface of App6 and send the image information corresponding to the display interface of App3 to mobile phone B, so that mobile phone B displays the display interface of App3.
  • When mobile phone A detects the operation of the user dragging App3 to the lower left, mobile phone A can display the display interface of App6 and the display interface of App3 in an up-down split screen.
  • When mobile phone B detects that the user slides from the left side of the screen to the right, mobile phone B can send indication information to mobile phone A, where the indication information is used to instruct mobile phone A and mobile phone B to perform spliced display.
  • In response to receiving the indication information, mobile phone A can display the display interface of App6 and send the image information corresponding to the display interface of App3 to mobile phone B, so that mobile phone B displays the display interface of App3.
  • When mobile phone A detects the operation of the user displaying the display interface of App3 through a floating window while displaying the display interface of App6, or the operation of displaying App6 and App3 on a split screen (for example, an up-down split screen), mobile phone A can determine that mobile phone A and mobile phone B display the interfaces of App6 and App3 respectively through the dual-application mode. This avoids the previously displayed application being occluded when displaying through a floating window, and also avoids each application showing only part of its display interface in a split-screen display; displaying different applications through multiple devices helps to improve the user experience.
  • FIG. 23 shows a schematic structural diagram of a source-end device and a sink-end device implementing a full-screen mode according to an embodiment of the present application.
  • The source device may be the mobile phone A shown in FIG. 3 , and the sink device may be the mobile phone B shown in FIG. 3 ; or, the source device may be the mobile phone B shown in FIG. 10 , and the sink device may be the mobile phone A shown in FIG. 10 .
  • the source end device includes an application (App) layer, an application framework (Framework) layer, a local (Native) layer and a service (Server) layer.
  • the App layer may include multiple application programs;
  • The Framework layer includes a layer management subsystem and an input subsystem; the layer management subsystem is used to manage the layer information corresponding to application interfaces, and the input subsystem is used to process user input events or reverse input events.
  • the Native layer includes a layer composition module and a layer cropping module.
  • The layer composition module is used to synthesize images according to the layer information, and the layer cropping module is used to crop the canvas according to the layer information.
  • the server layer includes an audio and video stream capture module, a virtual screen management module and a network module.
  • The audio and video stream capture module is used to capture audio streams or video streams.
  • the virtual screen management module is used to manage the creation and release of virtual screens (displays).
  • the network module is used to transmit the audio stream or video stream to the sink device and receive the reverse input event sent by the sink device.
  • The functions implemented by the modules in the Native layer and the Server layer of the source device in FIG. 23 may depend on the modules in the system library and the kernel layer in FIG. 2 , or on driver hardware.
  • The layer composition module and the layer cropping module in the Native layer may correspond to the media library, the three-dimensional graphics processing library, the image processing library, etc. in the system library of FIG. 2 ; the audio and video stream capture module and the virtual screen management module of the Server layer may depend on the display driver and the audio driver in the kernel layer of FIG. 2 , and on the three-dimensional graphics processing library and the image processing library in the system library.
  • the sink device includes an application (App) layer, an application framework (Framework) layer, a local (Native) layer and a service (Server) layer.
  • the Framework layer includes a sound system, an input system and a display system.
  • the sound system is used to play sound after audio decoding
  • the display system is used to display the interface after video decoding
  • the input system is used to receive user touch operations.
  • the Native layer includes a video rendering module, an audio and video decoding module, and a reverse input event capture module.
  • The audio and video decoding module is used to decode the audio stream or video stream received from the source device, the video rendering module is used to render and display the decoded video stream, and the reverse input event capture module is used to capture user reverse input events.
  • The Server layer includes a network module, which is used to receive the audio stream or video stream sent by the source device and to send reverse input events to the source device.
  • the functions implemented by the modules in the local layer of the sink device in FIG. 23 may depend on the modules in the system library and the kernel layer in FIG. 2 or drive hardware.
  • the video rendering module may depend on the three-dimensional graphics processing library, the image processing library and the display driver in the kernel layer in the system library of FIG. 2 .
  • the audio and video decoding module may depend on the three-dimensional graphics processing library, the image processing library, etc. in the system library of FIG. 2 .
  • The reverse input event capture module may depend on the touch panel (TP) driver in the hardware layer in FIG. 2 , and the like.
  • FIG. 24 shows a schematic flowchart of a method 2400 for displaying the source-end device and the sink-end device in a full-screen mode.
  • the method 2400 includes:
  • S2401: The source device establishes a wireless connection with the sink device 1.
  • For example, when mobile phone A detects that the distance between mobile phone A and mobile phone B is less than or equal to a preset distance, mobile phone A and mobile phone B can form a network through a near-field wireless connection, so that mobile phone A and mobile phone B establish a wireless connection.
  • mobile phone A and mobile phone B can perform AP networking, or mobile phone A and mobile phone B can perform P2P networking.
  • S2402: The sink device 1 instructs the source device to enter the full-screen mode.
  • For example, mobile phone A is the source device and mobile phone B is the sink device 1.
  • mobile phone B may send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full screen mode.
  • the mobile phone A is the source end device
  • the mobile phone B is the sink end device 1 .
  • When mobile phone B detects that the control 1102 dragged by the user partially or completely overlaps with the icon of the full-screen mode, mobile phone B can send indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the full-screen mode.
  • the mobile phone B is the source end device, and the mobile phone A is the sink end device 1 .
  • mobile phone A may send indication information to mobile phone B, where the indication information is used to indicate that mobile phone A wishes to enter the full screen mode.
  • the mobile phone B is the source end device, and the mobile phone A is the sink end device 1 .
  • When mobile phone A detects that the control 1201 dragged by the user partially or completely overlaps with the icon of the full-screen mode, mobile phone A can send indication information to mobile phone B, where the indication information is used to indicate that mobile phone A wishes to enter the full-screen mode.
  • When the sink device 1 detects that the user slides to the right, or detects that a control dragged to the right partially or completely overlaps with the icon of a certain mode, the sink device can determine that the device on its left is the source device; when the sink device 1 detects that the user slides to the left, or detects that a control dragged to the left partially or completely overlaps with the icon of a certain mode, the sink device can determine that the device on its right is the source device.
  • The source device may determine the distance and relative orientation between the source device and the sink device 1 through positioning technologies such as Bluetooth, ultra-wideband (UWB), and ultrasound.
  • The source device has a Bluetooth/Wi-Fi antenna array (or the source device has angle of arrival (AOA) computing capability), and the sink device 1 has a Bluetooth/Wi-Fi antenna array (or the sink device 1 has AOA computing capability).
  • Taking the source device calculating the orientation of sink device 1 as an example, the Bluetooth/Wi-Fi antenna array of the source device can receive the wireless signal of sink device 1 and calculate the orientation of sink device 1 according to formulas (1) and (2), where d is the distance between the Bluetooth/Wi-Fi antenna array of the source device and the Bluetooth/Wi-Fi antenna of sink device 1, λ is the wavelength of the Bluetooth signal sent by sink device 1, and θ is the angle of arrival.
  • That the source device calculates the orientation of sink device 1 can also be understood as the source device calculating the bearing of the line connecting the Bluetooth/Wi-Fi antenna array of the source device and the Bluetooth/Wi-Fi antenna of sink device 1.
  • the sink device 1 can also use the above formulas (1) and (2) to calculate the orientation of the source device.
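The patent's formulas (1) and (2) are not reproduced in this text. As an illustration only, a textbook two-antenna phase-difference AOA relation (not necessarily the patent's exact formulas) can be sketched as follows; the antenna spacing, wavelength, and measured phase difference are this sketch's assumed inputs:

```python
import math

def angle_of_arrival(phase_diff_rad: float, spacing_m: float, wavelength_m: float) -> float:
    """Estimate the angle of arrival (radians) from the phase difference
    measured between two antenna elements separated by spacing_m.

    Textbook relation: delta_phi = 2*pi*spacing*cos(theta)/wavelength,
    hence theta = arccos(wavelength*delta_phi / (2*pi*spacing)).
    """
    cos_theta = wavelength_m * phase_diff_rad / (2 * math.pi * spacing_m)
    # Clamp to the valid domain of arccos to absorb measurement noise.
    cos_theta = max(-1.0, min(1.0, cos_theta))
    return math.acos(cos_theta)
```

For a 2.4 GHz Bluetooth signal (wavelength about 0.125 m) and half-wavelength element spacing, a zero phase difference corresponds to a signal arriving broadside (90 degrees).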
  • When the sink device 1 detects the first operation of the user, the sink device 1 can send a user datagram protocol (UDP) data packet to the source device, and the UDP data packet can carry indication information, where the indication information is used to indicate that the sink device 1 wishes to enter the full-screen mode.
  • The UDP packet includes the data portion of an IP datagram.
  • the data portion of an IP datagram may include extensible bits.
  • The sink device 1 and the source device can agree on the content of a certain extensible bit. When that extensible bit is 1, the source device can know that the sink device 1 wishes to enter the full-screen mode.
  • When the sink device 1 detects the first operation of the user, the sink device 1 can send a transmission control protocol (TCP) message to the source device, and the TCP message can carry the indication information, where the indication information is used to indicate that the sink device 1 wishes to enter the full-screen mode.
  • a TCP message includes a TCP header and a TCP data part, wherein the TCP header includes a reserved field.
  • the sink device 1 and the source device can agree on the content of a reserved field. When the bit of a reserved field is 1, the source device can know that the sink device 1 wants to enter the full screen mode.
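The agreed-bit scheme described above can be sketched as follows. The flag value and payload layout here are hypothetical; the embodiment only requires that both ends agree on which extensible or reserved bit carries the indication:

```python
import struct

# Hypothetical agreement: bit 0 of the first payload byte means
# "the sender wishes to enter the full-screen mode".
ENTER_FULL_SCREEN = 0x01

def build_indication_payload(flags: int) -> bytes:
    # Pack the agreed "extensible bits" into one byte (network byte order).
    return struct.pack("!B", flags & 0xFF)

def wants_full_screen(payload: bytes) -> bool:
    # Unpack the first payload byte and test the agreed bit.
    (flags,) = struct.unpack("!B", payload[:1])
    return bool(flags & ENTER_FULL_SCREEN)

# Sending would then be an ordinary UDP send of this payload, e.g. via
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(...).
```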
  • S2403: In response to receiving the indication of sink device 1, the source device doubles the size of the canvas of the current interface.
  • the size of the canvas originally displayed by the source device is 1080x2340.
  • After the source device receives the indication information of sink device 1, the source device can determine that sink device 1 wishes to form a dual-screen splice with the source device, and the source device expands the display canvas size to (1080x2)x2340, that is, 2160x2340.
  • In some embodiments, after the source device receives the indication information of the sink device 1, if the source device determines that the source device and sink device 1 are distributed horizontally (or side by side), the source device can expand the display canvas size to 2160x2340; if the source device determines that the source device and sink device 1 are distributed vertically (or one above the other), the source device can expand the display canvas size to 1080x(2340x2), that is, 1080x4680. The following description takes the horizontal distribution of the source device and the sink device 1 (and the sink device 2) as an example.
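The canvas-size arithmetic above (2160x2340 for a horizontal two-screen splice, 1080x4680 for a vertical one) generalizes directly; a minimal sketch:

```python
def expanded_canvas(width: int, height: int, n_screens: int, horizontal: bool = True) -> tuple:
    """Size of the splicing canvas for n_screens devices of equal resolution."""
    if horizontal:
        return (width * n_screens, height)   # side-by-side layout
    return (width, height * n_screens)       # stacked layout
```

With a 1080x2340 source screen, this yields 2160x2340 for two screens side by side, 1080x4680 stacked, and 3240x2340 for the later three-screen case.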
  • S2404: The source device crops the doubled canvas; one cropped part is displayed on the source device, and the other part is placed in the virtual screen (display1) created by the source device.
  • the manner in which the source device cuts the enlarged canvas may include, but is not limited to, a segmentation manner and a masking manner.
  • FIG. 25 shows the process in which the source device crops by means of segmentation and displays the cropped canvas on the virtual screen.
  • The layer information corresponding to each display screen is managed by the Android component SurfaceFlinger. Since no application is started in the virtual display, the virtual display has no layer information.
  • When SurfaceFlinger composites the screen, it composites each display in turn. For example, when SurfaceFlinger synthesizes display0, the layer is clipped, and clipping area 1 is [0, 0, 1080, 2340].
  • The source device can put the canvas shown in clipping area 1 into display0, so that the source device displays display0; when SurfaceFlinger synthesizes display1, it copies all layer information from display0 and then clips, where clipping area 2 is [1080, 0, 2160, 2340], and puts clipping area 2 into display1 after translating it to the left by 1080.
  • the source device can cast display1 to the sink device 1.
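The segmentation-mode clipping regions described for FIG. 25 can be expressed as a small helper. The rectangle format [left, top, right, bottom] follows the text:

```python
def segmentation_crops(width: int = 1080, height: int = 2340):
    """Crop rectangles [left, top, right, bottom] for a two-screen splice."""
    clip1 = [0, 0, width, height]            # stays in display0 (source screen)
    clip2 = [width, 0, 2 * width, height]    # copied layers, clipped for display1
    # display1 content is translated left by `width` so it starts at x = 0
    translated = [clip2[0] - width, clip2[1], clip2[2] - width, clip2[3]]
    return clip1, clip2, translated
```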
  • FIG. 26 shows the process in which the source device crops by means of masking and displays the cropped canvas on the virtual screen.
  • the source end device first determines the area to be cropped (for example, the area 402 shown in (b) in FIG. 4 ) according to the distance between the source end device and the sink end device 1 . It should be understood that, for the process of determining the pixel value of the region to be cropped by the source device, reference may be made to the description in the foregoing embodiment, which is not repeated here for brevity. Assume that the pixel value to be cropped determined by the source is x.
  • When SurfaceFlinger synthesizes display0, the layer is clipped, and clipping area 3 is [0, 0, 1080, 2340].
  • The source device can put the canvas shown in clipping area 3 into display0, so that the source device displays display0; when SurfaceFlinger synthesizes display1, it copies all layer information from display0 and then crops, where cropping area 4 is [1080+x, 0, 2160, 2340], and puts the cropped area into display1 after translating it to the left by 1080+x.
  • the source device can cast display1 to the sink device 1.
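A corresponding sketch for the masking mode of FIG. 26, where a strip of x pixels next to the bezel is dropped before the canvas is placed in display1:

```python
def masking_crops(width: int = 1080, height: int = 2340, gap_px: int = 0):
    """Crop rectangles when a strip of gap_px pixels is masked out for the bezel."""
    clip3 = [0, 0, width, height]                     # display0 (source screen)
    clip4 = [width + gap_px, 0, 2 * width, height]    # display1, bezel strip dropped
    shift = width + gap_px                            # translate left by 1080 + x
    translated = [clip4[0] - shift, 0, clip4[2] - shift, height]
    return clip3, clip4, translated
```

Note that the translated region is only `width - gap_px` pixels wide, which is why the following paragraphs discuss black borders on the sink screen.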
  • S2405: The source device projects display1 to the sink device 1.
  • Assume the screen resolution of the sink device 1 is 1080x2340.
  • The canvas can be displayed in the center, leaving a black border of size x/2 on each side of the screen.
  • the canvas can also be displayed on the left side of the screen and a black border of size x is left on the right side of the screen.
  • the canvas can also be displayed on the right side of the screen and a black border of size x is left on the left side of the screen.
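The three black-border placements just listed amount to choosing an x-offset for the narrower canvas; a minimal sketch (the alignment names are this sketch's own, not terms from the embodiment):

```python
def letterbox_x_offset(screen_w: int, content_w: int, align: str = "center") -> int:
    """x position at which the narrower cropped canvas is drawn on the sink screen."""
    border = screen_w - content_w           # total black-border width (x in the text)
    if align == "center":
        return border // 2                  # border of x/2 on each side
    if align == "left":
        return 0                            # border of x on the right
    if align == "right":
        return border                       # border of x on the left
    raise ValueError(f"unknown alignment: {align}")
```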
  • The sink device 1 may also send information about its screen resolution to the source device when instructing the source device to enter the full-screen mode.
  • After the source device knows the screen resolution of sink device 1 and the distance between the source device and sink device 1, it can first expand the canvas size to 2160x2340.
  • The source device can then stretch the expanded canvas horizontally to (2160+x)x2340 according to the pixel value (for example, x) that needs to be cropped.
  • crop region 3 can be [0, 0, 1080, 2340]
  • crop region 4 has dimensions [1080+x, 0, 2160+x, 2340].
  • the source device can translate the cropped area 4 to the left by 1080+x and put it into display1.
  • the source device can cast display1 to the sink device 1.
  • In this way, the sink device 1 does not need to leave black borders on the screen when displaying.
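The stretch-then-crop variant can be sketched as follows; note that the sink-side crop is exactly one screen wide, which is why no black borders remain:

```python
def stretched_crops(width: int = 1080, height: int = 2340, gap_px: int = 0):
    """Stretch the doubled canvas by gap_px, then crop, so the sink keeps full width."""
    stretched_w = 2 * width + gap_px                   # (2160 + x) x 2340
    clip3 = [0, 0, width, height]                      # display0 (source screen)
    clip4 = [width + gap_px, 0, stretched_w, height]   # display1, after the bezel strip
    sink_w = clip4[2] - clip4[0]                       # width delivered to the sink
    return stretched_w, clip3, clip4, sink_w
```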
  • The above description uses the sink device 1 to trigger the source device and the sink device to enter the full-screen mode; the source device and the sink device 1 may also be triggered to enter the full-screen mode through the source device.
  • When the source device detects that the user slides from the left side of the screen to the right, the source device can send a part of the image information and indication information to the sink device 1, where the indication information is used to indicate that the source device and sink device 1 enter the full-screen mode.
  • In response to receiving the part of the image information and the indication information, the sink device 1 displays the image information.
  • S2406: The source device establishes a wireless connection with the sink device 2.
  • For example, when mobile phone C is close to mobile phone B, mobile phone B can determine by RSSI ranging that the distance between mobile phone C and mobile phone B is less than or equal to the preset distance; mobile phone A, mobile phone B, and mobile phone C can then form a network through a near-field wireless connection, so that mobile phone A, mobile phone B, and mobile phone C establish a wireless connection.
  • mobile phone A, mobile phone B, and mobile phone C can perform AP networking, or mobile phone A, mobile phone B, and mobile phone C can perform P2P networking.
  • S2407: The sink device 2 instructs the source device to enter the full-screen mode.
  • For example, mobile phone A is the source device, mobile phone B is the sink device 1, and mobile phone C is the sink device 2.
  • the mobile phone C can send indication information to the mobile phone A, where the indication information is used to indicate that the mobile phone C wishes to enter the full screen mode.
  • the process of sending the indication information (the indication information is used to indicate that the sink device 2 wishes to enter the full-screen mode) to the source end device by the sink end device 2 may refer to the description of the above S2402, which will not be repeated here.
  • When the sink device 2 detects that the user slides to the right, or detects that a control dragged to the right partially or completely overlaps with the icon of a certain mode, the sink device 2 can determine that the device on its left is the source device. If the sink device 2 detects multiple devices on its left, the sink device 2 can use the leftmost device as the source device.
  • Similarly, when the sink device 2 detects that the user slides to the left, or detects that a control dragged to the left partially or completely overlaps with the icon of a certain mode, the sink device 2 can determine that the device on its right is the source device. If the sink device 2 detects multiple devices on its right, the sink device 2 can use the rightmost device as the source device.
  • the sink device 2 may also send distance information between the sink device 1 and the sink device 2 to the source device.
  • the sink device 2 may send a UDP data packet to the source device, and the UDP data packet may carry distance information between the sink device 1 and the sink device 2 .
  • The UDP packet includes the data portion of an IP datagram.
  • the data portion of an IP datagram may include extensible bits.
  • The sink device 2 and the source device can agree on the content of certain extensible bits. For example, when the extensible bits are 000, the source device can know that the distance between sink device 1 and sink device 2 is 1 cm; when the extensible bits are 001, the source device can know that the distance between sink device 1 and sink device 2 is 2 cm; when the extensible bits are 011, the source device can know that the distance between sink device 1 and sink device 2 is 3 cm.
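The example bit values above (000 for 1 cm, 001 for 2 cm, 011 for 3 cm) can be captured in a lookup table; the table is just the example's mapping, not a normative encoding:

```python
# Mapping taken from the example above; real devices would agree on
# their own table of extensible-bit values.
_CODE_TO_CM = {0b000: 1, 0b001: 2, 0b011: 3}
_CM_TO_CODE = {cm: code for code, cm in _CODE_TO_CM.items()}

def encode_distance_cm(cm: int) -> int:
    """Distance between sink 1 and sink 2 -> bit field carried in the UDP payload."""
    return _CM_TO_CODE[cm]

def decode_distance_cm(code: int) -> int:
    """Bit field received by the source -> distance in centimetres."""
    return _CODE_TO_CM[code]
```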
  • S2408: The source device expands the canvas of the current interface to three screen widths.
  • the size of the canvas originally displayed by the source device is 1080x2340.
  • After the source device receives the indication information of sink device 2, the source device can determine that sink device 1 and sink device 2 wish to form a three-screen splice with the source device, and the source device expands the display canvas size to (1080x3)x2340, that is, 3240x2340.
  • S2409: The source device crops the enlarged canvas; the first cropped part is displayed on the source device, the second part is placed in the virtual screen (display1) created by the source device, and the third part is placed in another virtual screen (display2) created by the source device.
  • The source device first determines the two regions to be cropped (for example, the region 405 and the region 407 shown in (d) in FIG. 4 ) according to the distance between the source device and sink device 1 and the distance between sink device 1 and sink device 2. It should be understood that, for the process of determining the pixel values of the regions to be cropped by the source device, reference may be made to the description in the foregoing embodiment, which is not repeated here for brevity. Assume that the pixel values to be cropped determined by the source device are x and y.
  • FIG. 28 shows only one method for cropping the enlarged canvas in the masking manner, and the embodiment of the present application is not limited thereto.
  • the cropping region 5 may also be [1080+x, 0, 2160, 2340].
  • the source device can translate the cropped area 5 to the left by 1080+x and put it into display1.
  • the source device can cast the canvas in display1 to the sink device 1.
  • the crop region 6 can also be [2160+y, 0, 3240, 2340].
  • the source device can move the cropped area 6 to the left by 2160+y and put it into display2.
  • the source device can project the canvas in display2 to the sink device 2.
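The cropping regions 5 and 6 and their translations can be computed generically; `gap1_px` and `gap2_px` stand for the pixel values x and y determined from the inter-device distances:

```python
def three_screen_crops(width: int = 1080, height: int = 2340,
                       gap1_px: int = 0, gap2_px: int = 0):
    """Crop regions for a source plus two sinks, masking bezel gaps x and y."""
    src = [0, 0, width, height]                          # stays on the source device
    clip5 = [width + gap1_px, 0, 2 * width, height]      # -> display1 (sink device 1)
    clip6 = [2 * width + gap2_px, 0, 3 * width, height]  # -> display2 (sink device 2)
    # After translating left by (width + gap1) and (2*width + gap2) respectively,
    # each region starts at x = 0 in its virtual screen.
    t5 = [0, 0, clip5[2] - clip5[0], height]
    t6 = [0, 0, clip6[2] - clip6[0], height]
    return src, clip5, clip6, t5, t6
```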
  • S2410: The source device projects display1 to sink device 1, and projects display2 to sink device 2.
  • the process of projecting the screen of display1 to the sink device 1 by the source device and the process of projecting the display2 to the sink device 2 may refer to the existing screen projection technology, which is not repeated here for brevity.
  • S2411: The sink device 2 detects an operation of the user exiting the sink device 2 from the full-screen mode.
  • the operation may be that the mobile phone C detects that the user clicks the exit control.
  • S2412: In response to detecting the operation, the sink device 2 sends indication information to the source device, where the indication information is used to indicate that the sink device 2 exits the full-screen mode.
  • the exit control may be drawn by the source device and added to the enlarged canvas.
  • When the sink device 2 detects the operation, it can obtain the user's touch event and the corresponding coordinate point. The sink device 2 can send the touch event and the converted coordinate point to the source device. After receiving the touch event and the converted coordinate point, the source device can determine that the user has clicked the exit control on the sink device 2, thereby performing S2413.
  • For example, the cropping area 6 determined by the source device is [2160+x+y, 0, 3240, 2340]. If the sink device 2 detects that the user has clicked (100, 100) on the screen, the sink device determines that the event is a touch event, and the coordinate point is converted to (100+2160+x+y, 100), that is, (2260+x+y, 100).
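The coordinate conversion in this example is a single translation by the crop region's left edge; a minimal sketch (the gap values x and y below are hypothetical):

```python
def to_canvas_coords(local_x: int, local_y: int, crop_left: int) -> tuple:
    """Map a tap on a sink screen back onto the source's enlarged canvas.

    crop_left is the left edge of the crop region that was projected to
    this sink (e.g. 2160 + x + y for display2 in the example above).
    """
    return (local_x + crop_left, local_y)
```

For a tap at (100, 100) on sink device 2 with x = 40 and y = 30, the converted canvas coordinate is (100 + 2160 + 40 + 30, 100) = (2330, 100).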
  • When the sink device 2 detects the operation, it can send a UDP data packet to the source device; the UDP data packet can carry the information of the converted coordinate point and indication information, where the indication information is used to indicate that the sink device 2 exits the full-screen mode.
  • The UDP packet includes the data portion of an IP datagram.
  • the data portion of an IP datagram may include extensible bits.
  • the sink-end device 2 and the source-end device can agree on the content of a certain extensible bit. For example, when a certain extensible bit is 1, it indicates that the touch event is a click event.
  • the sink device 2 may use an encoding manner such as GBK, ISO8859-1, or Unicode (for example, UTF-8, UTF-16) to encode the converted coordinate point information.
  • the source-end device can learn the touch event detected by the sink-end device 2 and the information of the converted coordinate points through the information on the extensible bits. Therefore, the source-end device can determine that the sink-end device 2 detects the user's operation of exiting the sink-end device 2 from the full-screen mode.
  • the exit control can also be drawn by the sink device 2.
  • When the sink device 2 detects that the user clicks the exit control, the sink device 2 can send indication information to the source device, where the indication information is used to indicate that the sink device 2 exits the full-screen mode.
  • the sink device 2 when it detects the operation, it may send a UDP data packet to the source device, and the UDP data packet may carry the indication information, and the indication information is used to instruct the sink device 2 to exit the full-screen mode.
  • The UDP packet includes the data portion of an IP datagram.
  • the data portion of an IP datagram may include extensible bits.
  • the sink-end device 2 and the source-end device can agree on the content of a certain extensible bit. When a certain extensible bit is 1, the source device can know that the sink device 2 exits the full screen mode.
  • S2413: In response to receiving the indication information, the source device adjusts the canvas size and re-splices with the sink device 1.
  • the source device can perform the operations of S2403-S2405, which are not repeated here for brevity.
  • S2414: The sink device 1 detects an operation of the user exiting the sink device 1 from the full-screen mode.
  • the operation may be that the mobile phone B detects that the user clicks the exit control.
  • S2415: In response to detecting the operation, the sink device 1 sends indication information to the source device, where the indication information is used to indicate that the sink device 1 exits the full-screen mode.
  • the exit control may be drawn by the source device and added on the enlarged canvas.
  • When the sink device 1 detects the operation, it can obtain the user's touch event and the corresponding coordinate point. The sink device 1 can send the touch event and the converted coordinate point to the source device. After receiving the touch event and the converted coordinate point, the source device can determine that the user has clicked the exit control on the sink device 1, thereby performing S2416.
  • For example, the cropping area 5 determined by the source device is [1080+x, 0, 2160+x, 2340]. If the sink device 1 detects that the user has clicked (100, 100) on the screen, the sink device determines that the event is a touch event, and the coordinate point is converted to (100+1080+x, 100), that is, (1180+x, 100).
  • S2416: In response to receiving the indication information, the source device no longer expands the canvas and restores single-device display on the source device.
  • multiple devices can be connected wirelessly through a wireless network, and the screens of the multiple devices can be spliced through a preset gesture, thereby realizing a full-screen mode on the multiple devices, which helps to improve the user experience.
  • FIG. 29 shows a schematic flowchart of a method 2900 for displaying source-end devices and sink-end devices in a paging mode.
  • the method 2900 includes:
  • S2901: The source device and the sink device establish a wireless connection.
  • S2902: The sink device instructs the source device to enter the paging mode.
  • the icon of the paging mode displayed on the mobile phone B may be drawn by the mobile phone A.
  • Mobile phone A may add an icon of the parallel mode, an icon of the dual-application mode, an icon of the paging mode, and an exit control on the cropped canvas.
  • When mobile phone B detects the operation of clicking the paging mode, it can obtain the user's touch event and the corresponding coordinate point. Mobile phone B can send the touch event and the converted coordinate point to mobile phone A. Mobile phone A can switch from the full-screen mode to the paging mode according to the touch event and the converted coordinate point.
• For example, mobile phone B detects that the user has clicked the screen at (100, 100); mobile phone B then determines that the event is a touch event and converts the coordinate point to (100+1080, 100), that is, (1180, 100). Mobile phone B can send the touch event and the converted coordinate point to mobile phone A. After receiving the touch event and the converted coordinate point, mobile phone A can determine that the user has clicked the paging mode on mobile phone B, so as to perform S2903.
• the mobile phone B may also send the information of the coordinate point (for example, (100, 100)) at which it detects the user's click operation on the screen of the mobile phone B to the mobile phone A.
• after the mobile phone A receives the coordinate information sent by the mobile phone B, the mobile phone A can convert the coordinate point to a coordinate point on the doubled canvas (for example, (1180, 100)). Therefore, the mobile phone A can determine that the user has clicked the icon of the paging mode on the mobile phone B.
  • the icon of the paging mode displayed on the mobile phone B may be drawn by the mobile phone B.
  • the mobile phone B can add the icon of the parallel mode, the icon of the dual application mode, the icon of the paging mode and the exit control on the canvas.
• when the mobile phone B detects that the user clicks the icon of the paging mode, the mobile phone B sends indication information to the mobile phone A, where the indication information is used to indicate that the mobile phone B wishes to enter the paging mode.
• when mobile phone B detects that the user clicks the icon of the paging mode, mobile phone B can send a UDP data packet to mobile phone A, and the UDP data packet can carry the indication information, where the indication information is used to indicate that mobile phone B wishes to enter the paging mode.
• the UDP packet includes the data portion of the IP datagram.
  • the data portion of an IP datagram may include extensible bits.
• Mobile phone B and mobile phone A can agree on the content of a certain extensible bit. When that extensible bit is 1, mobile phone A can know that mobile phone B wishes to enter the paging mode.
• when the mobile phone B detects that the user drags the control 1102 until it partially or completely coincides with the control corresponding to the paging mode, the mobile phone B sends the indication information to the mobile phone A, where the indication information is used to indicate that the mobile phone B wishes to enter the paging mode.
• when the mobile phone B detects that the user drags the control 1102 until it partially or completely coincides with the control corresponding to the paging mode, the mobile phone B can send a UDP data packet to the mobile phone A, and the UDP data packet can carry the indication information, where the indication information is used to indicate that the mobile phone B wishes to enter the paging mode.
• the UDP packet includes the data portion of the IP datagram.
  • the data portion of an IP datagram may include extensible bits.
• Mobile phone A and mobile phone B can agree on the content of a certain extensible bit. When that extensible bit is 1, the mobile phone A can know that the mobile phone B wishes to enter the paging mode.
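The agreed-upon extensible bit can be sketched as follows. The bit index and payload size are assumptions for illustration; the text only states that the two phones agree on the meaning of some bit carried in the UDP payload.

```java
// Sketch: encode/decode the "wishes to enter a mode" indication as a
// single agreed-upon bit in a small UDP payload. Bit 0 = paging mode
// here; the actual bit position would be whatever the devices agree on.
class ModeIndication {
    static final int PAGING_BIT = 0; // assumed agreed bit index

    static byte[] encode(int bitIndex) {
        byte[] payload = new byte[4];
        payload[bitIndex / 8] |= (byte) (1 << (bitIndex % 8));
        return payload;
    }

    static boolean isSet(byte[] payload, int bitIndex) {
        return (payload[bitIndex / 8] & (1 << (bitIndex % 8))) != 0;
    }

    public static void main(String[] args) {
        byte[] payload = encode(PAGING_BIT);
        System.out.println(isSet(payload, PAGING_BIT)); // prints "true"
    }
}
```

On the sending side such a payload would be wrapped in a java.net.DatagramPacket; the receiver checks the agreed bit and, if it is 1, knows the peer wishes to enter the paging mode.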
  • the source device and the sink device perform splicing display in a paging mode.
  • Figure 30 shows a schematic diagram of the desktop of the source side being displayed in a paging mode.
• after the source device receives the indication from the sink device, the source device doubles the size of the display canvas, and the source device adjusts the desktop layout.
• the App of the source device overrides the protected void onSizeChanged(int w, int h, int oldw, int oldh) method so that the icons of the first page are displayed in area 3001 and the icons of the second page are displayed in area 3002.
  • the paging mode is different from mirroring one desktop and also different from two independent desktops.
  • the first page displays the first desktop page
  • the second page is a natural extension of the first desktop page.
  • the second page may not include the function bar at the bottom.
  • each page supports click, swipe, and gesture operations. For example, each page can be swiped, and the sliding between two pages is linked.
• when the App of the source device monitors the change of the canvas size, it determines which mode it is currently in by calling the interface getSplicingModeType() shown in Table 1. For example, when the return value is 2, it can be determined that it is currently in the paging mode, and the App then switches to the paging mode.
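The mode query of Table 1 might be modeled as below. Only the return values 2 (paging) and 3 (parallel, stated later in the text) come from the document; the other codes and all names are assumptions made for illustration.

```java
// Sketch: interpret the return value of getSplicingModeType(). The text
// states 2 = paging mode and 3 = parallel mode; the remaining codes are
// guesses used only to make the example complete.
class SplicingModeType {
    static final int FULL_SCREEN = 1; // assumed
    static final int PAGING = 2;      // stated in the text
    static final int PARALLEL = 3;    // stated in the text
    static final int DUAL_APP = 4;    // assumed

    static String describe(int type) {
        switch (type) {
            case PAGING:   return "paging";
            case PARALLEL: return "parallel";
            case DUAL_APP: return "dual application";
            default:       return "full screen";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(2)); // prints "paging"
    }
}
```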
  • Table 1 shows the correspondence between package names, interface prototypes and return values.
• when the App determines that it is currently in the paging mode, the App displays the icons of the first page in area 3001 by overriding the protected void onSizeChanged(int w, int h, int oldw, int oldh) method, and displays the icons of the second page in area 3002.
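The region arithmetic behind that onSizeChanged override can be sketched as plain Java: a 1080-wide screen whose canvas has been doubled to 2160 yields area 3001 on the left half and area 3002 on the right half. The helper class below is illustrative, not an Android API.

```java
// Sketch: given the new canvas size reported to onSizeChanged, compute
// the two page regions as {left, top, right, bottom} rectangles.
class PagingRegions {
    static int[][] split(int w, int h) {
        int half = w / 2;
        int[] area3001 = { 0, 0, half, h };  // first desktop page
        int[] area3002 = { half, 0, w, h };  // second desktop page
        return new int[][] { area3001, area3002 };
    }

    public static void main(String[] args) {
        int[][] r = split(2160, 2340);
        // area 3001 = [0, 0, 1080, 2340], area 3002 = [1080, 0, 2160, 2340]
        System.out.println(r[0][2] + " " + r[1][0]); // prints "1080 1080"
    }
}
```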
  • the source device can translate the canvas in the cropped area 3002 to the left by 1080 and put it into display1, so that display1 is projected to the sink device.
  • the source device cuts out half of the doubled canvas for display, creates a virtual screen (display1), and places the other half of the cropped canvas in display1.
  • the source device projects display1 to the sink device.
  • the source device (mobile phone A) can display the icon of the first desktop page
  • the sink device (mobile phone B) can display the icon of the second desktop page.
  • the source device detects the user's operation of starting the App, and loads the content of the second page on the doubled canvas.
  • the source device will crop half of the doubled canvas for display, and display the other half on display1.
  • mobile phone A can display the canvases of the first page and the second page (PDF1/8 and PDF2/8) in App5 on the doubled canvas.
• Mobile phone A can crop the content of the first page (PDF1/8) and display it on the display screen, and translate the cropped second-page (PDF2/8) canvas to the left by 1080 and put it into display1.
  • Phone A can cast display1 to phone B.
  • the source device projects display1 to the sink device.
  • the sink device (mobile phone B) can display the content of the second page (PDF2/8).
• the embodiment of the present application provides a way to enter the paging mode, and the application program can adjust the layout in this mode, which helps to provide a better user experience when multiple devices are spliced.
  • FIG. 31 shows a schematic flowchart of a method 3100 for displaying a source-end device and a sink-end device in a parallel mode.
  • the method 3100 includes:
  • the source end device and the sink end device establish a wireless connection.
  • S3101 may refer to the above-mentioned process of S2401, which is not repeated here for brevity.
  • the sink device instructs the source device to enter the parallel mode.
  • the icon in the parallel mode displayed on the mobile phone B may be drawn by the mobile phone A.
• the mobile phone A may add an icon of the parallel mode, an icon of the dual application mode, an icon of the paging mode, and an exit control on the cropped canvas.
• when the mobile phone B detects the operation of clicking the parallel mode, it can acquire the touch event of the user and the corresponding coordinate point information. Mobile phone B can send the touch event and coordinate point information to mobile phone A. The mobile phone A can switch from the full-screen mode to the parallel mode according to the touch event and the coordinate point information.
• For example, when mobile phone B detects that the user has clicked the screen at (100, 100), the mobile phone B can determine that the event is a touch event and convert the coordinate point to (100+1080, 100), that is, (1180, 100). Therefore, the mobile phone B can send the touch event and the converted coordinate point to the mobile phone A. After receiving the touch event and the converted coordinate point, the mobile phone A can determine that the user has clicked the parallel mode on the mobile phone B, so as to perform S3103.
  • the mobile phone B may also send the information of the coordinate point (for example, (100, 100)) at which the user detects the click operation on the screen of the mobile phone B to the mobile phone A.
• after the mobile phone A receives the coordinate information sent by the mobile phone B, the mobile phone A can convert the coordinate point to a coordinate point on the doubled canvas (for example, (1180, 100)). Therefore, the mobile phone A can determine that the user has clicked the icon of the parallel mode on the mobile phone B.
• the icon of the parallel mode displayed on the mobile phone B may be drawn by the mobile phone B.
  • the mobile phone B can add the icon of the parallel mode, the icon of the dual application mode, the icon of the paging mode and the exit control on the canvas.
• when mobile phone B detects that the user clicks the icon of the parallel mode, mobile phone B sends indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the parallel mode.
• when mobile phone B detects that the user clicks the icon of the parallel mode, mobile phone B can send a UDP data packet to mobile phone A, and the UDP data packet can carry the indication information, where the indication information is used to indicate that mobile phone B wishes to enter the parallel mode.
• the UDP packet includes the data portion of the IP datagram.
  • the data portion of an IP datagram may include extensible bits.
• Mobile phone B and mobile phone A can agree on the content of a certain extensible bit. When that extensible bit is 1, the mobile phone A can know that the mobile phone B wishes to enter the parallel mode.
• when the mobile phone B detects that the user drags the control 1102 until it partially or completely coincides with the control corresponding to the parallel mode, the mobile phone B can send the indication information to the mobile phone A, where the indication information is used to indicate that the mobile phone B wishes to enter the parallel mode.
  • the source device doubles the size of the display canvas.
• mobile phone A can display the home page of App6 on the left half of the doubled canvas, and mobile phone A can crop the left half of the canvas and put it into display0, so that mobile phone A displays the canvas in display0 on its display screen.
  • the content displayed in the right half is not limited.
  • Phone A can display phone A's desktop on the right half of the doubled canvas.
• Mobile phone A can crop the right half of the canvas and put it into display1, so that mobile phone A can cast display1 to mobile phone B.
  • the mobile phone A may not cast the screen to the mobile phone B, and the mobile phone B displays the desktop of the mobile phone B.
  • Figure 32 shows the process of opening an active page in the application hierarchy in parallel mode.
• when the source device detects that the user has clicked an activity page on the home page of the App, it can display the content of the home page of the App in area 3201 of the doubled canvas and display the content of the activity page in area 3202. The source device can crop the doubled canvas, put the canvas in area 3201 into display0, and display the canvas in display0 through the display screen; the source device can translate the canvas in area 3202 to the left by 1080 and put it into display1, and cast display1 to the sink device, so that the sink device displays the activity page.
• when the source device detects that the user starts an App, it can display the home page of the App on the left half of the doubled canvas by default.
  • the source device detects that the user clicks on an active page, the source device can call the API to display the active page on the right half.
  • Table 2 shows the correspondence between package names, interface prototypes and return values.
• the source end device determines which mode it is currently in by calling the getSplicingModeType() interface shown in Table 1. For example, when the return value is 3, it can be determined that it is currently in the parallel mode.
• the App can call the startActivity() interface shown in Table 2 and specify the position at which the activity page is to be started through the Intent extension field "position".
• if the current device detects that the user clicks the activity page 1 in the App and expects it to be started on the screen of another device (the other device is located on the right side of the current device), the App writes intent.putExtra("position", RIGHT), and the activity page 1 can be displayed on the other device after startActivity(Intent) is called.
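A stand-in for this position-based start is sketched below. Intent and startActivity(Intent) are real Android APIs, so this example only models how a "position" extra could select the target display; the map-based Intent stand-in and the display numbering are assumptions.

```java
import java.util.Map;

// Sketch: resolve the "position" extra of Table 2's startActivity() to a
// display id: RIGHT -> display1 (projected to the device on the right),
// anything else -> the local display0.
class ParallelModeRouter {
    static final String RIGHT = "RIGHT";

    static int targetDisplay(Map<String, String> extras) {
        return RIGHT.equals(extras.get("position")) ? 1 : 0;
    }

    public static void main(String[] args) {
        System.out.println(targetDisplay(Map.of("position", RIGHT))); // prints "1"
        System.out.println(targetDisplay(Map.of()));                  // prints "0"
    }
}
```

In the real flow the source device would then project display1 to the sink device, so the routed activity page appears on the device to the right.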
  • App6 can call the getSplicingModeType( ) interface to determine that it is currently in parallel mode.
  • App6 determines that an activity page corresponding to the chat interface between the user and the contact's father needs to be activated.
• Mobile phone A determines that mobile phone B is located on the right side of mobile phone A; mobile phone A can then write intent.putExtra("position", RIGHT) in the Intent and call startActivity(Intent), so that the activity page corresponding to the chat interface between the user and the contact's father is displayed on mobile phone B.
• Mobile phone A can crop the doubled canvas and display the canvas corresponding to the chat list between the user and multiple contacts on the display screen; mobile phone A can crop the activity page corresponding to the chat interface between the user and the contact's father, put it into display1, and cast display1 to phone B. Thus, the mobile phone B displays the activity page.
  • the source terminal device crops the doubled canvas, and displays the cropped half on the display screen; displays the other half in display1.
  • the source device projects the content of display1 to the sink device.
• the embodiment of the present application provides a way of displaying the upper- and lower-level activity pages of an App in the parallel mode, and displaying multiple activity pages through multiple devices in the parallel mode brings a better user experience when multiple devices are spliced.
  • FIG. 33 shows a schematic flowchart of a method 3300 for displaying a source-end device and a sink-end device in a dual application mode.
  • the method 3300 includes:
  • the source end device and the sink end device establish a wireless connection.
  • the sink device instructs the source device to enter the dual application mode.
  • the icon of the dual application mode displayed on the mobile phone B may be drawn by the mobile phone A.
• the mobile phone A may add an icon of the parallel mode, an icon of the dual application mode, an icon of the paging mode, and an exit control on the cropped canvas.
• when the mobile phone B detects the operation of clicking the dual application mode, it can acquire the touch event of the user and the corresponding coordinate point. Mobile phone B can send the touch event and the converted coordinate point to mobile phone A. The mobile phone A can switch from the full-screen mode to the dual application mode according to the touch event and the coordinate point information.
• For example, mobile phone B can determine that the touch event is a click event and convert the coordinate point to (100+1080, 100), that is, (1180, 100). Therefore, the mobile phone B can send the touch event and the converted coordinate point to the mobile phone A. After receiving the touch event and the converted coordinate point, the mobile phone A can determine that the user has clicked the icon of the dual application mode on the mobile phone B, and thus proceeds to S3303.
  • the mobile phone B may also send the information of the coordinate point (for example, (100, 100)) at which the user detects the click operation on the screen of the mobile phone B to the mobile phone A.
• after the mobile phone A receives the coordinate information sent by the mobile phone B, the mobile phone A can convert the coordinate point to a coordinate point on the doubled canvas (for example, (1180, 100)). Therefore, the mobile phone A can determine that the user has clicked the icon of the dual application mode on the mobile phone B.
  • the icon of the dual application mode displayed on the mobile phone B may be drawn by the mobile phone B.
  • the mobile phone B can add the icon of the parallel mode, the icon of the dual application mode, the icon of the paging mode and the exit control on the canvas.
• when mobile phone B detects that the user clicks the icon of the dual application mode, mobile phone B sends indication information to mobile phone A, where the indication information is used to indicate that mobile phone B wishes to enter the dual application mode.
• when mobile phone B detects that the user clicks the icon of the dual application mode, mobile phone B can send a UDP data packet to mobile phone A, and the UDP data packet can carry the indication information, where the indication information is used to indicate that mobile phone B wishes to enter the dual application mode.
• the UDP packet includes the data portion of the IP datagram.
  • the data portion of an IP datagram may include extensible bits.
• Mobile phone B and mobile phone A can agree on the content of a certain extensible bit. When that extensible bit is 1, the mobile phone A can know that the mobile phone B wishes to enter the dual application mode.
  • the source device in response to receiving the instruction of the sink device, creates a virtual screen (display), and puts the canvas corresponding to the desktop of the source device into the display.
  • the source device projects the display to the sink device.
• the source device may create a display after receiving the indication that the sink device wishes to enter the dual application mode.
  • the source device can use the Android second desktop (secondLauncher) mechanism to start the secondLauncher on the display, and then cast the display to the sink device, so that the sink device can display the desktop of the source device.
• the mobile phone A can start the desktop of the mobile phone A (including the network standard, the Wi-Fi connection status, the battery level, the weather information, and the date information in the status bar, as well as the icons of App1-App12) on the display, and then project the display to the mobile phone B, so that the mobile phone B can also display the desktop of the mobile phone A.
  • the sink device detects that the user clicks on the icon of the first application, and sends a touch event and coordinate point information to the source device.
  • the source device in response to receiving the touch event and the coordinate point information, the source device may start the first application, and run the first application in the display.
• when the mobile phone B detects that the user clicks the icon of App4, the mobile phone B can send the touch event and the coordinate point information collected by the mobile phone B when the user clicks to the mobile phone A. After receiving the touch event and the coordinate point information, the mobile phone A can determine that the mobile phone B has detected that the user clicked the screen and the corresponding coordinate point. Therefore, the mobile phone A can determine that the user has clicked the icon of App4 on the mobile phone B. Phone A can start App4, run App4 on the display, and then cast the display to phone B.
  • the source device projects the display to the sink device.
  • the mobile phone B can display the interface of App4.
• the embodiment of the present application provides a method for displaying different application programs of the source device on the source device and the sink device respectively through the dual application mode, which avoids the user having to switch back and forth between different applications on the source device and brings a better user experience when multiple devices are spliced.
  • FIG. 34 shows a schematic flowchart of a method 3400 for switching display in multiple modes provided by an embodiment of the present application. As shown in Figure 34, the method 3400 includes:
  • the source end device establishes a wireless connection with the sink end device.
  • the source end device is determined to perform splicing display with the sink end device.
• when the sink device detects the user's first operation (for example, the user slides from the left side of the screen to the right), it can send indication information to the source device, where the indication information is used to indicate that the source device and the sink device perform splicing display. In response to receiving the indication information, the source device may perform S3403.
• when mobile phone B detects that the user slides the control 1502 to the right, mobile phone B can instruct mobile phone A and mobile phone B to perform screen splicing display.
  • the sink device instructs the source device and the sink device to perform screen splicing display.
• the source device can determine, according to the current state of the source device and the sink device, to display in any one of the full-screen mode, the paging mode, the parallel mode, and the dual application mode.
• when the sink device detects the first operation of the user, the sink device may send a UDP data packet to the source device, and the UDP data packet may carry the indication information, where the indication information is used to indicate that the source device and the sink device perform screen splicing display.
• the UDP packet includes the data portion of the IP datagram.
  • the data portion of an IP datagram may include extensible bits.
  • the sink device and the source device can agree on the content of a certain extensible bit. When an extensible bit is 1, the source-end device can know that the sink-end device wishes to perform screen splicing display with the source-end device.
• when the sink device 1 detects the first operation of the user, the sink device 1 may send a TCP packet to the source device, and the TCP packet may carry the indication information.
  • a TCP message includes a TCP header and a TCP data part, wherein the TCP header includes a reserved field.
• the sink device 1 and the source device can agree on the content of a certain reserved field. When the reserved field carries the agreed content, the source-end device can know that the sink-end device wishes to perform screen splicing display with the source-end device.
• when the source device detects the user's second operation (for example, the user slides from the left side of the screen to the right), the source device can determine to perform spliced display with the sink device, thereby performing S3403.
  • the source-end device determines, according to the current state of the source-end device, to perform splicing and display with the sink-end device through the first mode.
• the source-end device may determine to perform splicing display with the sink-end device. Then, the source-end device can send the image information to the sink-end device to be displayed by the sink-end device, and can also send indication information to the sink-end device, where the indication information is used to indicate that the sink-end device and the source-end device perform splicing display.
  • the first mode includes any one of a full-screen mode, a paging mode, a parallel mode, and a dual-application mode.
• the source-end device determining, according to the current state of the source-end device, to perform splicing display with the sink-end device through the first mode includes: the source-end device determines, according to the interface currently displayed by the source-end device, to perform splicing display with the sink-end device through the first mode.
  • Table 3 shows a mapping relationship between the application to which the current display interface of the source device belongs and the first mode.
  • mapping relationship between the interface currently displayed by the source device and the first mode shown in Table 3 is only schematic, and the embodiment of the present application is not limited thereto.
• the source-end device may store the package name information of multiple applications and the mapping relationship of the corresponding types. After the source-end device receives the indication information of the sink-end device, it can first determine the content displayed on the current display interface of the source-end device. If the source device determines that the current display interface is the display interface of an application, the source device can query the package name information of the application in the application layer and determine the type of the application through the package name information of the application and the stored mapping relationship between the package name information of multiple applications and the corresponding types. Exemplarily, Table 4 shows the mapping relationship between package name information of multiple applications and corresponding types.
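The lookup through the Table 4 mapping can be sketched with an ordinary map. Every package name and type label below is a hypothetical placeholder; Table 4's actual entries are not reproduced in this text.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: resolve an application's type from its package name using a
// stored mapping, as the source device is described as doing with Table 4.
class AppTypeResolver {
    private static final Map<String, String> TYPE_BY_PACKAGE = new HashMap<>();
    static {
        TYPE_BY_PACKAGE.put("com.example.video", "video");     // hypothetical
        TYPE_BY_PACKAGE.put("com.example.music", "music");     // hypothetical
        TYPE_BY_PACKAGE.put("com.example.reader", "document"); // hypothetical
    }

    static String typeOf(String packageName) {
        return TYPE_BY_PACKAGE.getOrDefault(packageName, "unknown");
    }

    public static void main(String[] args) {
        System.out.println(typeOf("com.example.video")); // prints "video"
    }
}
```

The resolved type would then be matched against a Table 3 style mapping to choose the first mode for splicing display.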
  • mapping relationship between the package name information of the multiple application programs and the corresponding types shown in Table 4 is only illustrative, and the embodiment of the present application is not limited thereto.
  • the source device may further determine the type of the application currently displayed by the source by identifying interface elements. Exemplarily, if the source device recognizes that the current display interface includes a playback control, it can determine that the application displayed on the current interface is a music app or a video app. If the source device determines that the current display interface also includes a video playback window, it can determine that the application displayed on the current interface is a video-type App.
• the source-end device determining, according to the current state of the source-end device, to perform splicing display with the sink-end device through the first mode includes: when the source-end device currently displays the display interface of the first application, receives a notification message of the second application program, and detects the first input to the message prompt box corresponding to the notification message, the source-end device determines to perform splicing display with the sink-end device through the dual application mode.
• For example, when mobile phone A receives the indication of mobile phone B while mobile phone A is displaying the playback interface of the video application and mobile phone A receives a notification message from App4, mobile phone A can determine to perform splicing display with mobile phone B through the dual application mode.
  • mobile phone A and mobile phone B are performing spliced display in full-screen mode.
  • mobile phone A and mobile phone B can display the message prompt box 1701 in full screen mode.
• when the mobile phone A detects that the user clicks the message prompt box 1701, the mobile phone A and the mobile phone B can switch from the full-screen mode to the dual application mode.
  • mobile phone A and mobile phone B can display the message prompt box 1801 in full screen mode.
• when mobile phone A detects that the user drags the message prompt box 1801 to the lower left, mobile phone A and mobile phone B can switch from the full-screen mode to the dual application mode.
• Mobile phone A can also determine, according to the dragging direction (dragging to the lower left), to display the chat interface between the user and the contact Li Hua in App4 on mobile phone A, and mobile phone A can send the canvas corresponding to the video display interface to mobile phone B, so that mobile phone B displays the video display interface.
• when the source end device currently displays the display interface of the first application, if the source end device receives the indication information of the sink end device and the source end device receives the notification message of the second application program, the source end device can determine to perform splicing display with the sink end device through the dual application mode.
• the source-end device may receive the notification message of the second application within a preset time period before receiving the indication of the sink-end device; the source-end device may then determine to perform splicing display with the sink-end device through the dual application mode.
  • the source-end device may receive the notification message of the second application within a preset time period after receiving the instruction of the sink-end device, and the source-end device may determine to perform splicing and display with the sink-end device in a dual application mode.
• when mobile phone A displays the playing interface of the video application and mobile phone A receives a notification message from App4, mobile phone A may start a timer upon receiving the notification message. If the mobile phone A receives the indication of the mobile phone B before the timer expires, the mobile phone A can determine that the mobile phone A and the mobile phone B perform splicing display in the dual application mode. In this way, when mobile phone A receives the indication of mobile phone B, the prompt box of the notification message may have been hidden but the timer is still running, and mobile phone A can still determine to perform splicing display with mobile phone B in the dual application mode.
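The timer logic in this example can be sketched as a simple timestamp comparison; the length of the preset time period is an assumed value, and the class and method names are illustrative.

```java
// Sketch: decide whether to enter the dual application mode based on
// whether the sink's indication arrives before the timer started at the
// notification message expires.
class DualAppTimer {
    static final long WINDOW_MS = 5_000; // preset period (assumed value)

    static boolean useDualAppMode(long notificationAtMs, long indicationAtMs) {
        long elapsed = indicationAtMs - notificationAtMs;
        return elapsed >= 0 && elapsed <= WINDOW_MS;
    }

    public static void main(String[] args) {
        System.out.println(useDualAppMode(0, 3_000)); // prints "true"
        System.out.println(useDualAppMode(0, 6_000)); // prints "false"
    }
}
```

Note that the decision depends only on the timestamps, not on whether the prompt box is still visible, which matches the behavior described above.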
• the source-end device can display the display interface of the first application and project the display interface of the second application onto the sink device.
• when the source device has run the first application and receives a notification message from the second application, the source device and the sink device can separately display the display interfaces of the first application and the second application in the dual application mode. The user does not need to switch from the first application to the second application on the source device, which helps to improve the efficiency of human-computer interaction and also enhances the intelligence of electronic devices and the friendliness of human-computer interaction.
  • FIG. 35 shows a schematic flowchart of a method 3500 for switching display in multiple modes provided by an embodiment of the present application. As shown in Figure 35, the method 3500 includes:
  • the source end device and the sink end device perform splicing display in a dual application mode, the source end device displays the display interface of the first application program, and the sink end device displays the display interface of the second application program.
  • mobile phone A is a source device
  • mobile phone B is a sink device.
  • Phone A and Phone B are currently being displayed in dual app mode.
  • the source device displays a message prompt box on the display interface of the first application program, where the message prompt box is used to prompt the user to receive a message from the third application program.
  • Mobile phone A displays a message prompt box 1601 on the display interface of App3, where the message prompt box 1601 includes the contact who sent the notification message (for example, Li Hua) and the content of the message (for example, "There is a meeting at 9 am").
  • the source end device projects the display interface of the third application to the sink end device.
  • In response to the operation of the user clicking on the message prompt box 1601, mobile phone A can project the chat interface between the user and the contact Li Hua in App4 to mobile phone B.
  • When the source device is running the first application and receives a notification message from the third application, the sink device switches from the display interface of the second application to the display interface of the third application, so that the source device and the sink device respectively display the display interfaces of the first application and the third application in dual application mode. The user does not need to switch from the first application to the third application on the source device, which helps to improve the efficiency of human-computer interaction and also enhances the intelligence of electronic devices and the friendliness of human-computer interaction.
  • FIG. 36 shows a schematic flowchart of a method 3600 for switching display in multiple modes provided by an embodiment of the present application. As shown in FIG. 36, the method 3600 includes:
  • the source device and the sink device display the display interface of the first application in a full screen mode.
  • mobile phone A and mobile phone B can display the video playback interface in full screen mode.
  • In response to receiving the notification message of the second application, the source device and the sink device display a message prompt box on the display interface of the first application, where the message prompt box is used to prompt the user that a message from the second application has been received.
  • Mobile phone A and mobile phone B can display a message prompt box 1701 on the video playback interface, where the message prompt box 1701 includes the contact who sent the notification message (for example, "Li Hua") and the message content (for example, "There is a meeting at 9 am").
  • The source device and the sink device switch from splicing display in full-screen mode to splicing display in dual application mode.
  • In response to switching to splicing display with the sink device in dual application mode, the source device displays the display interface of the first application and projects the display interface of the second application to the sink device.
  • In response to detecting that the user clicks on the message prompt box 1701, mobile phone A and mobile phone B can switch from full-screen mode to dual application mode: mobile phone A can display the video playback interface, and mobile phone B can display the chat interface between the user and the contact Li Hua in App4.
  • In the above S3603-S3604, the source device switches from full-screen mode to dual application mode when it detects the user's input on the message prompt box. In this embodiment of the present application, the sink device may instead detect the user's input on the message prompt box and trigger the source device to switch from full-screen mode to dual application mode.
  • the sink device detects the user's input for the message prompt box, and sends a touch event and corresponding coordinate information to the source device.
  • the sink device may send the click event and the transformed coordinate information to the source device when detecting that the user clicks on the message prompt box.
  • The source device determines, according to the touch event and the coordinate information, that the user has performed an input on the message prompt box displayed on the sink device.
  • the source device and the sink device switch from full-screen mode for splicing display to dual-application mode for splicing display.
  • In response to switching to splicing display with the sink device in dual application mode, the source device displays the display interface of the first application and projects the display interface of the second application to the sink device.
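The sink-to-source forwarding in the steps above can be pictured with a small sketch. Modelling the coordinate transformation as a simple translation is an assumption; the embodiment only says the sink sends a touch event together with "transformed coordinate information".

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str  # e.g. "click"
    x: float   # coordinates already mapped into the source's space
    y: float

def to_source_coords(local_x, local_y, offset_x, offset_y):
    """Sink-side step: map a touch on the sink's own screen into the source
    device's coordinate space before sending it (translation is assumed)."""
    return local_x + offset_x, local_y + offset_y

def hits_prompt_box(event, box):
    """Source-side step: check whether the forwarded event landed on the
    message prompt box. `box` is (left, top, width, height) in the source's
    coordinate space."""
    left, top, width, height = box
    return left <= event.x <= left + width and top <= event.y <= top + height
```

If the forwarded event hits the prompt box, the source device would then perform the mode switch described in the surrounding steps.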
  • When the source device and the sink device display the display interface of the first application in full-screen mode, the source device receives a notification message of the second application, and the source device or the sink device detects the user's input on the message prompt box, the source device can determine to switch from full-screen mode to dual application mode, so that it can display the display interfaces of the first application and the second application separately with the sink device in dual application mode. The user does not need to switch from the first application to the second application on the source device, which helps to improve the efficiency of human-computer interaction and also enhances the intelligence of electronic devices and the friendliness of human-computer interaction.
  • FIG. 37 shows a schematic flowchart of a method 3700 for mosaic display provided by an embodiment of the present application.
  • The first electronic device may be the above-mentioned source device, and the second electronic device may be the above-mentioned sink device. The first electronic device includes a first application and a second application, and the first electronic device communicates with the second electronic device through a short-range wireless connection. The method 3700 includes:
  • the first electronic device displays a first interface, where the first interface is a display interface of a first application.
  • the mobile phone A displays the display interface of App3.
  • the mobile phone A displays a part of the video playing interface of the video application.
  • the mobile phone A displays the home page of the social application.
  • the first electronic device detects a first input of the user while displaying the first interface, where the first input is an input for starting a second application program.
  • the second application may be App4, and the first input may be an operation of the user clicking on the message prompt box 1601 .
  • the second application may be App4, and the first input may be the operation of the user clicking on the message prompt box 1701 .
  • the second application program may be App7, and the first input may be the operation of the user clicking on the message prompt box 2001 .
  • the second application may be App3
  • the first input may be an operation of the user clicking on the icon of App3.
  • the second application may be App3
  • the first input may be an operation of dragging the icon of App3 by the user.
  • In response to the first input, the first electronic device sends image information corresponding to the display interface of the second application to the second electronic device.
  • the second electronic device displays the display interface of the second application in response to receiving the image information.
  • In response to the user clicking on the message prompt box 1601, mobile phone A can send the image information corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B.
  • In response to receiving the image information, mobile phone B can display the chat interface between the user and the contact Li Hua in App4.
  • In response to the user clicking on the message prompt box 1701, mobile phone A can send the image information corresponding to the chat interface between the user and the contact Li Hua in App4 to mobile phone B.
  • In response to receiving the image information, mobile phone B can display the chat interface between the user and the contact Li Hua in App4.
  • In response to the user clicking on the message prompt box 2001, mobile phone A can send the image information corresponding to the display interface of the commodity 1 in App7 to mobile phone B.
  • In response to receiving the image information, mobile phone B can display the display interface of the commodity 1 in App7.
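One way to picture the exchange in S3703-S3704 is a small message carrying the image information for the projected interface. The framing below (a JSON header followed by the raw image bytes) and all field names are purely assumptions for illustration; the method does not define a wire format.

```python
import json

def make_projection_message(app, interface, frame_bytes):
    """Source side: build a message carrying the image information
    corresponding to a display interface (field names are illustrative)."""
    header = {"app": app, "interface": interface, "length": len(frame_bytes)}
    return json.dumps(header).encode("utf-8") + b"\n" + frame_bytes

def parse_projection_message(message):
    """Sink side: recover the header and the image bytes to display."""
    header_raw, frame = message.split(b"\n", 1)
    header = json.loads(header_raw.decode("utf-8"))
    if header["length"] != len(frame):
        raise ValueError("truncated image information")
    return header, frame
```

The length field lets the sink detect a truncated transfer before attempting to display the frame.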
  • Before the first electronic device detects the user's first input, the method further includes: receiving, while displaying the first interface, a notification message sent by a server corresponding to the second application; and displaying a message prompt box in response to receiving the notification message, where the message prompt box is used to prompt the user that the notification message has been received. The first input is the user's input on the message prompt box.
  • the mobile phone A displays a message prompt box 1601 in response to receiving the notification message sent by the server corresponding to App4.
  • mobile phone A and mobile phone B display a message prompt box 1701 in full screen mode.
  • the first input is an operation of the user clicking on the message prompt box; or, the first input is an operation of dragging the message prompt box in a first direction.
  • the first input may be an operation of the user clicking on the message prompt box 1601 .
  • the first input may be an operation of the user clicking on the message prompt box 1701 .
  • the first input may be an operation of the user dragging the message prompt box 1801 to the lower right.
  • The first interface includes a first interface element, the first interface element is associated with a pop-up window, and the pop-up window is used to prompt the user to open the second application. Before the first electronic device detects the user's first input, the method further includes: detecting a second input of the user on the first interface element of the first interface; and displaying the pop-up window in response to the second input. The first input is the user's input on the pop-up window.
  • In response to detecting that the user clicks on the link 2101, mobile phone A may display a pop-up window 2102.
  • the mobile phone A can send the image information corresponding to the display interface of the content 1 in the App5 to the mobile phone B.
  • In response to receiving the image information, mobile phone B can display the display interface of the content 1 in App5.
  • Before the first electronic device detects the user's first input, the method further includes: the first electronic device receives first indication information sent by the second electronic device, where the first indication information is used to instruct the first electronic device and the second electronic device to perform splicing display; and, in response to receiving the first indication information, the first electronic device sends the image information corresponding to the desktop of the first electronic device to the second electronic device, so that the second electronic device displays the desktop of the first electronic device.
  • the mobile phone B can display the desktop of the mobile phone B.
  • mobile phone B detects the user's input (for example, the operation of sliding from the left side of the screen to the right side)
  • mobile phone B can send first indication information to mobile phone A, where the first indication information is used to instruct mobile phone A and mobile phone B to perform mosaic display.
  • mobile phone A may determine to perform spliced display with mobile phone B in a dual application mode. Therefore, the mobile phone A can send the image information corresponding to the desktop of the mobile phone A to the mobile phone B.
  • the mobile phone B can display the desktop of the mobile phone A.
  • the first electronic device further includes a third application
  • The method 3700 further includes: the first electronic device receives second indication information sent by the second electronic device, where the second indication information is used to indicate that the second electronic device has detected a third input, the third input being an input for the third application; in response to receiving the second indication information, the first electronic device sends the image information corresponding to the display interface of the third application to the second electronic device, so that the second electronic device displays the display interface of the third application; and the first electronic device displays the first interface in response to the user's operation of opening the first application.
  • When mobile phone B displays the desktop of mobile phone A and detects the user's input (the input may be an input for a third application, for example a photo application, on the desktop of mobile phone A), mobile phone B can send the touch event and the corresponding coordinate information to mobile phone A. Mobile phone A can then determine, according to the touch event and the corresponding coordinate information, that the user has clicked the icon of the third application on mobile phone B.
  • Mobile phone A may send the image information corresponding to the display interface of the photo application to mobile phone B. In response to receiving the image information, mobile phone B can display the display interface of the photo application. After mobile phone A sends this image information to mobile phone B, mobile phone A detects the user's operation of starting the first application (for example, App3), and mobile phone A can start the first application in response to the operation.
  • Mobile phone A may first start the third application and send the display interface of the third application to mobile phone B, and then start the first application and display the display interface of the first application on mobile phone A. Then, in S3702, when the first electronic device (for example, mobile phone A) detects the first input, it can choose to replace the third application that was opened earlier, so that mobile phone A can continue to display the display interface of the first application, and mobile phone A may send the image information corresponding to the display interface of the third application to mobile phone B, so that mobile phone B displays the display interface of the third application.
  • The first electronic device further includes a third application, and the method further includes: the first electronic device receives second indication information sent by the second electronic device, where the second indication information is used to indicate that the second electronic device has detected a third input, the third input being an input for the third application; and, in response to receiving the second indication information, the first electronic device sends the image information corresponding to the display interface of the third application to the second electronic device, so that the second electronic device displays the display interface of the third application. When sending the image information corresponding to the display interface of the second application to the second electronic device, the first electronic device determines that third indication information sent by the second electronic device has not been received, where the third indication information is used to indicate that the second electronic device has detected the user's input on the display interface of the third application.
  • For example, if mobile phone A detects the user's input on the display interface of the first application within a preset period of time and does not receive the third indication information sent by mobile phone B (the third indication information is used to indicate that mobile phone B has detected the user's input on the display interface of the third application), mobile phone A can determine that the application focused by the user is the first application displayed on mobile phone A. Mobile phone A can then continue to display the display interface of the first application and send the image information corresponding to the display interface of the third application to mobile phone B.
  • When determining to replace the display interface of a certain application, the first electronic device may first determine the application focused by the user. If the application focused by the user is the first application displayed on the first electronic device, the first electronic device can determine to send the image information corresponding to the display interface of the third application to the second electronic device, so that the second electronic device replaces, with the display interface of the third application, the display interface of the second application that is not focused by the user.
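The focus rule described above reduces to a small decision function. The sketch below uses booleans standing in for real event tracking, and the return labels are illustrative; the preset period itself is not modelled here.

```python
def focused_application(input_on_first_within_window, third_indication_received):
    """Decide which application the user is focused on.

    - Third indication information from the sink means the user touched the
      third application's interface there, so focus is the third application.
    - Otherwise, input on the first application's interface within the preset
      period means focus is the first application, so the third application's
      interface should replace the unfocused second application's interface
      on the sink.
    """
    if third_indication_received:
        return "third_application"
    if input_on_first_within_window:
        return "first_application"
    return "undetermined"
```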
  • Before the first electronic device displays the first interface, the method further includes: the first electronic device displays a second interface; the first electronic device receives fourth indication information sent by the second electronic device, where the fourth indication information is used to instruct the first electronic device and the second electronic device to perform splicing display; in response to receiving the fourth indication information, the first electronic device expands the image information corresponding to the second interface and divides the expanded image information to obtain a first part of the image information and a second part of the image information, where the first part of the image information is the image information displayed on the first interface; and the first electronic device displays the first interface and sends the second part of the image information to the second electronic device, so that the second electronic device displays the second part of the image information.
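A toy sketch of the divide step above, treating the expanded image information as rows of pixels. This is a simplification; a real implementation would operate on display buffers rather than Python lists.

```python
def divide_expanded(expanded_rows, source_width):
    """Divide the expanded image information into the first part (displayed
    on the first electronic device's own screen) and the second part (sent
    to the second electronic device for display)."""
    first_part = [row[:source_width] for row in expanded_rows]
    second_part = [row[source_width:] for row in expanded_rows]
    return first_part, second_part
```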
  • Before the first electronic device and the second electronic device respectively display the display interface of the first application and the display interface of the third application in dual application mode, the second electronic device may first trigger the first electronic device and the second electronic device to perform splicing display in full-screen mode. When the first electronic device detects the user's first input, the first electronic device may determine to switch from splicing display with the second electronic device in full-screen mode to splicing display with the second electronic device in dual application mode. This eliminates the need for the user to switch applications and helps improve the user experience.
  • the method further includes: in response to the first input, the first electronic device displays the second interface and sends image information corresponding to the display interface of the second application to the second electronic device.
  • Alternatively, in response to the first input, the first electronic device may display the display interface of the second application and send the image information corresponding to the second interface to the second electronic device.
  • FIG. 38 shows a schematic block diagram of an apparatus 3800 provided by an embodiment of the present application.
  • The apparatus 3800 can be provided in the first electronic device of FIG. 37 described above. The apparatus 3800 includes: a display unit 3810, configured to display a first interface, where the first interface is a display interface of a first application; a detection unit 3820, configured to detect a first input of the user, where the first input is an operation of starting a second application; and a sending unit 3830, configured to send, in response to the first input, image information corresponding to the display interface of the second application to the second electronic device, so that the second electronic device displays the display interface of the second application.
  • FIG. 39 shows a schematic structural diagram of an electronic device 3900 provided by an embodiment of the present application.
  • The electronic device includes one or more processors 3910 and one or more memories 3920, where the one or more memories 3920 store one or more computer programs, and the one or more computer programs include instructions.
  • When the instructions are executed by the one or more processors 3910, the first electronic device or the second electronic device executes the technical solutions in the foregoing embodiments.
  • An embodiment of the present application provides a system including a first electronic device and a second electronic device, and the system is used to implement the technical solutions in the foregoing embodiments.
  • the implementation principle and technical effect thereof are similar to the related embodiments of the above method, and are not repeated here.
  • the embodiments of the present application provide a computer program product, which enables the first electronic device to execute the technical solutions in the foregoing embodiments when the computer program product runs on a first electronic device (or a source device).
  • the implementation principle and technical effect thereof are similar to the related embodiments of the above method, and are not repeated here.
  • An embodiment of the present application provides a readable storage medium, where the readable storage medium contains instructions, and when the instructions are run on a first electronic device (or a source device), the first electronic device is caused to execute the technical solutions in the foregoing embodiments.
  • the implementation principle and technical effect thereof are similar, and are not repeated here.
  • An embodiment of the present application provides a chip, which is used for executing instructions, and when the chip is running, executes the technical solutions in the foregoing embodiments.
  • the implementation principle and technical effect thereof are similar, and are not repeated here.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • The division of the units is only a logical function division; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The shown or discussed mutual coupling, direct coupling, or communication connection may be implemented through some interfaces; the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • The technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a splicing display method, an electronic device, and a system. The method includes the following steps: a first electronic device displays a first interface, the first interface being a display interface of a first application; the first electronic device detects a first input of a user while displaying the first interface, the first input being an operation of starting a second application; in response to the first input, the first electronic device sends, to a second electronic device, image information corresponding to a display interface of the second application; and, in response to receiving the image information corresponding to the display interface of the second application, the second electronic device displays the display interface of the second application. According to embodiments of the present application, a user can display multiple applications on multiple devices without switching back and forth between applications, so that the user experience is improved.
PCT/CN2022/073727 2021-02-27 2022-01-25 Procédé d'affichage en mosaïque, dispositif électronique et système WO2022179371A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110221567.1 2021-02-27
CN202110221567.1A CN114968146A (zh) 2021-02-27 2021-02-27 一种拼接显示的方法、电子设备和系统

Publications (1)

Publication Number Publication Date
WO2022179371A1 true WO2022179371A1 (fr) 2022-09-01

Family

ID=82972795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/073727 WO2022179371A1 (fr) 2021-02-27 2022-01-25 Procédé d'affichage en mosaïque, dispositif électronique et système

Country Status (2)

Country Link
CN (1) CN114968146A (fr)
WO (1) WO2022179371A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657101A (zh) * 2015-02-12 2015-05-27 武汉新蜂乐众网络技术有限公司 一种图像拼接显示方法及系统
US20180329587A1 (en) * 2017-05-12 2018-11-15 Apple Inc. Context-specific user interfaces
CN110851220A (zh) * 2019-10-30 2020-02-28 维沃移动通信有限公司 一种信息输出方法及电子设备
CN111031108A (zh) * 2019-11-29 2020-04-17 维沃移动通信有限公司 一种同步方法及电子设备
CN112416200A (zh) * 2020-11-26 2021-02-26 维沃移动通信有限公司 显示方法、装置、电子设备和可读存储介质

Also Published As

Publication number Publication date
CN114968146A (zh) 2022-08-30

Similar Documents

Publication Publication Date Title
WO2021013158A1 (fr) Procédé d'affichage et appareil associé
WO2020224485A1 (fr) Procédé de capture d'écran et dispositif électronique
CN114397978B (zh) 一种应用显示方法及电子设备
WO2020062294A1 (fr) Procédé de commande d'affichage pour une barre de navigation de système, interface utilisateur graphique et dispositif électronique
WO2022017393A1 (fr) Système d'interaction d'affichage, procédé d'affichage, et dispositif
WO2023273543A1 (fr) Procédé et appareil de gestion de dossier
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2022063159A1 (fr) Procédé de transmission de fichier et dispositif associé
WO2022179390A1 (fr) Procédé d'affichage en mosaïque, dispositif électronique et système
WO2023160179A1 (fr) Procédé de commutation de grossissement et appareil de commutation de grossissement
EP4293997A1 (fr) Procédé d'affichage, dispositif électronique et système
WO2022152174A1 (fr) Procédé de projection d'écran et dispositif électronique
WO2022179371A1 (fr) Procédé d'affichage en mosaïque, dispositif électronique et système
WO2021218544A1 (fr) Système de fourniture de connexion sans fil, procédé et appareil électronique
CN115225753A (zh) 拍摄方法、相关装置及系统
WO2022179364A1 (fr) Procédé d'affichage de mur d'écrans, dispositif électronique et système
WO2022179327A1 (fr) Procédé de stockage de contenu, dispositif électronique et système
WO2022206769A1 (fr) Procédé de combinaison de contenu, dispositif électronique et système
WO2022268009A1 (fr) Procédé de partage d'écran et dispositif associé
WO2022206766A1 (fr) Procédé d'affichage en mosaïque, dispositif électronique et système
WO2024067037A1 (fr) Procédé et système d'appel de service, dispositif électronique
EP4287014A1 (fr) Procédé d'affichage, dispositif électronique et système
CN117762279A (zh) 控制方法、电子设备、存储介质及程序产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22758737

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22758737

Country of ref document: EP

Kind code of ref document: A1