WO2023005900A1 - Screen projection method, electronic device, and system - Google Patents

Screen projection method, electronic device, and system

Info

Publication number: WO2023005900A1
Application number: PCT/CN2022/107783
Authority: WIPO (PCT)
Prior art keywords: electronic device, display, orientation, physical, virtual display
Other languages: English (en), Chinese (zh)
Inventor: 王波 (Wang Bo)
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023005900A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • the solution relates to wireless display technology, and in particular to a screen projection method, an electronic device, and a system.
  • Screen projection technology enables image resources displayed on electronic devices such as mobile phones and tablets to be shared with electronic devices such as TVs and smart screens.
  • different screen projection technologies may be used, where the screen projection technologies include heterogeneous (different-source) screen projection technology and homologous (same-source) screen projection technology.
  • in the existing heterogeneous screen projection, the display direction of the screen projection window on the projected device does not change when the projected device switches between landscape and portrait.
  • taking a tablet as the projected device as an example, when the user rotates the tablet from portrait to landscape, the display direction of the projection window on the tablet remains vertical; the projection window cannot follow the rotation of the tablet
  • and switch to landscape display, which does not match the user's viewing angle, so the user's screen-switching experience is poor.
  • the present application provides a screen projection method, an electronic device, and a system, which can switch the display direction of the screen projection window on the projected device during the screen projection process, improving user experience.
  • an embodiment of the present application provides a screen projection method applied to a first electronic device, and the method includes:
  • the first electronic device determines the display direction of the virtual display based on the physical direction
  • the first electronic device determines the display direction of the virtual display as the display direction of the application window displayed on the virtual display
  • the first electronic device sends display data of the virtual display to the second electronic device.
  • the foregoing physical orientation may include a landscape orientation, an anti-landscape orientation, a portrait orientation, and an anti-portrait orientation.
  • in the embodiment of the present application, the first electronic device can adjust the display direction of the virtual display based on the physical orientation of the second electronic device, and then adjust the display direction of the application window on the virtual display to that of the virtual display. Therefore, when the second electronic device displays the display data of the virtual display, the display direction of the screen projection window rendered from that data is determined by the physical orientation of the second electronic device, which conforms to the user's viewing habits and greatly improves the user experience.
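The adjustment flow described above can be sketched as follows. All class and method names are illustrative assumptions; the patent describes behaviour, not an API.

```python
# Sketch of the orientation-following projection loop (illustrative names):
# the second device reports its physical orientation; the first device
# re-orients its virtual display and streams the re-rendered frame back.

class FirstDevice:
    def __init__(self):
        self.virtual_display_orientation = "portrait"

    def on_physical_orientation(self, orientation):
        # adopt the projected device's physical orientation for the
        # virtual display, render the application window accordingly,
        # and return the display data to be streamed out
        self.virtual_display_orientation = orientation
        return {"orientation": orientation, "pixels": "..."}

class SecondDevice:
    def rotate_to(self, first_device, orientation):
        # report the new physical orientation and display the frame
        return first_device.on_physical_orientation(orientation)

frame = SecondDevice().rotate_to(FirstDevice(), "landscape")
print(frame["orientation"])  # landscape
```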
  • the first electronic device determines the display direction of the virtual display based on the physical direction, including:
  • the first electronic device determines the physical direction as the display direction of the virtual display.
  • in this way, the display direction of the virtual display's data shown on the second electronic device matches the physical orientation of the second electronic device.
  • this direction conforms to the user's viewing habits and can improve the user's experience.
  • the first electronic device determines the display direction of the virtual display based on the physical direction, including:
  • the first electronic device obtains the display orientations supported by the application window;
  • when the display orientations supported by the application window include the physical orientation, the first electronic device determines the physical orientation as the display orientation of the virtual display.
  • the first electronic device determines the physical orientation as the display orientation of the virtual display, including:
  • the first electronic device determines the display orientation of the virtual display as the landscape orientation; or,
  • the first electronic device determines the display orientation of the virtual display as an anti-landscape orientation; or,
  • the first electronic device determines the display orientation of the virtual display as a portrait orientation; or,
  • the first electronic device determines the display orientation of the virtual display as the anti-portrait orientation.
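The selection among these four orientations can be sketched as a small decision function. The function name and the fallback choice are assumptions for illustration, consistent with the rule that the physical orientation is adopted only when the application window supports it.

```python
ORIENTATIONS = ("landscape", "anti-landscape", "portrait", "anti-portrait")

def virtual_display_orientation(physical, supported):
    """physical: orientation of the second (projected) device;
    supported: display orientations the application window supports."""
    if physical not in ORIENTATIONS:
        raise ValueError(f"unknown orientation: {physical}")
    if physical in supported:
        return physical   # the window supports it: follow the device
    return supported[0]   # otherwise keep an orientation the window supports

print(virtual_display_orientation("anti-landscape", ["portrait"]))  # portrait
```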
  • the application window is the first application window
  • the first electronic device determines the display direction of the virtual display based on the physical direction, including:
  • the first electronic device receives a first user operation sent by the second electronic device, where the first user operation is used to trigger display of the first application window;
  • the first electronic device acquires the display direction supported by the first application window based on the first user operation
  • when the display orientations supported by the first application window include the physical orientation, the first electronic device adjusts the display orientation of the current virtual display to the physical orientation;
  • when the display orientations supported by the first application window do not include the physical orientation, the first electronic device does not adjust the display orientation of the current virtual display.
  • the user can open the first application window on the second electronic device.
  • when sensing that the user opens the first application window, the second electronic device can send the user operation that triggers display of the first application window to the first electronic device; the first electronic device can then determine the display direction of the virtual display based on the display directions supported by the first application window and the current physical orientation of the second electronic device.
  • the display direction of the screen projection window can be adjusted, thereby improving user experience.
  • the method further includes:
  • the first electronic device displays the application window in non-full-screen mode on the virtual display.
  • in this way, the second electronic device may display the projected application window in non-full-screen mode, which avoids distortion of the projected content displayed on the second electronic device. Understandably, when the application window does not support the display orientation of the virtual display, displaying it in full screen would require stretching its display content, which may deform the content and affect the user's perception.
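A uniform-scale (letterbox) fit is one way to realize the non-full-screen display described above without stretching; the function below is an illustrative sketch, not taken from the patent.

```python
def fit_window(win_w, win_h, disp_w, disp_h):
    # scale uniformly so the window fits inside the display without
    # distortion, then centre it, leaving bars on the remaining edges
    scale = min(disp_w / win_w, disp_h / win_h)
    out_w, out_h = round(win_w * scale), round(win_h * scale)
    return out_w, out_h, (disp_w - out_w) // 2, (disp_h - out_h) // 2

# a portrait-only 1080x2340 window on a landscape 2560x1600 virtual display
print(fit_window(1080, 2340, 2560, 1600))  # (738, 1600, 911, 0)
```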
  • the embodiment of the present application provides a screen projection method, which is characterized in that it is applied to the second electronic device, and the method includes:
  • the second electronic device sends the physical direction of the second electronic device to the first electronic device, and the physical direction is used to determine the display direction of the virtual display of the first electronic device;
  • the second electronic device receives the display data sent by the first electronic device, the display data is the display data of the virtual display, and the display direction of the application window displayed on the virtual display is the same as the display direction of the virtual display;
  • the second electronic device displays the received display data.
  • sending the physical direction of the second electronic device to the first electronic device by the second electronic device includes:
  • the second electronic device obtains the physical direction of the second electronic device through the sensor
  • the second electronic device sends the physical direction to the first electronic device.
  • the change in the physical orientation of the second electronic device includes the second electronic device switching among four directions: landscape orientation, reverse landscape orientation, portrait orientation, and reverse portrait orientation.
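Obtaining one of these four directions from a sensor reading can be sketched by quantizing a rotation angle. The 45° bucket boundaries and the clockwise-from-portrait convention are assumptions; the patent only names the four states.

```python
def physical_orientation(angle_deg):
    # angle_deg: clockwise rotation from the upright portrait position
    a = angle_deg % 360
    if a < 45 or a >= 315:
        return "portrait"
    if a < 135:
        return "landscape"
    if a < 225:
        return "anti-portrait"
    return "anti-landscape"

print(physical_orientation(10))   # portrait (small tilts keep the state)
print(physical_orientation(180))  # anti-portrait
```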
  • sending the physical direction of the second electronic device to the first electronic device by the second electronic device includes:
  • after the second electronic device establishes a communication connection with the first electronic device, it displays prompt information and a first control, where the prompt information is used to prompt the user whether to use heterogeneous screen projection;
  • the second electronic device receives a user operation input for the first control, and the user operation is used to instruct the first electronic device to perform heterogeneous screen projection to the second electronic device;
  • the second electronic device sends a physical direction to the first electronic device in response to a user operation.
  • the display direction of the virtual display is a physical direction.
  • when the display orientations supported by the application window include the physical orientation, the display orientation of the virtual display is the physical orientation; or, when the display orientations supported by the application window do not include the physical orientation, the display orientation of the virtual display is a display orientation supported by the application window.
  • the present application provides an electronic device.
  • the electronic device can include memory and a processor.
  • memory can be used to store computer programs.
  • the processor can be used to call a computer program, so that the electronic device executes the first aspect or any possible implementation manner of the first aspect.
  • the present application provides an electronic device.
  • the electronic device can include memory and a processor.
  • memory can be used to store computer programs.
  • the processor can be used to call a computer program, so that the electronic device executes the second aspect or any possible implementation manner in the second aspect.
  • the present application provides a computer program product containing instructions; when the computer program product runs on an electronic device, the electronic device is caused to perform the method according to the first aspect or any possible implementation of the first aspect.
  • the present application provides a computer program product containing instructions; when the computer program product runs on an electronic device, the electronic device is caused to perform the method according to the second aspect or any possible implementation of the second aspect.
  • the present application provides a computer-readable storage medium, including instructions; when the instructions run on an electronic device, the electronic device is caused to perform the method according to the first aspect or any possible implementation of the first aspect.
  • the present application provides a computer-readable storage medium, including instructions; when the instructions run on an electronic device, the electronic device is caused to perform the method according to the second aspect or any possible implementation of the second aspect.
  • an embodiment of the present application provides a screen projection system; the screen projection system includes a first electronic device and a second electronic device, where the first electronic device is the electronic device described in the third aspect, and the second electronic device is the electronic device described in the fourth aspect.
  • the electronic devices provided in the third and fourth aspects, the computer program products provided in the fifth and sixth aspects, and the computer-readable storage media provided in the seventh and eighth aspects are all used to implement the methods provided in the present application.
  • FIG. 1 is a schematic diagram of image resources displayed by a first electronic device and a second electronic device according to an embodiment of the present application;
  • FIG. 2 is a schematic diagram of the principle of a heterogeneous screen projection technology provided in an embodiment of the present application
  • FIG. 3 is a schematic diagram of the physical orientation of an electronic device provided in an embodiment of the present application.
  • FIG. 4A and FIG. 4B are schematic diagrams of display directions of windows provided by the embodiments of the present application.
  • Fig. 5 is a schematic diagram of displaying image resources when the second electronic device switches screens according to an embodiment of the present application
  • FIG. 6 is a schematic structural diagram of a screen projection system provided by an embodiment of the present application.
  • FIG. 7A is a schematic diagram of a hardware structure of a first electronic device 100 provided in an embodiment of the present application.
  • FIG. 7B is a block diagram of the software structure of the first electronic device 100 provided by the embodiment of the present application.
  • FIG. 7C is a schematic diagram of a hardware structure of a second electronic device 200 provided by an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a screen projection method provided by an embodiment of the present application.
  • FIG. 9A to FIG. 9E are user interfaces for establishing a communication connection between the first electronic device and the second electronic device according to the embodiment of the present application.
  • FIG. 10 is a schematic diagram of creating a virtual display provided by an embodiment of the present application.
  • Fig. 11 is a schematic diagram of a full-screen display provided by an embodiment of the present application.
  • Fig. 12 is a schematic diagram of an application window provided by an embodiment of the present application.
  • FIG. 13A is a schematic diagram of a screen switching scene of a second electronic device provided by an embodiment of the present application.
  • Fig. 13B is a schematic diagram of adjusting the display direction of a virtual display provided by an embodiment of the present application.
  • Fig. 13C is a schematic diagram of a display interface of a virtual display provided by an embodiment of the present application.
  • Fig. 13D is a schematic diagram of a display interface of a second electronic device provided by an embodiment of the present application.
  • Fig. 14A is a schematic diagram of a scenario in which a user opens a new window provided by an embodiment of the present application
  • FIG. 14B is a schematic diagram of a display interface of another second electronic device provided by an embodiment of the present application.
  • "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • UI user interface
  • the term "user interface (UI)" in the following embodiments of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user, and it realizes the conversion between the internal form of information and a form acceptable to the user.
  • the user interface is the source code written in a specific computer language such as java and extensible markup language (XML).
  • the source code of the interface is parsed and rendered on the electronic device, and finally presented as content that can be recognized by the user.
  • the commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operation that is displayed in a graphical way. It may include text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visible interface elements displayed on the display screen of the electronic device.
  • screen projection technology: when performing screen projection between two electronic devices, different screen projection technologies may be used, including heterogeneous screen projection technology and homologous screen projection technology.
  • the first electronic device is also called a source device or a screen-casting device
  • the second electronic device is also called a target device or a projected device.
  • the electronic device that sends screen-casting content to other devices is called the first electronic device
  • the device that receives and displays the screen-casting content is called the second electronic device.
  • FIG. 1 illustrates a schematic diagram of image resources displayed by a first electronic device and a second electronic device in the above two screen projection technologies.
  • the first electronic device can project its display screen to the second electronic device, and a screen projection window on the second electronic device displays the screen content of the first electronic device.
  • the screen projection screen of the screen projection window on the second electronic device will also change accordingly.
  • the screen on the first electronic device will also change and refresh synchronously with the screen projection window of the second electronic device, which is homologous screen projection.
  • same-source projection means that the display content of the screen projection window on the target device (i.e., the second electronic device) is always consistent with the content displayed on the source device (i.e., the first electronic device) and changes synchronously with it.
  • Heterogeneous screen projection means that the display content on the first electronic device is inconsistent with the display content of the screen projection window of the second electronic device.
  • the content displayed on the second electronic device is projected from the first electronic device, while the first electronic device displays other content; that is,
  • the display content of the first electronic device is inconsistent with the display content of the screen projection window on the second electronic device, and the two are independent of each other.
  • when the display content on the first electronic device is operated, it does not trigger a refresh of the screen projection window content on the second electronic device.
  • likewise, when the content of the projection window on the second electronic device is triggered to refresh, the content displayed on the first electronic device does not change.
  • FIG. 2 exemplarily shows a schematic diagram of a principle of a heterogeneous screen projection technology.
  • the first electronic device has two displays (displays), which are respectively a main display and a virtual display.
  • the picture of the main display is visible on the first electronic device, and the picture displayed on the virtual display is invisible to the user of the first electronic device.
  • the first electronic device will create a virtual display for the second electronic device, and then mirror and project the display content of the virtual display to the second electronic device; that is, the display content of the virtual display is transmitted across devices to the second electronic device for display.
  • the virtual display in this embodiment of the present application is a virtual display that mirrors the displayed content onto the second electronic device.
  • the first electronic device may create two displays, which are respectively a main display and a virtual display corresponding to the first electronic device.
  • the display content of the main display is displayed on the display screen of the first electronic device
  • the display content of the virtual display is displayed on the display screen of the second electronic device, that is, the virtual display is invisible to the user of the first electronic device
  • the primary display is only visible to the user of the first electronic device.
  • the main display and the virtual display can run two tasks independently, that is to say, the main display and the virtual display can respectively run two independent applications.
  • the first electronic device transmits the display data of the application running on the virtual display to the second electronic device in real time, in the form of a data stream, through a cross-device channel.
  • after the screen projection application on the second electronic device receives the data stream, it displays it in the screen projection window on the display screen of the second electronic device. Therefore, what the user sees in the screen projection window on the second electronic device is the content displayed by the application on the virtual display of the first electronic device; the application actually runs on the first electronic device.
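The two-display principle above can be sketched as follows; the classes are illustrative and only model the data flow (independent main and virtual displays, with only the virtual display's frames crossing devices).

```python
class Display:
    def __init__(self, name):
        self.name, self.frames = name, []
    def render(self, content):
        self.frames.append(content)

class SourceDevice:            # the first electronic device
    def __init__(self):
        self.main = Display("main")        # visible locally
        self.virtual = Display("virtual")  # invisible locally, mirrored out
    def stream_to(self, target):
        # cross-device channel: only the virtual display's frames are sent
        target.projection_window.extend(self.virtual.frames)

class TargetDevice:            # the second electronic device
    def __init__(self):
        self.projection_window = []

src, dst = SourceDevice(), TargetDevice()
src.main.render("local app A")         # operating here never refreshes dst
src.virtual.render("projected app B")  # this is what the user sees remotely
src.stream_to(dst)
print(dst.projection_window)  # ['projected app B']
```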
  • the inventors of the present application found that during heterogeneous screen projection from the first electronic device to the second electronic device, the projected content on the second electronic device does not change when the second electronic device switches between landscape and portrait.
  • as a result, the user's viewing experience during screen switching is poor.
  • the physical orientation (also referred to as physical state or sensor orientation) of an electronic device is a physical attribute value that specifically reflects the user's holding state of the electronic device.
  • in the embodiments of the present application, the physical orientation of the electronic device refers to the physical orientation of the display screen of the electronic device;
  • for brevity, the physical orientation of the display screen of the electronic device is collectively referred to as the physical orientation of the electronic device.
  • the physical orientation of the electronic device can be divided into landscape orientation and portrait orientation.
  • the side of the display screen of the electronic device that is basically parallel to the horizontal direction (or at a small angle to it) may be called the wide side of the electronic device; the side of the display screen that is basically parallel to the vertical direction (or at a small angle to it)
  • may be called the high side of the electronic device.
  • the landscape orientation refers to a state in which the width of the electronic device is greater than the height; the portrait orientation refers to a state in which the height of the electronic device is greater than the width.
  • the electronic device when a user uses an electronic device, the electronic device may be in a landscape orientation or in a portrait orientation.
  • in the landscape orientation, the display screen of the electronic device is basically in the shape of a horizontal bar.
  • in the portrait orientation, the display screen of the electronic device is basically in the shape of a vertical bar.
  • the aspect ratio of the display screen of the electronic device differs between the landscape orientation and the portrait orientation.
  • the aspect ratio of the display screen can also be referred to as the aspect ratio of the display screen, which is the ratio of the height to the width of the display screen.
  • in the landscape orientation, the height of the display screen is the length of the short side of the display screen, and the width of the display screen is the length of the long side of the display screen.
  • in the portrait orientation, the height of the display screen is the length of the long side of the display screen,
  • and the width of the display screen is the length of the short side of the display screen.
  • the long sides of the display screen are the two longer sides that are parallel and equal to each other among the four sides of the display screen
  • the short sides of the display screen are the two shorter sides that are parallel and equal to each other among the four sides of the display screen.
  • FIG. 3 is a schematic diagram of a physical orientation of an electronic device provided in an embodiment of the present application.
  • when the electronic device is in the landscape orientation as shown in (a) in Figure 3, the height of the display screen is Y, the width of the display screen is X, and the aspect ratio of the display screen is Y/X, where Y/X < 1.
  • the electronic device when the electronic device is tilted or rotated at a small angle (for example, the angle is not greater than the first preset threshold, such as 20°, 15°, 5°, etc.), the electronic device is still considered to be in landscape orientation.
  • for example, starting from the landscape orientation shown in (a) in Figure 3, if the electronic device rotates clockwise by an angle α that is not greater than the first preset threshold, reaching the state shown in (b) in Figure 3, the electronic device still regards the state shown in (b) in FIG. 3 as a landscape orientation.
  • when the electronic device is in the portrait orientation shown in (c) in Figure 3, the height of the display screen is X, the width of the display screen is Y, and the aspect ratio of the display screen is X/Y, where X/Y > 1.
  • when the electronic device is tilted or rotated by a small angle (for example, an angle not greater than the second preset threshold, such as 20°, 15°, or 5°), the electronic device is still considered to be in portrait orientation.
  • for example, starting from the portrait orientation shown in (c) in Figure 3, if the electronic device rotates counterclockwise by an angle β that is not greater than the second preset threshold, reaching the state shown in (d) in Figure 3, the electronic device regards the state shown in (d) in FIG. 3 as a portrait orientation.
  • the first preset threshold and the second preset threshold may be the same or different, and may be set according to actual needs, which is not limited thereto.
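The width/height definition of landscape versus portrait, together with the small-tilt tolerance described above, can be sketched as follows; the function names are illustrative, while the example threshold values (20°, 15°, 5°) come from the text.

```python
def orientation_from_size(width, height):
    # landscape: width greater than height; portrait: height greater than width
    return "landscape" if width > height else "portrait"

def keeps_orientation(tilt_deg, threshold_deg=20):
    # a tilt no greater than the preset threshold (e.g. 20°, 15°, 5°)
    # does not change the device's physical orientation
    return abs(tilt_deg) <= threshold_deg

print(orientation_from_size(2340, 1080))  # landscape
print(keeps_orientation(15))              # True
print(keeps_orientation(30))              # False
```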
  • the physical orientation of the electronic device can also be divided into four physical orientations, which are landscape orientation, reverse landscape orientation, portrait orientation and reverse portrait orientation.
  • for example, the anti-portrait orientation may be the portrait orientation rotated 180 degrees clockwise,
  • and the anti-landscape orientation may be the landscape orientation rotated 180 degrees clockwise.
  • which state is determined as the landscape orientation and which as the anti-landscape orientation may be decided according to the user's habit or the position of the camera, etc., and is not limited here.
  • for example, the buttons are below the display of the electronic device shown in (c) in FIG. 3, so the physical orientation of the electronic device shown in (c) in FIG. 3 is determined as the portrait orientation.
  • the change of the physical orientation of the electronic device may mean that the electronic device switches between the landscape orientation and the portrait orientation, or that the electronic device rotates by a certain angle while in a landscape or portrait orientation; for details, refer to the related examples below, which are not limited here.
  • the screen display modes of the electronic device include a landscape display mode and a portrait display mode, which determine the direction in which content is displayed on the screen.
  • for example, when the display direction of the current screen content is consistent with the vertical direction of the long side of the electronic device, the current screen display mode is the portrait display mode, and displayed characters, user interfaces, and other content are also displayed in that direction.
  • when the current screen display mode is the landscape display mode, the display direction of the current screen content is consistent with the vertical direction of the short side of the electronic device, and fonts and user interfaces displayed on the screen are displayed in that direction as well.
  • in this way, the electronic device can display the interface in the landscape display mode when in the landscape orientation, and in the portrait display mode when in the portrait orientation.
  • the display direction of the window refers to the display direction of the content displayed in the window.
  • when the display direction of the content displayed in the window is consistent with the vertical direction of the long side of the display screen of the electronic device, the display direction of the window is the portrait direction, as shown in (A) in FIG. 4A;
  • when the display direction of the content displayed in the window is consistent with the vertical direction of the short side of the display screen of the electronic device, the display direction of the window is the landscape direction, as shown in (B) in FIG. 4A.
  • the display orientation of the window can also be divided into landscape orientation, reverse landscape orientation, portrait orientation and reverse portrait orientation.
  • the vertical screen display direction can be rotated 180 degrees clockwise to obtain the anti-vertical orientation shown in (B) in FIG. 4B . screen direction; if it is determined that the display direction of the electronic device shown in (C) in Figure 4B is a horizontal screen direction, then the vertical screen display direction can be rotated 180 degrees clockwise to obtain the reverse direction shown in (B) in Figure 4B Landscape orientation.
  • the portrait orientation may include the portrait orientation as shown in (A) in FIG. 4B and the reverse portrait orientation as shown in (B) in FIG. 4B.
  • the landscape orientation may include the landscape orientation as shown in (C) in FIG. 4B and the reverse landscape orientation as shown in (D) in FIG. 4B.
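The four orientations above and their relationship under clockwise rotation can be sketched as follows. This is an illustrative sketch only: the panel letters follow FIG. 4B as described, while the numeric angle encoding and function name are assumptions.

```python
from enum import Enum

class Orientation(Enum):
    """The four window display orientations described for FIG. 4B,
    encoded as clockwise angles from the portrait direction (an assumption)."""
    PORTRAIT = 0             # (A) in FIG. 4B
    LANDSCAPE = 90           # (C): portrait rotated 90 degrees clockwise
    REVERSE_PORTRAIT = 180   # (B): portrait rotated 180 degrees clockwise
    REVERSE_LANDSCAPE = 270  # (D): landscape rotated 180 degrees clockwise

def rotate_clockwise(orientation: Orientation, degrees: int) -> Orientation:
    """Rotate a display orientation clockwise by a multiple of 90 degrees."""
    return Orientation((orientation.value + degrees) % 360)
```

With this encoding, rotating the portrait direction 180 degrees clockwise yields the reverse portrait direction, matching the description above.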
  • FIG. 5 exemplarily shows a schematic diagram of image resources displayed by the second electronic device when the second electronic device switches screens in the existing heterogeneous screen projection technology.
  • the display content of the second electronic device is a landscape display, which conforms to the viewing angle of the user.
  • the user holds the second electronic device in the landscape orientation
  • the display content of the second electronic device can be viewed normally.
  • when the second electronic device is rotated from the physical landscape orientation to the physical portrait orientation, the display content on the second electronic device does not rotate along with the second electronic device.
  • the display content of the second electronic device is still displayed in landscape, which does not match the user's viewing angle, so the user's visual experience is poor.
  • the following embodiments of the present application provide a screen projection method.
  • the first electronic device such as a mobile phone and a tablet and the second electronic device share image resources using the heterogeneous screen projection technology.
  • the second electronic device sends the physical direction of the second electronic device to the first electronic device, so that the first electronic device adjusts the display direction of the screen projection window on the second electronic device based on that physical direction; for details, refer to the relevant descriptions of the subsequent embodiments, which are not repeated here.
  • the display direction of the screen projection window on the second electronic device can be automatically adjusted during the screen projection process, thereby improving the user's visual experience during the screen projection process.
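The adjustment described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the message format, angle encoding, and function names are all assumptions.

```python
# Clockwise angles assigned to the four directions (an assumed encoding).
ANGLES = {"portrait": 0, "landscape": 90,
          "reverse_portrait": 180, "reverse_landscape": 270}

def rotation_to_apply(content_orientation: str,
                      sink_physical_orientation: str) -> int:
    """Degrees (clockwise) the first device should rotate the projected
    content so its display direction matches the second device's
    reported physical direction."""
    return (ANGLES[sink_physical_orientation]
            - ANGLES[content_orientation]) % 360

def on_sink_orientation_message(message: dict, content_orientation: str) -> int:
    """Handle a hypothetical orientation report sent by the second device."""
    assert message.get("type") == "orientation"
    return rotation_to_apply(content_orientation, message["value"])
```

For example, if the projected content is laid out in landscape but the second device is now held in physical portrait, the source would rotate the content by 270 degrees clockwise under this encoding.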
  • screen projection refers to the technology in which the first electronic device (such as a mobile phone, a tablet, etc.) directly transmits an image resource point-to-point to a second electronic device (such as a TV, a smart screen, etc.), and the second electronic device displays the image resource.
  • the first electronic device may also be called a sending end/source end (source end)
  • the second electronic device may also be called a receiving end (sink end).
  • the communication connection established between the first electronic device and the second electronic device may include but is not limited to: a wireless fidelity direct (Wi-Fi direct, also known as wireless fidelity peer-to-peer, Wi-Fi P2P) communication connection, a Bluetooth communication connection, a near field communication (NFC) connection, etc.
  • "screen projection technology" is merely a term used in this embodiment; its meaning has been described in this embodiment, and the name itself does not constitute any limitation on this embodiment.
  • the screen projection technology may also be called other terms such as multi-screen interaction, full sharing screen projection, wireless display, and the like.
  • the image resources shared between the first electronic device and the second electronic device may include any one or a combination of multiple items of the following: video, text, picture, photo, audio or form, and so on.
  • image resources can be movies, TV dramas, short videos, musicals, and so on.
  • the image resources shared between the first electronic device and the second electronic device may be network image resources, local image resources, or a combination of network image resources and local image resources.
  • the network image resource refers to the image resource obtained by the first electronic device from the network, for example, the video obtained by the first electronic device from a server providing video service when running a video application program.
  • Local image resources refer to image resources locally stored or generated by the first electronic device, such as pictures or tables locally stored by the first electronic device.
  • the screen projection system 10 may include a first electronic device 100 and a second electronic device 200 .
  • the first electronic device 100 can establish a wireless connection with the second electronic device 200 through a wireless communication method (for example, wireless fidelity (wireless fidelity, Wi-Fi), Bluetooth, etc.).
  • the first electronic device 100 may transmit file data to the second electronic device 200 through a wireless connection, or the first electronic device 100 may project an application interface to the second electronic device 200 for display and the like.
  • Real Time Streaming Protocol can be used to control the transmission of real-time data.
  • RTSP is a multimedia streaming protocol used to control audio or video, and allows multiple simultaneous streams to be controlled on demand.
  • the first electronic device 100 can control the transmission of the data stream through RTSP.
  • the first electronic device 100 may multiplex compressed H.264-format video and advanced audio coding (AAC) format audio into a transport stream (TS) file, send the TS file to the second electronic device 200 through Wi-Fi P2P using the RTSP protocol, and the second electronic device 200 receives the data from the first electronic device 100 through the RTSP protocol.
  • H.264 is a video codec protocol
  • AAC is an audio codec protocol.
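The idea of multiplexing timestamped video and audio access units into a single stream can be illustrated with a toy framing scheme. This is not real MPEG-TS packetization (which uses fixed 188-byte packets, PIDs, and PES headers); the header layout and stream IDs below are purely hypothetical, chosen to show the interleaving concept.

```python
import struct

VIDEO, AUDIO = 0x01, 0x02  # hypothetical stream IDs, not real MPEG-TS PIDs

def mux_packet(stream_id: int, pts_ms: int, payload: bytes) -> bytes:
    """Pack one access unit: 1-byte stream id, 4-byte timestamp (ms),
    2-byte payload length, then the payload itself."""
    return struct.pack(">BIH", stream_id, pts_ms, len(payload)) + payload

def demux_packet(data: bytes, offset: int = 0):
    """Read one packet back; returns (stream_id, pts_ms, payload, next_offset)."""
    stream_id, pts_ms, length = struct.unpack_from(">BIH", data, offset)
    start = offset + 7  # header is 1 + 4 + 2 = 7 bytes
    return stream_id, pts_ms, data[start:start + length], start + length
```

A sender would alternate `mux_packet` calls for encoded H.264 frames and AAC frames in presentation order, and the receiver would walk the stream with `demux_packet`, routing each payload to the matching decoder.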
  • the first electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, as well as a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, an in-vehicle device, a smart home device and/or a smart city device, etc.
  • the second electronic device 200 can be a tablet, a monitor, a television, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer, a netbook, an augmented reality device, a virtual reality device, an artificial intelligence device, a vehicle-mounted device, a smart home device, and more.
  • the screen projection system 10 may further include: a Wi-Fi access point 300 and a server 400 .
  • the first electronic device 100 can access the Wi-Fi access point 300 .
  • the server 400 can provide network audio and video services.
  • the server 400 may be a server storing various image resources, such as a Huawei video server providing audio and video services.
  • the number of servers 400 may be one or more.
  • FIG. 7A shows a schematic diagram of the hardware structure of the first electronic device 100 .
  • electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, and an antenna 2.
  • a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can fetch them directly from this memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
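The cache behavior described above, keeping recently used items close so a repeat access skips the slow path, can be sketched with a toy least-recently-used cache. The class name, capacity, and counters are illustrative assumptions; a real CPU cache works at the hardware level, not in software.

```python
from collections import OrderedDict

class SmallCache:
    """Toy LRU cache: repeat accesses hit the fast store instead of
    going back to slow memory."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries = OrderedDict()
        self.hits = 0
        self.misses = 0

    def load(self, address, slow_fetch):
        if address in self.entries:
            self.entries.move_to_end(address)  # mark as most recently used
            self.hits += 1
            return self.entries[address]
        self.misses += 1
        value = slow_fetch(address)            # the expensive access
        self.entries[address] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)   # evict least recently used
        return value
```

Loading the same address twice triggers the slow fetch only once, which is exactly the waiting-time saving the passage attributes to the processor's cache.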
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the SIM interface can be used to communicate with the SIM card interface 195 to realize the function of transmitting data to the SIM card or reading data in the SIM card.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
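The exact color conversion used by the DSP is not specified here; as one common example, a full-range BT.601 YUV sample can be converted to RGB as sketched below. The coefficients are the standard BT.601 ones; the function name and clamping are illustrative.

```python
def _clamp(c: float) -> int:
    """Clamp a channel value to the displayable 0-255 range."""
    return max(0, min(255, round(c)))

def yuv_to_rgb(y: float, u: float, v: float) -> tuple:
    """Convert one full-range BT.601 YUV sample (each 0-255) to an RGB triple."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return _clamp(r), _clamp(g), _clamp(b)
```

A neutral sample (U = V = 128) carries no chroma, so it converts to a gray pixel whose three channels all equal the luma value.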
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data. Wherein, the stored program area can store an operating system, at least one application required by a function (such as a face recognition function, a fingerprint recognition function, a mobile payment function, etc.) and the like.
  • the data storage area can store data created during use of the electronic device 100 (such as face information template data, fingerprint information template, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • the microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can put his mouth close to the microphone 170C to make a sound, inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In some other embodiments, the electronic device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may comprise at least two parallel plates made of conductive material.
  • the electronic device 100 determines the intensity of pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
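The intensity-dependent dispatch in the short message example can be sketched as a simple threshold comparison. The threshold value, units, and action names here are hypothetical, not values from the patent.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized touch intensity

def sms_icon_action(touch_intensity: float) -> str:
    """Map touch intensity on the short message icon to an instruction,
    per the example above: a light press views messages, a press at or
    above the first pressure threshold composes a new one."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"
    return "new_message"
```

Note the boundary matches the text: an intensity exactly equal to the first pressure threshold triggers the new-message instruction.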
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • When the electronic device 100 is a clamshell phone, the electronic device 100 can detect the opening and closing of the clamshell according to the magnetic sensor 180D.
  • Features such as automatic unlocking of the flip cover can then be configured accordingly.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
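The reflected-light decision above can be sketched as a simple threshold test; the threshold value and its unit are hypothetical:

```java
// Sketch of the proximity decision: "sufficient" reflected infrared light
// means an object is near the device, "insufficient" means there is none.
public class ProximityDetector {
    static final int REFLECTED_LIGHT_THRESHOLD = 100; // hypothetical sensor counts

    public static boolean objectNearby(int reflectedLight) {
        return reflectedLight >= REFLECTED_LIGHT_THRESHOLD;
    }
}
```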
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the electronic device 100 may reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • In some embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from shutting down abnormally due to the low temperature.
  • In some other embodiments, when the temperature is even lower, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by the low temperature.
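The temperature processing strategy above (throttle the nearby processor when hot, heat the battery when cold, boost the battery output voltage when colder still) can be sketched as follows; the threshold values and action names are hypothetical:

```java
// Illustrative thermal policy with three hypothetical thresholds (Celsius).
public class ThermalPolicy {
    static final int HIGH_TEMP = 45;      // above this: reduce processor performance
    static final int LOW_TEMP = 0;        // below this: heat the battery
    static final int VERY_LOW_TEMP = -10; // below this: boost battery output voltage

    public static String actionFor(int tempC) {
        if (tempC > HIGH_TEMP)     return "THROTTLE_CPU";
        if (tempC < VERY_LOW_TEMP) return "BOOST_BATTERY_VOLTAGE";
        if (tempC < LOW_TEMP)      return "HEAT_BATTERY";
        return "NORMAL";
    }
}
```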
  • The touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Touch operations in different application scenarios (for example, time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • The indicator 192 can be an indicator light, and can be used to indicate charging status and battery level changes, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • The SIM card can be connected to or separated from the electronic device 100 by inserting it into, or pulling it out of, the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 may execute the screen projection method through the processor 110 .
  • FIG. 7B is a block diagram of the software structure of the first electronic device 100 provided by the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • The Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and screen projection management.
  • Through the screen projection management application, the user can establish communication with other devices and then project image resources onto the interface of the device with which the communication connection is established.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • The application framework layer may include a display (display) manager, a sensor (sensor) manager, a cross-device connection manager, an event manager, a task (activity) manager, a window manager, a content provider, a view system, a resource manager, a notification manager, and the like.
  • the display manager is used for the display management of the system and is responsible for the management of all display-related transactions, including creation, destruction, direction switching, size and state changes, etc.
  • There is only one default display module on a single device, namely the main display module.
  • The first electronic device 100 may create multiple virtual display modules; heterogeneous-source screen projection creates a virtual display module to carry the display of an application and uses it for screen projection.
  • the first electronic device may adjust the display direction of the application window on the virtual display to the display direction of the virtual display through the display manager.
  • For the specific adjustment process, please refer to the related content below.
  • The sensor manager is responsible for sensor state management; it manages the sensor events that applications listen for and reports those events to the applications in real time.
  • the cross-device connection manager is used to establish a communication connection with the second electronic device 200, and send image resources to the second electronic device 200 based on the communication connection.
  • The event manager provides the event management service of the system; it is responsible for receiving events uploaded by the bottom layer and distributing them to each window.
  • the task manager is used for the management of task (Activity) components, including startup management, life cycle management, task direction management, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the window manager is also responsible for window display management, including window display mode, display size, display coordinate position, display level and other related management.
  • In some embodiments, the sensor manager sends the physical orientation of the second electronic device to the display manager, and the display manager adjusts the display orientation of the virtual display to the physical orientation of the second electronic device.
  • the display direction of the application window is adjusted to the display direction of the virtual display.
  • In other embodiments, the sensor manager sends the physical orientation of the second electronic device to the display manager; the display manager obtains the display orientations supported by the application window on the virtual display from the task manager, and then determines the display orientation of the virtual display based on the physical orientation of the second electronic device and the display orientations supported by the application window.
  • the window manager adjusts the display orientation of the application window on the virtual display to the display orientation of the virtual display.
  • When receiving a user instruction to open a new application window, the task manager notifies the window manager, and the window manager adjusts the display direction of the application window to the display direction of the current virtual display.
  • When the task manager determines that the user has opened a new application window that does not support the display orientation of the current virtual display, it notifies the display manager to adjust the display orientation of the virtual display to a display orientation supported by the application window; the window manager then adjusts the display direction of the application window on the virtual display to the display direction of the virtual display.
  • The application program can dynamically apply to the task manager for a landscape or portrait display direction; the task manager can then notify the display manager to adjust the display direction of the virtual display. When the display manager determines that the current display direction of the virtual display is not the direction dynamically requested by the application program, it adjusts the display direction of the virtual display to the requested direction; finally, the window manager adjusts the display direction of the application window on the virtual display to the display direction of the virtual display.
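The decision flow in the paragraphs above — follow the second device's physical orientation when the application window supports it, otherwise fall back to an orientation the window does support — can be sketched as follows. The type names and the fallback policy (first supported orientation) are illustrative assumptions, not part of this application:

```java
import java.util.EnumSet;

// Sketch of how the display manager might pick the virtual display's
// orientation from the second device's physical orientation and the
// orientations the application window reports as supported.
public class VirtualDisplayOrientation {
    public enum Orientation { PORTRAIT, LANDSCAPE }

    public static Orientation decide(Orientation physical,
                                     EnumSet<Orientation> supportedByWindow) {
        if (supportedByWindow.contains(physical)) {
            return physical;                       // follow the device's posture
        }
        return supportedByWindow.iterator().next(); // fall back to a supported one
    }
}
```

In the flows described above, the chosen orientation would then be applied to the virtual display by the display manager, after which the window manager aligns the application window with it.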
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, and the notification can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • The notification manager can also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window.
  • For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • The Android runtime includes the core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library may include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL) and event data, etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the system library is specifically used for data management in a cross-device scenario.
  • the first electronic device 100 sends the layer data, audio and video data in the system library to the second electronic device 200, and the second electronic device 200 sends the event data to the first electronic device 100 to complete the event process.
  • The second electronic device 200 transmits the state of the sensor data to the first electronic device 100 in real time; only then can the first electronic device 100 correctly control the display of its virtual display module according to the sensor state of the second electronic device 200.
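A minimal sketch of the sensor state that the second electronic device might report in real time is shown below. This application does not define a wire format, so the field names and the rotation encoding are hypothetical:

```java
// Hypothetical sensor-state report sent from the second device to the first.
public class SensorStateMessage {
    public final long timestampMs;     // when the reading was taken
    public final int rotationDegrees;  // physical orientation: 0, 90, 180, or 270

    public SensorStateMessage(long timestampMs, int rotationDegrees) {
        this.timestampMs = timestampMs;
        this.rotationDegrees = rotationDegrees;
    }

    // The first device can use this to pick a landscape or portrait
    // orientation for the virtual display.
    public boolean isLandscape() {
        return rotationDegrees == 90 || rotationDegrees == 270;
    }
}
```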
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • the kernel layer provides capabilities such as device discovery, device authentication, and device connection, and reports events such as device discovery and departure.
  • the second electronic device 200 may include: a processor 222 , a memory 223 , a wireless communication module 224 , a power switch 225 , a display screen 229 , and an audio module 230 .
  • the second electronic device 200 may further include a wired LAN communication processing module 226, a high definition multimedia interface (high definition multimedia interface, HDMI) communication processing module 227, a USB communication processing module 228, and the like.
  • The above modules can be connected through a bus. Specifically:
  • Processor 222 may be used to read and execute computer readable instructions.
  • the processor 222 may mainly include a controller, an arithmetic unit, and a register.
  • the controller is mainly responsible for instruction decoding, and sends out control signals for the operations corresponding to the instructions.
  • the arithmetic unit is mainly responsible for performing fixed-point or floating-point arithmetic operations, shift operations, logic operations, etc., and can also perform address operations and conversions.
  • The register is mainly responsible for temporarily storing register operands and intermediate operation results during instruction execution.
  • the hardware architecture of the processor 222 may be an application specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, or an NP architecture, and so on.
  • The processor 222 can be used to analyze signals received by the wireless communication module 224, such as a new URL sent by the first electronic device 100, and obtain multiple videos in the playlist and associated videos according to the new URL.
  • the wireless communication module 224 may include a WLAN communication processing module.
  • the wireless communication module 224 may further include a Bluetooth (BT) communication processing module, an NFC processing module, a cellular mobile communication processing module (not shown) and the like.
  • the wireless communication module 224 can be used to establish a communication connection with the first electronic device 100 .
  • The communication connection established between the wireless communication module 224 and the first electronic device 100 may be of various types.
  • For example, the WLAN communication processing module can be used to establish a Wi-Fi direct communication connection with the first electronic device 100; the Bluetooth (BT) communication processing module can be used to establish a Bluetooth communication connection with the first electronic device 100; and the NFC processing module can be used to establish an NFC connection with the first electronic device 100, and so on.
  • the wireless communication module 224 may also be configured to establish a communication connection with the first electronic device 100, and receive the video stream sent by the first electronic device 100 based on the communication connection.
  • the communication connection established by the wireless communication module 224 and the first electronic device 100 can perform data transmission based on the HTTP protocol, and this application does not impose any restrictions on the communication connection type and data transmission protocol between devices.
  • the memory 223 is coupled with the processor 222 for storing various software programs and/or sets of instructions.
  • the memory 223 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices or other non-volatile solid-state storage devices.
  • the memory 223 can store operating systems, such as embedded operating systems such as uCOS, VxWorks, and RTLinux.
  • the memory 223 can also store a communication program, which can be used to communicate with the first electronic device 100, one or more servers, or additional devices.
  • the power switch 225 can be used to control power supply to the second electronic device 200 .
  • the wired LAN communication processing module 226 can be used to communicate with other devices in the same LAN through the wired LAN, and can also be used to connect to the WAN through the wired LAN to communicate with devices in the WAN.
  • the HDMI communication processing module 227 can be used to communicate with other devices through an HDMI interface (not shown).
  • the USB communication processing module 228 can be used to communicate with other devices through a USB interface (not shown).
  • The display screen 229 can be used to display projected pages, videos, and the like.
  • the display screen 229 may adopt LCD, OLED, AMOLED, FLED, QLED and other display screens.
  • For the content displayed on the display screen 229, reference may be made to the related descriptions of the subsequent method embodiments.
  • The display screen 229 can realize continuous playback of multiple videos after the wireless communication module 224 receives the video streams of the multiple videos, such as the playlist and associated videos sent by the server 300.
  • the audio module 230 can be used to output audio signals through the audio output interface, so that the large-screen display device 200 can support audio playback.
  • the audio module 230 is also used to receive audio data through an audio input interface.
  • the audio module 230 may include, but is not limited to: a microphone, a speaker, a receiver, and the like.
  • the second electronic device 200 may also include a serial interface such as an RS-232 interface.
  • the serial interface can be connected to other devices, such as audio speakers and other external audio devices, so that the display and audio external devices cooperate to play audio and video.
  • the structure shown in FIG. 3 does not constitute a specific limitation on the second electronic device 200 .
  • the second electronic device 200 may include more or fewer components than shown in the illustration, or combine some components, or separate some components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the second electronic device 200 may include the hardware included in the first electronic device 100 shown in FIG. 1 above.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the software system of the second electronic device 200 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture and the like.
  • The software system of the second electronic device 200 may include, but is not limited to, Linux or another operating system, for example Huawei's HarmonyOS (Hongmeng) system.
  • Take as an example that the software system of the second electronic device 200 is an Android system; it is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • The application layer can include a screen projection management application for device connection and screen projection.
  • The application framework layer can include a cross-device connection manager, an event manager, a window manager, a display manager, and the like.
  • The system library can include the media library, event data, and the like.
  • The kernel layer is used for device discovery, device authentication, device connection, and so on.
  • FIG. 8 exemplarily shows the flow of the screen projection method provided by the embodiment of the present application.
  • the screen projection method may include some or all of the following steps:
  • the first electronic device establishes a communication connection with the second electronic device.
  • the communication connection established between the first electronic device and the second electronic device may include but not limited to: Wi-Fi P2P communication connection, Bluetooth communication connection, NFC connection and so on.
  • FIGS. 9A-9E exemplarily show user interfaces for establishing a communication connection between the first electronic device and the second electronic device.
  • FIG. 9A exemplarily shows a user interface 910 on the first electronic device for displaying application programs installed on the first electronic device.
  • The user interface 910 may include: a status bar 911, a page indicator 912, a tray 913 with frequently used application icons, a navigation bar 914, and a number of other application icons. Specifically:
  • The status bar 911 may include: one or more signal strength indicators (such as signal strength indicator 911A and signal strength indicator 911B) of a mobile communication signal (also referred to as a cellular signal), one or more signal strength indicators 911C of a wireless fidelity (Wi-Fi) signal, a battery status indicator 911E, and a time indicator 911F.
  • An indicator 911D may also be displayed in the status bar 911; the indicator 911D may be used to indicate that the first electronic device has successfully turned on Bluetooth.
  • the page indicator 912 can be used to indicate the positional relationship between the currently displayed page and other pages.
  • the tray 913 with commonly used application icons may include multiple tray icons (such as camera application icons, address book application icons, phone application icons, and message application icons), and the tray icons are kept displayed when the page is switched.
  • the above tray icon is optional, which is not limited in this embodiment of the application.
  • The other application icons may be multiple application icons, for example, a gallery application icon 915, a Huawei video application icon, a Huawei wallet application icon, a sound recorder application icon, a settings application icon, and the like. Icons of other application programs may be distributed across multiple pages, and the page indicator 912 may also be used to indicate which page of applications the user is currently browsing. The user can slide left and right over the area of the other application icons to browse the application icons on other pages.
  • the navigation bar 914 may include system navigation keys such as a back key, a home screen key, a multitasking key, and the like.
  • the user interface 910 shown in FIG. 9A may be a home screen. It can be understood that FIG. 9A only exemplarily shows a user interface of the first electronic device, which should not limit the embodiment of the present application.
  • When the first electronic device detects a downward sliding gesture on the display screen, in response to the sliding gesture, the first electronic device may display a window 916 on the user interface 910.
  • a control 916A may be displayed in the window 916, and the control 916A may receive an operation (such as a touch operation, a click operation) of turning on/off the screen projection function of the electronic device.
  • the expression form of the control 916A may include an icon and/or text (for example, the text "wireless screen projection", “screen projection", “multi-screen interaction", etc.).
  • Window 916 may also display switch controls for other functions such as Wi-Fi, Bluetooth, flashlight, and the like.
  • the first electronic device may detect a user operation on the control 916A for enabling the screen projection function.
  • the first electronic device may change the display form of the control 916A, such as adding a shadow when the control 916A is displayed.
  • the user may also input a gesture of sliding down on other interfaces to trigger the first electronic device to display the window 916 .
  • The first electronic device may also display a setting interface provided by a settings application, and the screen projection function of the first electronic device can be enabled through a user operation input on a corresponding control in that interface.
  • In response to the user operation for enabling the screen projection function detected in FIG. 9B, the first electronic device enables one or more of WLAN, Bluetooth, or NFC in the wireless communication module 160, and may discover, through one or more wireless communication technologies among Wi-Fi direct connection, Bluetooth, and NFC, electronic devices near the first electronic device that are capable of screen projection.
  • the first electronic device may scan probe requests (such as probe request frames) broadcast by other nearby devices through the Wi-Fi direct technology, and discover the nearby second electronic device and other electronic devices.
  • the user interface 910 displayed by the first electronic device includes a window 917.
  • The window 917 may display: an interface indicator 917A, an icon 917B, and an image 917C and a logo 917D of an electronic device discovered by the first electronic device.
  • the interface indicator 917A is used to indicate that the content displayed in the window 917 is the information of the electronic device found after the screen projection function is turned on.
  • the icon 917B is used to indicate that the first electronic device is still discovering other electronic devices around. When the first electronic device has not found any nearby electronic devices, the number of electronic devices displayed in the window 917 is 0.
  • Both the image 917C and the identification 917D of the electronic device may be carried in a probe request (such as a probe request frame) broadcast by the electronic device.
  • the image 917C and/or the logo 917D may receive a user operation (such as a touch operation), and in response to the user operation, the first electronic device may initiate a communication connection establishment request to the electronic device corresponding to the image 917C and/or the logo 917D.
  • FIG. 9D exemplarily shows a user interface 920 displayed on the display screen after the second electronic device receives a request for establishing a communication connection from the first electronic device.
  • the user interface 920 includes a window 921 .
  • the window 921 is used to prompt the user that the first electronic device requests to establish a communication connection.
  • the second electronic device may detect an operation of agreeing to establish a screen projection communication connection with the first electronic device, for example, the user clicks the control 921A on the remote control as shown in FIG. 9D; in response to this operation, the second electronic device establishes a communication connection for screen projection with the first electronic device.
  • the second electronic device may detect an operation of refusing to establish a screen projection communication connection with the first electronic device, for example, the user clicks the control 921B on the remote control as shown in FIG. 9D; in response to this operation, no communication connection for screen projection is established with the first electronic device.
  • the communication connection can also be established in other ways, which is not limited here.
  • the second electronic device can provide an NFC tag, which carries the identifier (such as a MAC address) of the second electronic device; when the user brings the first electronic device close to the second electronic device, the first electronic device can read the identifier in the NFC tag and directly establish a communication connection with the second electronic device corresponding to the identifier.
  • the communication connection may also be a Wi-Fi P2P connection, a Bluetooth connection, an NFC connection, and the like.
  • the first electronic device may also display a user interface 910 as shown in FIG. 9E .
  • user interface 910 includes a prompt 918A, a control 918B, and a control 918C.
  • the prompt information 918A is used to prompt the user to perform screen projection from a different source.
  • the first electronic device may detect an operation in which the user chooses to use heterogeneous screen projection, for example, the user clicks the control 918B shown in FIG. 9E; in response to this operation, the first electronic device executes the subsequent steps up to S109; specifically, refer to the detailed description below.
  • the first electronic device may detect an operation in which the user does not select heterogeneous screen projection, for example, the user clicks the control 918C shown in FIG. 9E; in response to this operation, the first electronic device continues to share image resources with the second electronic device by means of same-source screen projection.
  • the prompt information 918A may also be expressed in other forms.
  • the prompt information 918A may further explain that, during heterogeneous screen projection, the interface displayed on the projecting device may be inconsistent with the content displayed in the screen projection window. Therefore, this embodiment of the present application does not impose any limitation on the prompt information 918A in the window 918 shown in FIG. 9E.
  • the user can independently choose whether to use heterogeneous screen projection for screen projection according to needs, which can improve user experience.
  • the second electronic device acquires sensor data of the second electronic device.
  • the second electronic device may obtain sensor data through a gyroscope sensor or a gravity sensor.
  • the second electronic device may collect sensor data of the second electronic device when receiving a user operation, or may collect sensor data of the second electronic device in real time.
  • the second electronic device sends a first message to the first electronic device based on the communication connection, where the first message includes sensor data of the second electronic device.
  • the second electronic device may send the sensor data of the second electronic device or the physical direction of the second electronic device to the first electronic device based on a user operation. For example, after step S101, when the user touches the control 921A of the user interface 920 shown in FIG. 9D, the second electronic device sends the first message to the first electronic device. Correspondingly, the first electronic device receives the first message from the second electronic device.
  • the second electronic device when determining that the physical orientation of the second electronic device changes, sends sensor data of the second electronic device or the physical orientation of the second electronic device to the first electronic device.
  • the second electronic device can obtain its own sensor data in real time, determine the physical orientation of the second electronic device based on the sensor data, and generate a first message when determining that its physical orientation changes, where the first message includes the sensor data of the second electronic device. For example, when the user rotates the second electronic device from a landscape orientation to a portrait orientation while using it, the second electronic device senses that its physical orientation has changed and sends its sensor data to the first electronic device.
  • the first message includes the physical direction of the second electronic device; that is to say, the second electronic device can determine its physical direction according to its sensor data, and then send its physical direction to the first electronic device.
  • the first electronic device determines the physical direction of the second electronic device based on the sensor data of the second electronic device.
  • the first electronic device may determine the physical orientation of the second electronic device according to sensor data of the second electronic device.
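The determination in this step can be sketched in a few lines. This is a minimal illustration rather than the embodiment's implementation: the function name, the screen-axis convention, and the use of two gravity components are all assumptions.

```python
# Hypothetical sketch of classifying a device's physical orientation from
# gravity-sensor readings: gx and gy are the gravity components along the
# device's screen x/y axes (m/s^2); the axis convention is an assumption.
def physical_orientation(gx: float, gy: float) -> str:
    # Whichever axis carries more of the gravity vector decides the
    # orientation; the sign distinguishes the "anti" (reversed) variants.
    if abs(gy) >= abs(gx):
        return "portrait" if gy > 0 else "anti-portrait"
    return "landscape" if gx > 0 else "anti-landscape"
```

For example, a reading of roughly (0, 9.8) would classify as portrait and (9.8, 0) as landscape under this assumed convention.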
  • the first electronic device determines a display direction of the virtual display based on the physical direction of the second electronic device.
  • the first electronic device can update the display direction of the virtual display in real time, or can update the display direction of the virtual display when the target condition is met, and the display content of the virtual display is used to mirror and project the screen to the screen projection window of the second electronic device .
  • for example, the first electronic device can receive the physical direction of the second electronic device in real time and determine the display direction of the virtual display in real time upon receiving it; for another example, the first electronic device may re-determine the display direction of the virtual display when the physical direction of the second electronic device changes; for another example, when the user opens a new application window or changes the window direction through a user operation, the first electronic device displays the application window on the virtual display and determines the display direction of the virtual display based on the physical direction of the second electronic device and the display direction of the application window.
  • the first electronic device may determine the physical direction of the second electronic device as the display direction of the virtual display.
  • when determining that the physical orientation of the second electronic device is a landscape orientation, the first electronic device determines the display orientation of the virtual display as a landscape orientation; when determining that the physical orientation of the second electronic device is an anti-landscape orientation, the first electronic device determines the display orientation of the virtual display as an anti-landscape orientation; when determining that the physical orientation of the second electronic device is a portrait orientation, the first electronic device determines the display orientation of the virtual display as a portrait orientation; when determining that the physical orientation of the second electronic device is an anti-portrait orientation, the first electronic device determines the display orientation of the virtual display as an anti-portrait orientation.
  • for example, when the user rotates the second electronic device to a portrait orientation, the second electronic device sends the portrait orientation to the first electronic device, and the first electronic device adjusts the display orientation of the virtual display to the portrait orientation after receiving the physical orientation of the second electronic device.
  • the first electronic device may determine the display direction of the virtual display based on the physical direction of the second electronic device and the display direction of the application window.
  • when the display orientations supported by the application window include the physical orientation of the second electronic device, the first electronic device may determine the physical orientation of the second electronic device as the display orientation of the virtual display; when the display orientations supported by the application window do not include the physical orientation of the second electronic device, the display orientation supported by the application window is determined as the display orientation of the virtual display.
  • for example, when the application window supports both landscape and portrait display, the first electronic device may determine the display orientation of the virtual display as the landscape orientation when the second electronic device is in a landscape orientation, and as the portrait orientation when the second electronic device is in a portrait orientation; when the application window only supports landscape display, the display orientation of the virtual display is determined as the landscape orientation; when the application window only supports portrait display, the display orientation of the virtual display is determined as the portrait orientation.
  • for example, when the user rotates the second electronic device to a portrait orientation, the second electronic device sends the portrait orientation to the first electronic device; after receiving the physical orientation of the second electronic device, the first electronic device obtains the display orientations supported by the application window on the virtual display. If the application window supports portrait display, the first electronic device adjusts the display orientation of the virtual display to the portrait orientation; if the application window only supports landscape display, the first electronic device determines the display orientation of the virtual display as the landscape orientation.
  • the display orientation supported by the application window may be a display orientation preset by the application, or a display orientation set by the user for the application. For example, if the first news application only supports portrait display, the display orientation supported by the first news application is the portrait orientation; for another example, if the user sets the display orientation of the first news application to the landscape orientation through the settings application or the screen projection application, the display orientation supported by the first news application is the landscape orientation.
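The selection logic described above can be summarized in a short sketch; the function and orientation names below are illustrative assumptions, not identifiers from the embodiment:

```python
# Sketch: prefer the second device's physical orientation for the virtual
# display, falling back to an orientation the application window supports.
def virtual_display_orientation(physical: str, supported: set) -> str:
    if physical in supported:
        # e.g. the device is in portrait and the app window supports portrait
        return physical
    # the app window does not support the device's orientation: pick a
    # supported one (sorted() only makes the fallback deterministic)
    return sorted(supported)[0]
```

For instance, a portrait-oriented device with a landscape-only window would yield a landscape virtual display, matching the fallback case described above.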
  • the first electronic device creates a virtual display based on the display direction of the virtual display.
  • the first electronic device may determine the display direction of the virtual display by the height and width of the virtual display.
  • the width of the virtual display is the side, among the sides of the virtual display, that is perpendicular to the display direction, and the height of the virtual display is the side that is parallel to the display direction.
  • as shown in (A) in Figure 10, when the first electronic device determines that the display orientation of the virtual display is a portrait orientation and/or an anti-portrait orientation, it can create a virtual display whose width is smaller than its height; the display direction of the virtual display is then the portrait direction, so that when the display data of the virtual display is transmitted to the second electronic device, it is displayed in the screen projection window in the portrait direction. As shown in (B) in Figure 10, when the first electronic device determines that the display orientation of the virtual display is a landscape orientation and/or an anti-landscape orientation, it can create a virtual display whose width is greater than its height, so that when the display data of the virtual display is transmitted to the second electronic device, the screen projection window displays it in the landscape direction.
  • after determining that the display direction of the virtual display is a landscape orientation, the first electronic device can also set the display orientation of the virtual display to be the landscape orientation or the anti-landscape orientation; after determining that the display direction of the virtual display is a portrait orientation, it can also set the display orientation of the virtual display to be the portrait orientation or the anti-portrait orientation.
  • the aspect ratio of the virtual display may be determined according to the screen projection window of the first electronic device on the second electronic device; for example, when the second electronic device displays the projected content in full screen, the aspect ratio of the virtual display is the aspect ratio of the display screen of the second electronic device.
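As a rough sketch of step S105, the width and height of the created virtual display follow from its display direction and the receiving screen's aspect ratio; the 1920×1080 default below is an assumed example, not a value from the embodiment:

```python
# Sketch: derive the virtual display's (width, height) from its display
# direction; width < height yields a portrait display, width > height
# a landscape display, as described for Figure 10.
def create_virtual_display(orientation: str, long_side: int = 1920,
                           short_side: int = 1080) -> tuple:
    if orientation in ("portrait", "anti-portrait"):
        return (short_side, long_side)  # width smaller than height
    return (long_side, short_side)      # width greater than height
```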
  • step S107 is directly executed after step S105 is executed.
  • the first electronic device adjusts the display direction of the application window to the display direction of the virtual display; the application window is an interface of the application currently displayed on the virtual display.
  • the first electronic device adjusts the display direction of the application window to the display direction of the virtual display, where the virtual display can display application windows of different application programs, and an application window is used to display an interface of an application program. For example, when the virtual display displays the application window of Huawei Mall in a landscape orientation and the first electronic device determines that the display direction of the virtual display is a portrait orientation, the first electronic device adjusts the display direction of the application window to the portrait orientation.
  • the first electronic device after adjusting the display direction of the virtual display, adjusts the display direction of the application window to the adjusted display direction of the virtual display. For example, when the first electronic device determines that the physical orientation of the second electronic device has changed from landscape orientation to portrait orientation, and adjusts the display orientation of the virtual display from landscape orientation to portrait orientation, the first electronic device may change the virtual The display orientation of the current application window on the display is adjusted to a portrait orientation.
  • the first electronic device adjusts the display direction of the application window after the display direction of the application window changes. For example, if the current display direction of the virtual display is the landscape orientation and the second electronic device receives a user operation for opening an application window preset to be displayed in the portrait orientation, the first electronic device determines that the display direction of the application window is inconsistent with that of the current virtual display, and adjusts the display orientation of the application window to the landscape orientation.
  • the first electronic device may display the application window in a non-full screen on the virtual display when the application window does not support the display direction of the virtual display.
  • when the first electronic device determines the physical orientation of the second electronic device as the display orientation of the virtual display and adjusts the display orientation of the application window on the virtual display to the display orientation of the virtual display, the first electronic device may adjust the application window from full-screen display to non-full-screen display. It should be noted that, since the first electronic device determines the physical orientation of the second electronic device as the display orientation of the virtual display and adjusts the display orientation of the application window to that of the virtual display, the application window may not support the display orientation of the virtual display; in this case, the application window may be displayed on the virtual display in a non-full-screen manner.
  • full-screen display may refer to the case shown in (A) in Figure 11, in which the lengths of the four sides of the application window are respectively equal to the lengths of the four sides of the display screen of the second electronic device; it may also refer to the case shown in (B) in Figure 11, in which a title bar is displayed on the application window and the application window has a side whose length is equal to the length of a side of the display screen of the second electronic device.
  • FIG. 12 exemplarily shows the display of the application window when the physical orientation of the second electronic device is rotated from the landscape orientation to the portrait orientation.
  • the application window shown in (A) in Figure 12 supports both landscape and portrait display, so the application window can be displayed in full screen on the virtual display (that is, on the display screen of the second electronic device) in either orientation; the application window shown in (B) in Figure 12 only supports portrait display, so it is displayed in a non-full-screen manner when the display orientation of the virtual display is the landscape orientation, and in full screen when the display orientation is the portrait orientation; the application window shown in (C) in Figure 12 only supports landscape display, so it is displayed in full screen when the display orientation is the landscape orientation, and may not be displayed in full screen when the display orientation is the portrait orientation.
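The non-full-screen cases in Figure 12 amount to fitting a window that keeps its own orientation inside a virtual display of the other orientation. A uniform-scale (letterbox) rule is one plausible way to do this; it is an assumption for illustration, not the embodiment's algorithm:

```python
# Sketch: scale an application window uniformly so it fits inside the virtual
# display without changing its aspect ratio (non-full-screen display).
def fit_window(win_w, win_h, disp_w, disp_h):
    scale = min(disp_w / win_w, disp_h / win_h)
    return (int(win_w * scale), int(win_h * scale))
```

Under this rule, a landscape-only 1920×1080 window on a portrait 1080×1920 display is scaled to 1080×607, leaving empty space above and below, while a window whose orientation matches the display fills it entirely.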
  • the first electronic device sends display data of the virtual display to the second electronic device.
  • the first electronic device displays the application window on the virtual display according to the display direction of the virtual display, obtains the display data of the virtual display, and further sends the display data of the virtual display to the second electronic device, so that the second electronic device renders and displays it. It can be understood that the second electronic device renders and displays the display data of the virtual display in the screen projection window, and the projected content displayed in the screen projection window is consistent with the display content of the virtual display of the first electronic device.
  • the display data of the virtual display may be main interface data, video application interface data, memorandum application interface data, and the like.
  • the first electronic device may obtain the interface data of a first application, process the interface data of the first application into display data consistent with the display direction of the virtual display, and display the processed application window on the virtual display in a full-screen or non-full-screen manner, thereby obtaining the display data of the virtual display.
  • the second electronic device displays the display data.
  • the second electronic device receives a data stream file from the first electronic device, parses the data stream file to obtain screen projection data, then obtains, based on the screen projection data, a projected image drawn in the adjusted display direction, and displays the projected image.
  • the above screen projection method will be described in detail below by taking the scene in which the user rotates the second electronic device from a landscape orientation to a portrait orientation as shown in FIG. 13A as an example.
  • the first electronic device projects a screen from a different source to the second electronic device.
  • the second electronic device displays the display data from the first electronic device in full screen, and the user rotates the second electronic device from a landscape orientation to a portrait orientation.
  • the screen projection method may include the following steps.
  • the second electronic device sends sensor data of the second electronic device to the first electronic device when it is determined that the physical orientation of the second electronic device is rotated from a landscape orientation to a portrait orientation.
  • the second electronic device can collect its sensor data in real time and obtain its physical orientation according to the sensor data; then, when the user rotates the second electronic device from the landscape orientation to the portrait orientation, the second electronic device senses that its physical orientation has rotated from the landscape orientation to the portrait orientation, and sends its sensor data to the first electronic device.
  • after receiving the sensor data of the second electronic device, the first electronic device determines the physical direction of the second electronic device based on the sensor data of the second electronic device.
  • the first electronic device may determine that the current physical orientation of the second electronic device is the portrait orientation according to the sensor data of the second electronic device.
  • the first electronic device determines the physical direction of the second electronic device as the display direction of the virtual display.
  • the first electronic device may adjust the display orientation of the virtual display to a portrait orientation after determining that the physical orientation of the second electronic device is a portrait orientation.
  • the first electronic device may exchange the width and height of the virtual display, so as to adjust the display orientation of the virtual display from a landscape orientation to a portrait orientation.
  • please refer to FIG. 13B. The original virtual display is shown in (A) in FIG. 13B: its display orientation is the landscape orientation, and its width is greater than its height. The first electronic device exchanges the width and height of the virtual display shown in (A) in FIG. 13B to obtain the virtual display shown in (B) in FIG. 13B: the height of the adjusted virtual display is greater than its width, and the display direction of the adjusted virtual display is the portrait orientation.
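The width/height exchange of FIG. 13B can be captured in a few lines; the class below is an illustrative sketch with assumed names, not the embodiment's data structure:

```python
from dataclasses import dataclass

# Sketch: a virtual display whose display direction is implied by which of
# width and height is larger; swapping them rotates the display.
@dataclass
class VirtualDisplay:
    width: int
    height: int

    @property
    def direction(self) -> str:
        # width greater than height => landscape; otherwise portrait
        return "landscape" if self.width > self.height else "portrait"

    def swap(self) -> None:
        # exchange width and height, as in the adjustment of FIG. 13B
        self.width, self.height = self.height, self.width
```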
  • the first electronic device adjusts the display directions of the application windows on the virtual display to the display directions of the virtual display.
  • if the application window supports portrait display, the application window is displayed on the virtual display in the portrait orientation; otherwise, the first electronic device may display the application window in a non-full-screen manner. Please refer to FIG. 13C: assuming the original application window is as shown in (A) in FIG. 13C, the first electronic device can draw the application window in the landscape direction to obtain the application window shown in (B) in FIG. 13C.
  • the first electronic device sends display data of the virtual display to the second electronic device.
  • the display data of the virtual display includes the display data of the application window.
  • the second electronic device displays the display data.
  • the second electronic device receives the display data from the first electronic device, and displays the display data. Please refer to FIG. 13D .
  • FIG. 13D exemplarily shows the display interface of the screen projection window on the second electronic device when the user rotates the second electronic device from a landscape orientation to a portrait orientation.
  • as shown in FIG. 13D, the second electronic device is in a portrait orientation, the display orientation of the screen projection window on the second electronic device is the portrait orientation, and the display orientation of the virtual display on the first electronic device is also the portrait orientation.
  • the user opens a first application window through a first user operation.
  • the second electronic device receives a first user operation, where the first user operation is used to open a first application window.
  • for example, the user clicks a control on the screen projection window to exit the current application window, thereby opening the first application window.
  • in response to the first user operation, the second electronic device sends the first user operation to the first electronic device.
  • after receiving the first user operation, the first electronic device acquires the first application window corresponding to the first user operation.
  • the first electronic device acquires display data corresponding to the first application window, such as display content of the first application window and display directions supported by the first application window.
  • the first electronic device adjusts the display direction of the virtual display based on the display direction supported by the first application window.
  • the first electronic device does not adjust the display direction of the virtual display when the first application window supports the portrait orientation; when the first application window does not support the portrait orientation, the first electronic device adjusts the display orientation of the virtual display to a display orientation supported by the first application window. For example, if the first application window only supports the landscape orientation, the first electronic device may adjust the display orientation of the virtual display to the landscape orientation.
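This decision mirrors the earlier orientation selection: keep the virtual display's current direction if the newly opened window supports it, otherwise switch to a direction it does support. A hedged sketch with assumed names:

```python
# Sketch: adjust the virtual display's direction for a newly opened
# application window based on the orientations that window supports.
def adjust_for_window(current: str, supported: set) -> str:
    if current in supported:
        return current           # e.g. window supports portrait: no change
    return sorted(supported)[0]  # e.g. landscape-only window: switch
```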
  • the first electronic device displays the first application window on the virtual display after the display direction is adjusted, and obtains first display data. The first electronic device determines the display direction of the virtual display as the display direction of the application window displayed on the virtual display; that is, the display direction of the first application window is the display direction of the current virtual display, and the first display data is the display data of the virtual display. Specifically, the first application window is displayed on the virtual display in a first direction, and the display content of the virtual display is determined as the first display data. For example, if the first direction is the landscape orientation, the first electronic device may display the first application window in the landscape orientation after adjusting the display orientation of the virtual display to the landscape orientation. The display content of the virtual display is determined as the first display data; that is, the first display data is the content rendered and displayed on the virtual display.
  • the first electronic device sends the first display data to the second electronic device.
  • the second electronic device displays the first display data.
  • if the first application window only supports the landscape orientation, the interface displayed by the second electronic device may be as shown in (A) in FIG. 14B; if the first application window supports the portrait orientation, the interface displayed by the second electronic device may be as shown in (B) in FIG. 14B.
  • the embodiment of the present application also provides an electronic device, the electronic device includes one or more processors and one or more memories; wherein, the one or more memories are coupled with the one or more processors, and the one or more memories are used for For storing computer program codes, the computer program codes include computer instructions, and when one or more processors execute the computer instructions, the electronic device executes the methods described in the above embodiments.
  • the embodiment of the present application also provides a computer program product containing instructions, and when the computer program product is run on the electronic device, the electronic device is made to execute the method described in the foregoing embodiments.
  • the embodiment of the present application also provides a computer-readable storage medium, including instructions, and when the instructions are run on the electronic device, the electronic device is made to execute the method described in the foregoing embodiments.
  • all or part of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof.
  • when implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, DSL) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, DVD), or a semiconductor medium (for example, a Solid State Disk).
  • all or part of the processes in the foregoing method embodiments may be completed by a computer program instructing related hardware.
  • the program may be stored in a computer-readable storage medium.
  • when the program is executed, the processes of the foregoing method embodiments may be included.
  • the aforementioned storage medium includes: ROM or random access memory RAM, magnetic disk or optical disk, and other various media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a screen projection method, an electronic device, and a system. The method comprises: a first electronic device obtaining a physical direction of a second electronic device; the first electronic device determining a display direction of a virtual display according to the physical direction; the first electronic device determining the display direction of the virtual display as the display direction of an application window displayed on the virtual display; and the first electronic device sending display data of the virtual display to the second electronic device. By implementing the embodiments of the present application, the display direction of a screen projection window on a receiving device can be switched during screen projection, and the user experience is improved.
PCT/CN2022/107783 2021-07-28 2022-07-26 Screen projection method, electronic device, and system WO2023005900A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110859935.5A CN115686401A (zh) 2021-07-28 2021-07-28 Screen projection method, electronic device, and system
CN202110859935.5 2021-07-28

Publications (1)

Publication Number Publication Date
WO2023005900A1 true WO2023005900A1 (fr) 2023-02-02

Family

ID=85059209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/107783 WO2023005900A1 (fr) 2021-07-28 2022-07-26 Screen projection method, electronic device, and system

Country Status (2)

Country Link
CN (1) CN115686401A (fr)
WO (1) WO2023005900A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135396A (zh) * 2023-02-14 2023-11-28 荣耀终端有限公司 Screen projection method and related device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107105184A (zh) * 2017-04-01 2017-08-29 深圳市蓝莓派科技有限公司 Screen mirroring method for a mobile terminal on a portrait-screen advertising machine
CN108268225A (zh) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 Screen projection method and screen projection apparatus
CN110347317A (zh) * 2019-06-11 2019-10-18 广州视源电子科技股份有限公司 Window switching method and apparatus, storage medium, and interactive smart tablet
CN111443884A (zh) * 2020-04-23 2020-07-24 华为技术有限公司 Screen projection method and apparatus, and electronic device
CN112099705A (zh) * 2020-09-04 2020-12-18 维沃移动通信有限公司 Screen projection method and apparatus, and electronic device

Also Published As

Publication number Publication date
CN115686401A (zh) 2023-02-03

Similar Documents

Publication Publication Date Title
US11567623B2 (en) Displaying interfaces in different display areas based on activities
WO2021013158A1 (fr) Procédé d'affichage et appareil associé
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
WO2021135730A1 (fr) Procédé d'adaptation d'interface d'affichage, procédé de conception d'adaptation d'interface d'affichage et dispositif électronique
WO2021000881A1 (fr) Procédé de division d'écran et dispositif électronique
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
WO2022258024A1 (fr) Procédé de traitement d'images et dispositif électronique
WO2022257977A1 (fr) Procédé de projection d'écran pour dispositif électronique, et dispositif électronique
WO2020093988A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2022105445A1 (fr) Procédé de projection d'écran d'application basé sur un navigateur et appareil associé
WO2022017393A1 (fr) Système d'interaction d'affichage, procédé d'affichage, et dispositif
US20230117194A1 (en) Communication Service Status Control Method, Terminal Device, and Readable Storage Medium
WO2020155875A1 (fr) Procédé d'affichage destiné à un dispositif électronique, interface graphique personnalisée et dispositif électronique
WO2022083465A1 (fr) Procédé de projection d'écran de dispositif électronique, support associé et dispositif électronique
WO2022222924A1 (fr) Procédé de réglage de paramètres d'affichage par projection d'écran
WO2021143391A1 (fr) Procédé de partage d'écran sur la base d'un appel vidéo et dispositif mobile
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2023005900A1 (fr) Screen projection method, electronic device, and system
EP4293997A1 (fr) Procédé d'affichage, dispositif électronique et système
US20240143262A1 (en) Splicing Display Method, Electronic Device, and System
WO2022143310A1 (fr) Procédé de projection sur écran à double canal et dispositif électronique
WO2023169237A1 (fr) Procédé de capture d'écran, dispositif électronique, et système
WO2024022307A1 (fr) Procédé de duplication d'écran et dispositif électronique
EP4287014A1 (fr) Procédé d'affichage, dispositif électronique et système
WO2023006035A1 (fr) Procédé et système de duplication d'écran et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22848505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE