WO2021018274A1 - Screen projection method and electronic device - Google Patents

Screen projection method and electronic device

Info

Publication number
WO2021018274A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
screen
application
identification
user
Prior art date
Application number
PCT/CN2020/106096
Other languages
English (en)
Chinese (zh)
Inventor
周星辰 (Zhou Xingchen)
范振华 (Fan Zhenhua)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021018274A1

Classifications

    • G06F 3/1454 Digital output to display device; cooperation and interconnection of the display device with other functional units, involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 9/451 Execution arrangements for user interfaces
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to context-related or environment-related conditions

Definitions

  • This application relates to the field of terminal technology, and in particular to a screen projection method and electronic equipment.
  • Screen projection technology refers to the use of wireless communication technology to deliver the content on electronic device A to electronic device B, so that electronic device B can display the content from electronic device A.
  • For example, the content on an electronic device with a smaller display (such as a mobile phone or tablet computer) can be projected onto an electronic device with a larger display (such as a television or projector), so that the content on the device with the smaller display can be viewed on the device with the larger display, achieving a better viewing effect.
  • the embodiments of the present application provide a screen projection method and electronic device, which help reduce the complexity of the operation mode for triggering the electronic device to initiate the screen projection, and improve the efficiency of triggering the source device to initiate the screen projection.
  • a screen projection method is applied to a first electronic device, the first electronic device includes a first display screen, and the method includes:
  • the first electronic device receives the first operation
  • the first electronic device turns off the first display screen, and obtains screen projection content from a target application of at least one currently running application;
  • the first electronic device sends the projected content to the second electronic device.
  • In this way, the user can initiate a screen projection by operating the first electronic device only once, which helps reduce the complexity of the operation that triggers the electronic device to initiate a screen projection and improves the efficiency of triggering the first electronic device to initiate the screen projection. A minimal sketch of this flow is given below.
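As a rough illustration only (not part of the patent), the following Kotlin sketch shows how such a single "first operation" might drive the flow described above; the types Display, RunningApp, and ProjectionSender are hypothetical placeholders.

```kotlin
// Hypothetical types standing in for the facilities described in the text.
interface Display { fun turnOff() }
data class RunningApp(val name: String, val supportsProjection: Boolean) {
    fun captureProjectionContent(): ByteArray = ByteArray(0) // placeholder content capture
}
interface ProjectionSender { fun send(content: ByteArray) }

class ScreenProjectionController(
    private val firstDisplay: Display,
    private val sender: ProjectionSender,
) {
    // Called when the "first operation" (for example a screen-off operation) is detected.
    fun onFirstOperation(runningApps: List<RunningApp>) {
        firstDisplay.turnOff()                           // turn off the first display screen
        val target = runningApps.firstOrNull { it.supportsProjection } ?: return
        val content = target.captureProjectionContent()  // obtain projection content from the target app
        sender.send(content)                             // send it to the second electronic device
    }
}
```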
  • the first operation is a screen-off operation.
  • For example, the operation of pressing the power button, or the operation of closing the cover of a smart protective cover.
  • As another example, the first electronic device is a flip (clamshell) electronic device, and the first operation is an operation of closing the outer cover of the device.
  • the first operation may be an operation of folding the first display screen from an unfolded state to a closed state.
  • In this case, the first electronic device can turn off part of the first display screen or turn off the entire first display screen in response to the operation of folding the first display screen from the unfolded state to the closed state.
  • the first operation may be an operation of shrinking the first display screen.
  • In this case, the first electronic device can turn off part of the first display screen (i.e., the contracted part) in response to the operation of shrinking the first display screen, or turn off the entire first display screen.
  • In response to the first operation, after turning off the first display screen and before acquiring the screen projection content from the target application among the at least one currently running application, the method further includes:
  • Before the first electronic device sends the screen projection content to the second electronic device, the method further includes:
  • The environment in which the first electronic device is located is evaluated to identify whether the user intends to project the screen. For example, the first electronic device determines whether a display screen is blocked: if the first electronic device includes a first display screen that is a foldable inner screen and also includes an outer screen, the first electronic device can determine whether the outer screen is blocked; if the first electronic device includes only the first display screen, it can determine whether the first display screen is blocked. When the display screen of the first electronic device is not blocked, the first electronic device sends the screen projection content to the second electronic device. By judging whether the display screen is blocked, it can be recognized whether the user has put the first electronic device in a bag or pocket, etc.
  • When the user puts the first electronic device in a bag or pocket, the first electronic device is generally no longer being used, so it may not need to project the screen. Further, the first electronic device can also learn the user's habits of using the device from the user's usage history, and further determine, according to those habits, whether to send the screen projection content to the second electronic device. A sketch of such a check is given below.
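As an illustration only, the following Kotlin sketch captures the pre-send check described above, assuming an occlusion flag per screen is already available (for example from a distance sensor) and that a learned-habit signal is optional; all names are hypothetical.

```kotlin
// Hypothetical pre-send check: skip projection if the relevant screen is blocked
// (device likely in a bag or pocket) or if learned habits suggest no projection.
data class DeviceScreens(val hasOuterScreen: Boolean)

fun shouldSendProjection(
    screens: DeviceScreens,
    outerScreenBlocked: Boolean,
    firstScreenBlocked: Boolean,
    userUsuallyProjectsNow: Boolean = true,
): Boolean {
    // Check the outer screen when one exists, otherwise the first display screen.
    val blocked = if (screens.hasOuterScreen) outerScreenBlocked else firstScreenBlocked
    return !blocked && userUsuallyProjectsNow
}
```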
  • a control interface is displayed on the second display screen, and the control interface is used to implement quick operations on the target application.
  • The second display screen may be part or all of the display area of the first display screen, or a display screen different from the first display screen, which makes it convenient for the user to control the screen projection content.
  • The first electronic device determines a control interface corresponding to the type of the target application from preset control interfaces corresponding to application types, and displays the control interface corresponding to the type of the target application on the second display screen. This helps simplify the implementation.
  • Alternatively, the first electronic device recognizes the virtual buttons with touch function in the target application and, according to the recognized virtual buttons, displays on the second display screen a control interface for controlling the target application. This helps improve the reliability of the displayed control interface.
  • When the first electronic device does not recognize any virtual button with touch function in the target application, it determines a control interface corresponding to the type of the target application from the preset control interfaces corresponding to application types, and displays that control interface on the external screen. This helps simplify the implementation. The selection logic is sketched below.
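A possible shape of this selection logic, as an illustration only (the TouchButton and ControlInterface types and the preset table are assumptions, not patent-defined structures):

```kotlin
// Hypothetical sketch: prefer a control interface built from buttons recognized in the
// target application; otherwise fall back to a preset interface for the application type.
data class TouchButton(val label: String)
data class ControlInterface(val buttons: List<TouchButton>)

val presetInterfaces: Map<String, ControlInterface> = mapOf(
    "video" to ControlInterface(listOf(TouchButton("play/pause"), TouchButton("next"))),
    "music" to ControlInterface(listOf(TouchButton("play/pause"), TouchButton("volume"))),
)

fun controlInterfaceFor(appType: String, recognized: List<TouchButton>): ControlInterface =
    if (recognized.isNotEmpty()) ControlInterface(recognized)
    else presetInterfaces[appType] ?: ControlInterface(emptyList())
```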
  • the first electronic device determines the target application from at least one currently running application, including:
  • The first electronic device displays, on the second display screen, the identifiers of the applications that support the screen projection function among the at least one currently running application; after receiving the user's operation of selecting an application identifier displayed on the external screen, in response to that operation, the first electronic device determines that the target application is the application identified by the selected identifier. This helps improve the interaction between the device and the user.
  • the method further includes:
  • the first electronic device determines the identification of the target electronic device from the identification of the at least one electronic device, and the identification of the target electronic device is used to identify the second electronic device.
  • the first electronic device determining the identification of the target electronic device from the identification of the at least one electronic device includes:
  • The first electronic device determines, from the identifications of the at least one electronic device, the identification used to identify a private electronic device as the identification of the target electronic device. This helps reduce user operations.
  • the first electronic device determining the identification of the target electronic device from the identification of the at least one electronic device includes:
  • The first electronic device displays the identification of the at least one electronic device on the second display screen; after receiving the user's operation of selecting the identification of an electronic device displayed on the second display screen, in response to that operation, it uses the identification of the electronic device selected by the user as the identification of the target electronic device.
  • each of the identifications of the at least one electronic device is used to identify a public electronic device.
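The device-selection behaviour described in the preceding items could look roughly like the following, as an illustration only; DeviceId and the askUser callback are hypothetical placeholders.

```kotlin
// Hypothetical target-device selection: a private device is chosen automatically,
// otherwise the user picks from the identifiers shown on the second display screen.
data class DeviceId(val name: String, val isPrivate: Boolean)

fun pickTargetDevice(
    discovered: List<DeviceId>,
    askUser: (List<DeviceId>) -> DeviceId?,   // e.g. show the identifiers and await a selection
): DeviceId? {
    discovered.firstOrNull { it.isPrivate }?.let { return it }
    return askUser(discovered)
}
```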
  • the method further includes:
  • the first electronic device receives the second operation, and in response to the second operation, stops projection. For example, after stopping the screen projection, the first electronic device displays the user interface where the screen projection content is located on the first display screen.
  • the second operation may be an unlocking operation.
  • the second operation may be an operation of unfolding the first display screen from a closed state to an unfolded state.
  • When the first display screen is a retractable display screen, the second operation may be an operation of extending the first display screen, or the like.
  • an embodiment of the present application also provides a method for controlling projection, which is applied to a first electronic device, where the first electronic device includes a first application, and the method includes:
  • the first electronic device obtains the screencast content from the first application
  • the first electronic device sends the projected content to the second electronic device
  • the first electronic device determines a control interface corresponding to the type of the first application program from the preset control interface corresponding to the type of the application program after the screen projection is successful;
  • The first electronic device displays the determined control interface on a display screen, and the control interface includes a virtual button with a touch function. After receiving an operation on a virtual button on the control interface, the first electronic device responds to that operation to control the projection content presented on the second electronic device.
  • The control interface corresponding to the type of the application program is preset, which helps simplify the way the control interface is displayed.
  • For example, the first electronic device may be configured, before leaving the factory, with control interfaces corresponding to application types; alternatively, the first electronic device may obtain the control interfaces corresponding to the types of its installed applications from a server. A sketch of such provisioning is given below.
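The following Kotlin sketch is only an illustration of how such preset interfaces might be provisioned; the factory table and the server lookup callback are assumptions, not patent-specified mechanisms.

```kotlin
// Hypothetical provisioning of preset control interfaces keyed by application type:
// factory-configured defaults, optionally extended by a per-type lookup from a server.
class PresetControlInterfaces(
    private val factoryDefaults: MutableMap<String, List<String>>,     // app type -> button labels
    private val fetchFromServer: (appType: String) -> List<String>?,   // assumed remote lookup
) {
    fun forType(appType: String): List<String>? =
        factoryDefaults[appType] ?: fetchFromServer(appType)?.also { factoryDefaults[appType] = it }
}
```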
  • the embodiments of the present application also provide another method of projection control, which is applied to a first electronic device, the first electronic device includes a first application, and the method includes:
  • the first electronic device obtains the screencast content from the first application
  • the first electronic device sends the projected content to the second electronic device
  • the first electronic device recognizes a virtual button with a touch function from the first application, and displays a control interface on the display screen according to the recognized virtual button.
  • the icons of the virtual buttons included on the control interface may be obtained by the first electronic device re-arranging, cropping and/or scaling the virtual buttons recognized from the first application.
  • the control interface includes a virtual button with the same function as the virtual button recognized by the first electronic device from the first application.
  • After the first electronic device receives an operation on a virtual button on the control interface, in response to that operation, it controls the projection content presented on the second electronic device.
  • the first electronic device recognizing the virtual button with touch function from the first application includes:
  • The first electronic device recognizes the virtual buttons with touch function from the first application based on the user's historical operation records for the first application, or a software development kit (SDK) interface provided by the first application, or the predefined position coordinates of the virtual buttons on the user interface of the first application.
  • Alternatively, the first electronic device recognizes virtual buttons with a touch function from the first application program by performing semantic analysis on the first application program. This helps simplify the implementation.
  • When the first electronic device does not recognize any virtual button with touch function from the first application, it determines a control interface corresponding to the type of the first application program from the preset control interfaces corresponding to application types, and displays the determined control interface on the display screen. This helps simplify the implementation while meeting user needs. The overall recognition-with-fallback flow is sketched below.
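As an illustration only, the sketch below strings together the recognition sources named above with the preset fallback; each source is represented by a hypothetical callback rather than a real API.

```kotlin
// Hypothetical recognition-with-fallback flow for building the control interface:
// try usage history, then an SDK interface, then predefined coordinates; if nothing
// is recognized, fall back to the preset interface for the application type.
data class VirtualButton(val label: String, val x: Int, val y: Int)

fun buildControlInterface(
    fromUsageHistory: () -> List<VirtualButton>,
    fromSdkInterface: () -> List<VirtualButton>,
    fromPredefinedCoords: () -> List<VirtualButton>,
    presetForAppType: () -> List<VirtualButton>,
): List<VirtualButton> {
    val recognized = fromUsageHistory()
        .ifEmpty { fromSdkInterface() }
        .ifEmpty { fromPredefinedCoords() }
    // Recognized buttons may then be rearranged, cropped and/or scaled for display;
    // here they are simply kept in order.
    return recognized.ifEmpty { presetForAppType() }
}
```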
  • The control interface further includes the identification of at least one alternative application program and the identification of at least one alternative screen projection device. This helps users switch the screen-casting application and/or screen-casting device according to their needs.
  • The control interface further includes a virtual button for canceling the screen projection. This helps users actively stop screencasting according to their needs and improves the interaction between the device and the user.
  • A chip provided by an embodiment of the present application is coupled with a memory in a device, so that during operation the chip invokes the program instructions stored in the memory to implement the above-mentioned aspects of the embodiments of the present application and any possible design method involved in each aspect.
  • A computer storage medium of an embodiment of the present application stores program instructions which, when run on an electronic device, cause the device to execute the above-mentioned aspects of the embodiments of the present application and any possible design method involved in each aspect.
  • A computer program product of an embodiment of the present application, when run on an electronic device, causes the electronic device to execute the above-mentioned aspects of the embodiments of the present application and any possible design method involved in each aspect.
  • FIG. 1 is a schematic diagram of a scenario to which an embodiment of this application is applied;
  • FIG. 2 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of this application;
  • FIG. 3A is a schematic diagram of a physical form of an electronic device according to an embodiment of this application;
  • FIG. 3B is a schematic diagram of another physical form of the electronic device according to an embodiment of this application;
  • FIG. 4A is a schematic diagram of another physical form of an electronic device according to an embodiment of this application;
  • FIG. 4B is a schematic diagram of another physical form of the electronic device according to an embodiment of this application;
  • FIG. 4C is a schematic diagram of another physical form of the electronic device according to an embodiment of this application;
  • FIG. 5A is a schematic diagram of another physical form of an electronic device according to an embodiment of this application;
  • FIG. 5B is a schematic diagram of another physical form of the electronic device according to an embodiment of this application;
  • FIG. 5C is a schematic diagram of another physical form of the electronic device according to an embodiment of this application;
  • FIG. 6A is a schematic diagram of another physical form of an electronic device according to an embodiment of this application;
  • FIG. 6B is a schematic diagram of another physical form of the electronic device according to an embodiment of this application;
  • FIG. 7 is a schematic diagram of the software architecture of an electronic device according to an embodiment of this application;
  • FIG. 8 is a schematic flowchart of a screen projection method according to an embodiment of this application;
  • FIG. 9 is a schematic diagram of another scenario to which an embodiment of this application is applied;
  • FIG. 10 is a schematic diagram of a user interface according to an embodiment of this application;
  • FIG. 11A is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 11B is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 11C is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 11D is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 12A is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 12B is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 13 is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 14A is a schematic diagram of a control interface according to an embodiment of this application;
  • FIG. 14B is a schematic diagram of another control interface according to an embodiment of this application;
  • FIG. 14C is a schematic diagram of another control interface according to an embodiment of this application;
  • FIG. 14D is a schematic diagram of another control interface according to an embodiment of this application;
  • FIG. 14E is a schematic diagram of another control interface according to an embodiment of this application;
  • FIG. 14F is a schematic diagram of another control interface according to an embodiment of this application;
  • FIG. 14G is a schematic diagram of another control interface according to an embodiment of this application;
  • FIG. 15 is a schematic diagram of another user interface according to an embodiment of this application;
  • FIG. 16 is a schematic flowchart of another screen projection method according to an embodiment of this application;
  • FIG. 17 is a schematic diagram of the physical form of another electronic device according to an embodiment of this application;
  • FIG. 18 is a schematic flowchart of another screen projection method according to an embodiment of this application;
  • FIG. 19 is a schematic diagram of the physical form of another electronic device according to an embodiment of this application;
  • FIG. 20 is a schematic diagram of the physical form of another electronic device according to an embodiment of this application;
  • FIG. 21 is a schematic structural diagram of another electronic device according to an embodiment of this application;
  • FIG. 22 is a schematic structural diagram of another electronic device according to an embodiment of this application.
  • the at least two electronic devices include a source device and a target device.
  • The source-end device, which can also be referred to as the source device, is the electronic device that initiates the screen projection.
  • the source device is used to send screencast content.
  • the screencast content may be video, audio, image, document, game, etc., which is not limited.
  • the target device can also be referred to as a client device (client device) or a peer device, etc., and is an electronic device that receives projected screen content.
  • the target device can present or display the projected content in a corresponding layout.
  • the layout of the projected content on the target device and the layout of the source device in the embodiment of the present application may be different or the same.
  • the source device may be a portable electronic device, such as a mobile phone, a tablet computer, a laptop computer (Laptop), a wearable device (such as a smart watch), and the like.
  • Exemplary embodiments of the aforementioned portable electronic devices include, but are not limited to, portable electronic devices carrying … or other operating systems.
  • the embodiments of the present application do not limit the physical form of the portable electronic device.
  • the portable electronic device may be a foldable device, a candy bar device, a flip device, etc.
  • the portable electronic device in the embodiment of the present application may also be equipped with a smart protective cover.
  • the source device may also be an all-in-one computer, a desktop computer, or the like.
  • the target device may be an electronic device such as a tablet computer, an all-in-one computer, a desktop computer, a television, a monitor, a projector, a speaker, etc., which can be used to receive and present or display projected content.
  • Fig. 1 shows an application scenario of an embodiment of the present application.
  • the electronic device 10 is a source device, and the electronic device 20 is a target device.
  • the electronic device 10 can send the projected content to the electronic device 20, so that the projected content can be presented or displayed by the electronic device 20, thereby achieving a better viewing effect.
  • The electronic device 10 and the electronic device 20 may establish a connection in a wired manner (such as through a power cord) and/or wirelessly (such as through wireless fidelity (Wi-Fi), Bluetooth, etc.).
  • the embodiment of the present application does not limit the number of target end devices that receive the projection content sent by the source end device.
  • For example, the electronic device 10 may send the screen projection content to two or more electronic devices including the electronic device 20, or it may send the screen projection content only to the electronic device 20.
  • the embodiment of the present application provides a screen projection method so that the source device can initiate a screen projection in response to the first operation. That is, the user can operate the source-end device once to enable the source-end device to perform screen projection, thereby helping to reduce the complexity of the operation mode that triggers the source-end device to initiate the screen-casting, and improve the efficiency of triggering the source-end device to initiate the screen-casting.
  • For example, the first operation may be used to control the electronic device to turn off the screen. In this case, the first operation may be referred to as a screen-off operation.
  • When the screen is turned off, the electronic device in the embodiment of the present application may display a black screen without locking the screen, display a black screen and lock the screen, display a default user interface without locking the screen, display the default user interface and lock the screen, or display a partly black screen with part of the default user interface, etc., where the default user interface may include date and time information and/or commonly used application icons, etc.
  • The content included in the default user interface can be set according to the user's needs, or can be preset before the electronic device leaves the factory. For example, if the electronic device is in a locked-screen state when the screen is turned off, the first operation may also be called a screen-lock operation.
  • Since the source device can actively initiate a screen projection when the screen is turned off and send the screen projection content to the target device for presentation or display, the user can continue to view the corresponding content on the target device, which helps improve the user experience.
  • the first operation may also be other operations. Specifically, the implementation of the first operation may be related to the physical form of the electronic device.
  • the source device and target device and embodiments for using such source device and target device will be introduced in the following.
  • FIG. 2 shows a schematic diagram of the hardware structure of an electronic device 10 according to an embodiment of the present application.
  • The electronic device 10 includes a processor 110, an internal memory 121, an external memory interface 122, a camera 131, a first display screen 141, a sensor module 150, an audio module 160, a speaker 161, a receiver 162, a microphone 163, a headset interface 164, buttons 170, a subscriber identification module (SIM) card interface 171, a universal serial bus (USB) interface 172, a charging management module 180, a power management module 181, a battery 182, a mobile communication module 191, and a wireless communication module 192.
  • the electronic device 10 further includes a second display screen 142.
  • the first display screen 141 and the second display screen 142 may be located on different faces of the electronic device 10.
  • the first display screen 141 is located on the first face of the electronic device 10 (for example, the front face of the electronic device 10)
  • The second display screen 142 is located on the second face of the electronic device 10 (for example, the back face of the electronic device 10).
  • the electronic device 10 in the embodiment of the present application may also include a motor, an indicator, a mechanical shaft, and the like.
  • the hardware structure shown in FIG. 2 is only an example.
  • the source device of the embodiment of the present application may have more or fewer components than the electronic device 10 shown in the figure, may combine two or more components, or may have different component configurations.
  • the various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits. It can be understood that the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 10. In other embodiments of the present application, the electronic device 10 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit, etc.
  • the processor 110 may also be provided with a buffer for storing part of the program and/or data.
  • the buffer in the processor 110 may be a cache memory.
  • the buffer may be used to store programs and/or data that have just been used, generated, or recycled by the processor 110. If the processor 110 needs to use the program and/or data, it can be directly called from the buffer. The time for the processor 110 to obtain programs and/or data is reduced, thereby helping to improve the efficiency of the system.
  • the internal memory 121 may be used to store programs and/or data. It should be noted that the program in the embodiment of the present application may also be referred to as a program instruction.
  • the internal memory 121 includes a program storage area and a data storage area.
  • the storage program area can be used to store an operating system (such as Android, IOS, and other operating systems), a computer program required for at least one function (such as screen lock, screen projection), and the like.
  • the storage data area may be used to store data created and/or acquired during the use of the electronic device (such as the identification and image of the target device).
  • the processor 110 may call the program and/or data stored in the internal memory 121 to enable the electronic device 10 to execute a corresponding method, thereby implementing one or more functions.
  • the processor 110 calls certain programs and/or data in the internal memory 121, so that the electronic device 10 executes the screen projection method provided in the embodiment of the present application, thereby improving the efficiency of the source device initiating the screen projection and improving the user experience .
  • the internal memory 121 may be a high-speed random access memory, and/or a non-volatile memory.
  • the non-volatile memory may include at least one of one or more disk storage devices, flash memory devices, and/or universal flash storage (UFS).
  • the external memory interface 122 can be used to connect an external memory card (for example, a Micro SD card) to expand the storage capacity of the electronic device 10.
  • the external memory card communicates with the processor 110 through the external memory interface 122 to realize the data storage function.
  • the electronic device 10 may save files such as images, music, and videos in an external memory card through the external memory interface 122.
  • the camera 131 can be used to capture or collect moving and static images.
  • the camera 131 includes a lens and an image sensor.
  • the optical image generated by the object through the lens is projected onto the image sensor, and then converted into an electrical signal for subsequent processing.
  • the image sensor may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the image sensor converts the light signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • the camera 131 may include one or more cameras.
  • the first display screen 141 may include a display panel for displaying a graphical user interface (GUI).
  • the electronic device 10 displays a user interface on the first display screen 141 to present or display corresponding content to the user, such as videos, texts, images, virtual keys or virtual buttons for realizing interaction between the user and the electronic device 10, etc.
  • The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), or the like.
  • the electronic device 10 may implement a display function through a GPU, the first display screen 141, an application processor, and the like.
  • the first display screen 141 in the embodiment of the present application may be a foldable screen or a non-foldable screen, which is not limited.
  • the specific implementation manner of the second display screen 142 can be referred to the specific implementation manner of the first display screen 141, which is not repeated here.
  • the electronic device 10 is an electronic device with a foldable screen, and includes a first display screen 141 and a second display screen 142, where the first display screen 141 is a foldable screen, and the second display screen 142 is a non-foldable screen.
  • the first display screen 141 is located on the first surface of the electronic device 10
  • the second display screen 142 is located on the second surface of the electronic device 10
  • the first surface and the second surface are different.
  • the first side of the electronic device 10 may also be referred to as the front side of the electronic device 10.
  • The first display screen 141 is located on the first surface of the electronic device 10, and the included angle of the first display screen 141 is α.
  • the second surface of the electronic device 10 can also be referred to as the back of the electronic device 10.
  • the second display screen 142 is located on the second surface of the electronic device 10.
  • The included angle of the first display screen 141 is α. It should be noted that the included angle of the first display screen 141 can take values in the range [0°, 180°]. When the angle of the first display screen 141 is 0°, the first display screen 141 is in the folded or closed state; when the angle of the first display screen 141 is 180°, the first display screen 141 is in the unfolded state.
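As a tiny illustration of this angle-to-state mapping (the enum and thresholds are only assumptions consistent with the ranges above):

```kotlin
// Minimal sketch: map the hinge angle of the foldable first display screen to a fold state,
// following the description above (0° = folded/closed, 180° = unfolded).
enum class FoldState { CLOSED, PARTIALLY_FOLDED, UNFOLDED }

fun foldStateForAngle(angleDegrees: Int): FoldState = when {
    angleDegrees <= 0 -> FoldState.CLOSED
    angleDegrees >= 180 -> FoldState.UNFOLDED
    else -> FoldState.PARTIALLY_FOLDED
}
```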
  • the first display screen 141 may also be referred to as an internal screen or a main screen
  • the second display screen 142 may also be referred to as an external screen or an auxiliary screen.
  • the electronic device 10 is an electronic device with a non-foldable screen, and includes a first display screen 141.
  • the electronic device 10 includes a first surface and a second surface, and the first display screen 141 is located on the first surface of the electronic device.
  • the first side of the electronic device 10 is shown in FIG. 4A
  • the second side of the electronic device 10 is shown in FIG. 4C.
  • the electronic device 10 only includes the first display screen 141.
  • the electronic device 10 may further include a second display screen 142.
  • the first display screen 141 is located on the first surface of the electronic device 10, and the second display screen 142 is located on the second surface of the electronic device 10.
  • the first side of the electronic device 10 is shown in FIG. 4A.
  • the second side of the electronic device 10 may be as shown in FIG. 4B.
  • In this case, the first display screen 141 may also be referred to as the main screen, and the second display screen 142 may also be called the secondary screen.
  • the electronic device 10 is an electronic device with a foldable screen, and includes a first display screen 141.
  • The first display screen 141 is a foldable screen. When the first display screen 141 is in the unfolded state, it may be as shown in FIG. 5A, and when it is in the closed or folded state it may be as shown in FIG. 5B.
  • FIG. 5C is a schematic diagram of the first display screen 141 when it is folded to an included angle of α.
  • the electronic device 10 is an electronic device with a retractable screen, and includes a first display screen 141.
  • the state after the first display screen 141 is stretched may be as shown in FIG. 6A, and the state after the first display screen 141 is contracted may be as shown in FIG. 6B.
  • the sensor module 150 may include one or more sensors. For example, touch sensor 150A, pressure sensor 150B, distance sensor 150C, etc. In other embodiments, the sensor module 150 may also include a gyroscope, an acceleration sensor, a fingerprint sensor, an ambient light sensor, a proximity light sensor, a bone conduction sensor, a temperature sensor, and so on.
  • the touch sensor 150A may also be referred to as a “touch panel”.
  • the touch sensor 150A may be provided on the first display screen 141 and/or the second display screen 142. Take the touch sensor 150A provided on the first display screen 141 as an example.
  • the touch sensor 150A and the first display screen 141 form a first touch screen, also called "first touch screen”.
  • the touch sensor 150A is used to detect touch operations acting on or near it.
  • the touch sensor 150A may transmit the detected touch operation to the application processor to determine the type of touch event.
  • the electronic device 10 may provide visual output related to touch operations and the like through the first display screen 141.
  • the touch sensor 150A may also be disposed on the surface of the electronic device 10, which is different from the position of the first display screen 141.
  • the pressure sensor 150B is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 150B may be provided on the first display screen 141 and/or the second display screen 142. Among them, touch operations that act on the same touch position but have different touch operation strengths can correspond to different operation instructions.
  • The distance sensor 150C is used to measure distance.
  • the electronic device 10 may use the distance sensor 150C to measure the distance to achieve rapid focusing.
  • the distance sensor 150C may also be used to determine whether the first display screen 141 and/or the second display screen 142 is blocked. For example, when the first side of the electronic device 10 is shown in FIG. 3A and the second side of the electronic device 10 is shown in FIG. 3B, if the first display screen 141 is in the closed state or folded state, the distance sensor 150C can be used for It is determined whether the second display screen 142 is blocked.
  • As another example, the distance sensor 150C is used to determine whether the first display screen 141 and the second display screen 142 are blocked.
  • the distance sensor 150C is used to determine whether the first display screen 141 is blocked.
  • the electronic device 10 may implement audio functions through the audio module 160, the speaker 161, the receiver 162, the microphone 163, the earphone interface 164, and the application processor. For example, audio playback function, recording function, voice wake-up function, etc.
  • the audio module 160 can be used to perform digital-to-analog conversion and/or analog-to-digital conversion on audio data, and can also be used to encode and/or decode audio data.
  • the audio module 160 may be provided in the processor 110, or part of the functional modules of the audio module 160 may be provided in the processor 110.
  • the speaker 161 also called a “speaker” is used to convert audio data into sound and play the sound.
  • the electronic device 100 may listen to music through the speaker 161, answer a hands-free call, or issue a voice prompt, etc.
  • the receiver 162 also called “earpiece” is used to convert audio data into sound and play the sound. For example, when the electronic device 100 answers a call, the receiver 162 may be brought close to the human ear to answer the call.
  • The microphone 163, also known as a "mic", is used to collect sounds (such as ambient sounds, including sounds made by people, sounds made by equipment, etc.) and convert the sounds into audio electrical data.
  • the electronic device may be provided with at least one microphone 163.
  • When two microphones 163 are provided in the electronic device, in addition to collecting sound, they can also implement a noise reduction function.
  • three, four or more microphones 163 may be provided in the electronic device, so that in addition to sound collection and noise reduction, sound source identification or directional recording functions can also be realized.
  • the earphone interface 164 is used to connect wired earphones.
  • The earphone interface 164 may be the USB interface 172, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface, etc.
  • the button 170 may include a power button, a volume button, and the like.
  • the button 170 may be a mechanical button or a virtual button.
  • the electronic device 10 can generate signal inputs related to user settings and function control of the electronic device 10 in response to the operation of the keys.
  • the electronic device 10 may lock the screen of the first display screen 141 in response to pressing the power button, and trigger the execution of the screen projection method of the embodiment of the present application.
  • the power button in the embodiment of the present application may also be referred to as a power-on button, a side button, etc., and the name of the power button is not limited.
  • the SIM card interface 171 is used to connect to a SIM card.
  • the SIM card can be inserted into the SIM card interface 171 or pulled out from the SIM card interface 171 to achieve contact and separation with the electronic device 10.
  • the electronic device 10 may support 1 or K SIM card interfaces 171, and K is a positive integer greater than 1.
  • the SIM card interface 171 may support Nano SIM cards, Micro SIM cards, and/or SIM cards, etc.
  • the same SIM card interface 171 can insert multiple SIM cards at the same time.
  • the types of the multiple SIM cards can be the same or different.
  • the SIM card interface 171 can also be compatible with different types of SIM cards.
  • the SIM card interface 171 may also be compatible with external memory cards.
  • the electronic device 10 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 10 may also adopt an eSIM card, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 10 and cannot be separated from the electronic device 10.
  • the USB interface 172 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 172 can be used to connect a charger to charge the electronic device 10, and can also be used to connect the electronic device 10 to earphones and play sounds through the earphones.
  • the USB interface 172 can be connected to a headset, it can be understood that the USB interface 172 is used as a headset interface.
  • the USB interface 172 can be used as a headphone interface, and can also be used to connect to other electronic devices, such as AR devices, computers, and so on.
  • the charging management module 180 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • The charging management module 180 may receive the charging input of a wired charger through the USB interface 172.
  • The charging management module 180 may receive wireless charging input through the wireless charging coil of the electronic device 10. While the charging management module 180 charges the battery 182, it can also supply power to the electronic device 10 through the power management module 181.
  • the power management module 181 is used to connect the battery 182, the charging management module 180, and the processor 110.
  • the power management module 181 receives input from the battery 182 and/or the charging management module 180, and supplies power to the processor 110, the internal memory 121, the first camera 131, the second camera 132, the first display screen 141, and the like.
  • the power management module 181 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 181 may also be provided in the processor 110.
  • the power management module 181 and the charging management module 180 may also be provided in the same device.
  • the mobile communication module 191 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 10.
  • the mobile communication module 191 may include a filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • LNA low noise amplifier
  • The wireless communication module 192 can provide wireless communication solutions applied to the electronic device 10, including wireless local area network (WLAN) (such as a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 192 may be one or more devices integrating at least one communication processing module.
  • the electronic device 10 may send screen projection content and/or screen projection instructions to the target device through the wireless communication module 192.
  • the antenna 1 of the electronic device 10 is coupled with the mobile communication module 191, and the antenna 2 is coupled with the wireless communication module 192, so that the electronic device 10 can communicate with other devices.
  • the mobile communication module 191 may communicate with other devices through the antenna 1
  • The wireless communication module 192 may communicate with other devices through the antenna 2.
  • FIG. 7 shows a schematic diagram of the software architecture of a source device and a target device in an embodiment of the present application.
  • the source device includes an input module 710A, a processing module 720A, and an output module 730A.
  • the input module 710A is used to detect the user's operation and report the user's operation to the processing module 720A.
  • the user's operation may be a touch operation or a non-touch operation.
  • For example, the operation of displaying a certain user interface on the first display screen 141 or the second display screen 142, the operation of folding the first display screen 141, the operation of pressing the power button, the operation of closing the outer cover of the device, the operation of closing the cover of the smart protective cover, etc.
  • the input module 710A can detect the user's operation through a mechanical shaft, a touch sensor, a button, etc., which is not limited.
  • the processing module 720A is configured to receive the user's operation reported by the input module 710A, and identify the operation type of the user's operation reported by the input module 710A. For example, when the operation type of the user's operation is a screen-off operation, the screen is turned off on the first display screen 141, and after the screen is turned off on the first display screen 141, a screen projection is triggered.
  • the processing module 720A includes an operation recognition module 721A, a projection judgment module 722A, a content acquisition module 723A, a device acquisition module 724A, and the like.
  • the operation recognition module 721A is used to recognize the operation type of the user operation reported by the input module 710A.
  • the projection judgment module 722A is used to judge whether the smart projection function is turned on, or whether the first display screen 741 or the second display screen 742 is blocked, etc.
  • the content acquisition module 723A is used to acquire the projected content.
  • the device acquisition module 724A is used to determine the target device for receiving the screened content.
  • the output module 730A is used to establish a connection with the target device and send screen content to the target device. As another example, the output module 730A is also used to control the second display screen 742 or the first display screen 741 to display related information. For example, the output module 730A is used to control the second display screen 742 or the first display screen 741 to display a control interface.
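The module split described for the source device (FIG. 7) could be outlined as below; this is an illustration only, and the interfaces simply mirror the responsibilities named above rather than any real implementation.

```kotlin
// Hypothetical outline of the source-device modules: operation recognition (721A),
// projection judgment (722A), content acquisition (723A), device acquisition (724A),
// and the output module (730A) that connects to the target device and sends content.
interface OperationRecognition { fun recognize(rawEvent: String): String }
interface ProjectionJudgment { fun shouldProject(): Boolean }
interface ContentAcquisition { fun acquire(): ByteArray }
interface DeviceAcquisition { fun targetDevice(): String? }
interface OutputModule { fun sendTo(device: String, content: ByteArray) }

class SourceProcessingModule(
    private val ops: OperationRecognition,
    private val judge: ProjectionJudgment,
    private val content: ContentAcquisition,
    private val devices: DeviceAcquisition,
    private val output: OutputModule,
) {
    // Rough handling of an event reported by the input module 710A.
    fun onInputEvent(rawEvent: String) {
        if (ops.recognize(rawEvent) != "screen-off") return   // only the screen-off operation triggers projection here
        if (!judge.shouldProject()) return                    // e.g. smart projection disabled or screen blocked
        val target = devices.targetDevice() ?: return
        output.sendTo(target, content.acquire())
    }
}
```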
  • the target device includes an input module 710B, a processing module 720B, and an output module 730B.
  • The input module 710B is used to establish a connection with the source-end device and receive the screen projection content or screen projection instructions sent by the source device.
  • the input module 710B includes a device connection module 711B, a content interaction module 712B, and an instruction interaction module 713B.
  • the device connection module 711B is used to establish a connection with the source device.
  • The content interaction module 712B is configured to receive the screencast content sent by the source device and send the screencast content to the processing module 720B.
  • the instruction interaction module 713B is used to receive a screencasting instruction sent by the source device, for example, a screencasting cancellation instruction.
  • The processing module 720B is configured to, after receiving the projected content sent by the content interaction module 712B, re-layout or crop the projected content, and send the re-laid-out or cropped projected content to the output module 730B.
  • the output module 730B is configured to present or display the rearranged screencast content after receiving the rearranged screencast content sent by the processing module 720B.
  • the source device is an electronic device with a foldable screen.
  • the source device includes an internal screen and an external screen, the internal screen is a foldable screen, and the external screen is a non-foldable screen.
  • the inner screen of the source device may be the first display screen 141 shown in FIG. 3A or 3B, and the outer screen of the source device may be the second display screen 142 shown in FIG. 3B.
  • FIG. 8 is a schematic flowchart of a screen projection method according to an embodiment of this application, which specifically includes the following steps.
  • Step 801: The inner screen of the source device is in use, and an operation of folding the inner screen from the unfolded state to the closed state is received.
  • the internal screen of the source device is in use, which can be understood as:
  • When the internal screen is in the unfolded state and is not locked, the source device displays a corresponding user interface on the internal screen, such as the main interface, the minus-one screen (-1 screen), the user interface of a certain application, etc.
  • The user can perform corresponding operations as needed while the internal screen displays the corresponding user interface, so that the source device responds to the user's operation and performs the corresponding display on the internal screen. That is, when the inner screen of the source device is in the unfolded state, is not locked, and is displaying a user interface, the user can perform corresponding operations on the source device and control what the inner screen displays, so as to satisfy the user's needs.
  • For example, when the source device detects that the user clicks an application icon on the desktop (for example, the iQiyi icon), in response to the operation of clicking the iQiyi icon, it displays the iQiyi user interface on the inner screen.
  • The user can then perform corresponding operations on the virtual buttons (or virtual keys) included in the user interface to achieve the corresponding control.
  • For example, when the internal screen displays the iQiyi user interface and the source device detects the user's operation on that interface (for example, clicking the virtual button that controls full-screen video display), it responds to the operation by displaying and playing the corresponding video in full screen on the internal screen, so that the user can watch the corresponding video.
  • As another example, when the source device displays the Baidu Maps user interface on the internal screen and detects the user's route-search operation on that interface, in response to the route-search operation it displays the corresponding route search result on the internal screen, making it convenient for the user to reach the corresponding destination.
  • the user interface displayed on the internal screen of the source device is the user interface of the source device running the application in the foreground.
  • the source device can run one or more applications in the foreground.
  • the source device can run one or more applications in the background while running one or more applications in the foreground.
  • the application running in the foreground of the source device is iQiyi
  • the user interface of iQiyi is displayed on the internal screen.
  • the source device may also run other applications in the background, such as Alipay, WeChat, etc.
  • the external screen is locked and the screen is black, or the external screen is locked but the default user interface is displayed.
  • the default user interface can include information such as the time and date. Specifically, the user can set the information displayed on the default user interface according to their needs.
  • the first display screen 141 is the inner screen of the electronic device 10
  • the second display screen 142 is the outer screen of the electronic device 10.
  • the processor 110 of the source device can detect the change in the rotation angle of the mechanical shaft of the first display screen 141 to determine whether the operation of folding the inner screen from the unfolded state to the closed state is received.
  • for example, when the inner screen is folded, the included angle of the first display screen 141 changes from 180° to 0°, and the mechanical shaft reports this event to the processor 110.
  • when the processor 110 receives the event, reported by the mechanical shaft, that the included angle of the first display screen 141 has changed from 180° to 0°, it determines that the operation of folding the inner screen from the unfolded state to the closed state is received.
  • the processor 110 of the source device may also collect data from other sensors that sense the angle change of the first display screen 141 to determine whether an operation of folding the inner screen from the unfolded state to the closed state is received (a minimal sketch follows below). It should be noted that the embodiment of the present application does not limit the manner in which the source device determines whether this folding operation is received.
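  • Purely as an illustration, the following Kotlin sketch shows one possible way to infer the fold-to-closed operation from hinge-angle readings; the HingeAngleSensor interface and the angle thresholds are assumptions of this sketch, not part of the embodiments described above.

```kotlin
// Hypothetical hinge-angle source; a real device would feed readings from its fold sensor.
interface HingeAngleSensor {
    fun currentAngleDegrees(): Float
}

enum class FoldState { UNFOLDED, CLOSED, INTERMEDIATE }

class FoldDetector(private val sensor: HingeAngleSensor) {
    private var lastState = FoldState.UNFOLDED

    // Maps a raw angle to a coarse fold state; the thresholds are illustrative assumptions.
    private fun classify(angle: Float): FoldState = when {
        angle >= 170f -> FoldState.UNFOLDED
        angle <= 10f -> FoldState.CLOSED
        else -> FoldState.INTERMEDIATE
    }

    // Returns true once when the screen goes from unfolded (about 180 deg) to closed (about 0 deg).
    fun foldToClosedReceived(): Boolean {
        val state = classify(sensor.currentAngleDegrees())
        val folded = lastState == FoldState.UNFOLDED && state == FoldState.CLOSED
        if (state != FoldState.INTERMEDIATE) lastState = state
        return folded
    }
}
```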
  • Step 802 In response to receiving the operation of folding the inner screen from the unfolded state to the closed state, the source device turns off the inner screen.
  • when the inner screen is folded into the closed state, it can no longer normally present a user interface to the user. Therefore, in order to save power, turning off the inner screen can be understood as the inner screen going black and being locked, or the inner screen going black without being locked.
  • after the source device folds the inner screen into the closed state, the user cannot use the inner screen normally, that is, the user cannot operate the inner screen to control the source device. Take the source device being the electronic device 10 in the application scenario shown in FIG. as an example.
  • when the processor 110 of the electronic device 10 determines that it has received the operation of folding the first display screen 141 from the unfolded state to the closed state, it determines, according to that operation, that the operation type is a screen-off operation, and then turns off the first display screen 141.
  • in response to receiving the operation of folding the inner screen from the unfolded state to the closed state, the source device can also automatically unlock the outer screen and automatically map the user interface displayed on the inner screen to the outer screen for display.
  • for example, when the user interface of iQiyi is displayed on the internal screen, in response to the operation of folding the internal screen from the unfolded state to the closed state, the external screen is unlocked and the iQiyi user interface displayed on the internal screen is automatically mapped onto the external screen for display.
  • alternatively, in response to the operation of folding the inner screen from the unfolded state to the closed state, the outer screen of the source device may continue to remain black and/or locked, or continue to remain locked and display the default user interface, etc., which is not limited.
  • Step 803 After turning off the inner screen in response to the operation of folding the inner screen from the expanded state to the closed state, the source device determines a target application from at least one currently running application.
  • the at least one currently running application program may include an application program currently running in the foreground and/or the background.
  • the target application program may be an application program that meets the first preset condition among at least one currently running application program.
  • the first preset condition can be set according to actual conditions, which is not limited.
  • the target application is an application identified in the whitelist among at least one currently running application.
  • the whitelist includes the identification of the application that supports the projection function, which can be set by the user according to their own needs, or set by the source device before leaving the factory, or generated by the source device according to a preset strategy.
  • for example, the preset policy stipulates that audio and video, map, reading, and instant messaging (IM) applications are applications that support the screen projection function, and the source device can generate the whitelist from the identifications of the installed applications whose types meet the requirements of the preset policy.
  • for example, the whitelist generated according to the preset policy includes the identification of iQiyi and the identification of WeChat. It should be understood that when a newly installed application is detected on the source device, it can be judged whether the application belongs to a type specified by the preset policy, and when it does, the identification of the application is added to the whitelist. Alternatively, in other embodiments, the whitelist includes the identifications of applications that do not support the screen projection function; in this case, the target application may be an application, among the at least one currently running application, whose identification does not belong to the whitelist.
  • the identifier of the application program may be the package name of the application program, or the icon of the application program, or customized according to needs, which is not limited.
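  • As an illustrative sketch only of the whitelist logic described above (the application types, package names, and helper types below are assumptions of this sketch):

```kotlin
// Illustrative application categories used by the assumed preset policy.
enum class AppType { AUDIO_VIDEO, MAP, READING, IM, OTHER }

data class InstalledApp(val packageName: String, val type: AppType)

// Types that the assumed preset policy treats as supporting screen projection.
val projectableTypes = setOf(AppType.AUDIO_VIDEO, AppType.MAP, AppType.READING, AppType.IM)

// Generates the whitelist (application identifications, here package names) from installed apps.
fun generateWhitelist(installed: List<InstalledApp>): Set<String> =
    installed.filter { it.type in projectableTypes }
        .map { it.packageName }
        .toSet()

// Called when a newly installed application is detected: add it if its type matches the policy.
fun onNewAppDetected(app: InstalledApp, whitelist: MutableSet<String>) {
    if (app.type in projectableTypes) whitelist.add(app.packageName)
}
```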
  • the target application is an application whose service is in progress among currently running applications.
  • video-type applications such as iQiyi, Youku, Tencent Video, etc.
  • music-playing applications such as Xiami Music, NetEase Cloud Music, etc.
  • map-type applications such as Baidu Maps, AutoNavi Maps, etc.
  • game-type applications such as Honor of Kings, Tetris, etc.
  • for game-type applications, a service being in progress can be understood as a game being in progress.
  • for instant-messaging-type applications, a service being in progress can be understood as text input, a voice call, a video call, a file transfer, etc.
  • the application program in which the service is in progress can be an application program running in the foreground or an application program running in the background.
  • the target application is an application that is identified in the whitelist and whose service is in progress among currently running applications.
  • the white list includes the identification of the application that supports the projection function.
  • the target application is an application currently running in the foreground.
  • the source device does not cast a screen when none of the at least one currently running application meets the first preset condition, or when no application is currently running; a sketch of this selection logic is given below.
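  • The following sketch, under assumed helper types, illustrates one possible form of the first preset condition discussed above (identification in the whitelist and a service in progress, falling back to the foreground application); it is illustrative only and not the only implementation.

```kotlin
data class RunningApp(
    val packageName: String,
    val inForeground: Boolean,
    val serviceInProgress: Boolean // e.g. playing video or audio, navigating, in a call
)

// Picks a target application according to an assumed first preset condition:
// prefer a whitelisted app with a service in progress, then any whitelisted foreground app.
// Returns null when no running application meets the condition (no screen cast is triggered).
fun pickTargetApp(running: List<RunningApp>, whitelist: Set<String>): RunningApp? {
    val whitelisted = running.filter { it.packageName in whitelist }
    return whitelisted.firstOrNull { it.serviceInProgress }
        ?: whitelisted.firstOrNull { it.inForeground }
}
```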
  • step 804 the source device obtains the screencast content from the target application, and sends the screencast content to the target device.
  • the source device may send the screencast content to the target device based on technologies such as Miracast, AirPlay, DLNA, or Hicast.
  • the connection between the source device and the target device can be established before the user's operation of folding the inner screen from the unfolded state to the closed state is received, or after that operation is received. For example, after turning off the internal screen, the source device determines the target device for receiving the projected content and then initiates a connection establishment procedure toward the target device. After the connection with the target device is established, the source device sends the screencast content to the target device once it obtains the screencast content from the target application; a sketch of this flow is given below.
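  • The sketch below outlines this connect-then-send flow; the ProjectionSink/ProjectionSession abstractions stand in for whatever transport (Miracast, DLNA, etc.) is actually used and are assumptions of this illustration.

```kotlin
// Hypothetical transport abstractions; a real implementation would wrap Miracast, DLNA, etc.
interface ProjectionSink {
    fun connect(): ProjectionSession
}

interface ProjectionSession {
    fun send(frame: ByteArray)
    fun close()
}

// After the inner screen is turned off: connect to the chosen target sink,
// then forward screencast content obtained from the target application.
fun castTo(sink: ProjectionSink, frames: Sequence<ByteArray>) {
    val session = sink.connect()
    try {
        frames.forEach { session.send(it) } // screencast content from the target application
    } finally {
        session.close()
    }
}
```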
  • the source device can determine the target device that receives the projected content based on the following methods:
  • the source device can obtain the identifications of surrounding electronic devices that support the projection function based on communication technologies such as Bluetooth and/or Wi-Fi, and determine the target device identification from the acquired identifications.
  • the electronic device identified by the target device identifier is the target device for receiving the projected content.
  • the target device identifier may be an identifier that meets the second preset condition among the identifiers of surrounding electronic devices that support the screen projection function.
  • the second preset condition can be set accordingly according to actual needs, which is not limited.
  • the target device identifier is an identifier in the trusted list among the identifiers of surrounding electronic devices that support the screen projection function.
  • the trusted list includes the identification of at least one electronic device, which may be added by the user according to their own needs, or may be the identification of an electronic device that has previously been connected to the source device.
  • the identification of the electronic device included in the trusted list may be the identification of the private electronic device added by the user, or the identification of the private electronic device that has been connected to the source device.
  • private electronic devices refer to electronic devices in non-public places, such as TVs at home, desktop computers in dormitories, etc., while electronic devices in public places may be displays in conference rooms.
  • the source device can obtain its own geographic location information and, based on that information, determine whether its current location is a public place. When the current location is not a public place, the source device selects one of the obtained identifications of the surrounding electronic devices that support the screen projection function as the target device identification (a combined sketch of this selection logic follows the example below).
  • when the geographic location of the source device is a public place, it further determines whether its external screen is blocked. If the external screen is not blocked, the identifications of the surrounding electronic devices that support the projection function are displayed on the external screen, and the user can select, according to their own needs, one of the displayed identifications as the target device identification. Take the source device being the electronic device 10 as an example, as shown in FIG.
  • the identification of the surrounding electronic devices that support the screen projection function acquired by the electronic device 10 includes the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40 .
  • the electronic device 10 displays the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40 on the external screen, for example, as shown in FIG. 10.
  • when the user selects the identification of the electronic device 30, the electronic device 10 determines that the target device identification is the identification of the electronic device 30.
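  • For illustration only, the following sketch combines the trusted-list and public-place rules above into one selection routine; the data types and the way location is classified are assumptions of this sketch.

```kotlin
data class CandidateDevice(val id: String, val supportsProjection: Boolean)

sealed class TargetChoice {
    data class Auto(val deviceId: String) : TargetChoice()          // pick silently from trusted list
    data class AskUser(val deviceIds: List<String>) : TargetChoice() // show the list on the external screen
    object NoProjection : TargetChoice()                             // e.g. external screen blocked
}

fun chooseTarget(
    candidates: List<CandidateDevice>,
    trustedList: Set<String>,
    inPublicPlace: Boolean,
    externalScreenBlocked: Boolean
): TargetChoice {
    val projectable = candidates.filter { it.supportsProjection }.map { it.id }
    return when {
        !inPublicPlace ->
            projectable.firstOrNull { it in trustedList }?.let { TargetChoice.Auto(it) }
                ?: TargetChoice.NoProjection
        externalScreenBlocked -> TargetChoice.NoProjection
        else -> TargetChoice.AskUser(projectable)
    }
}
```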
  • in some embodiments, the electronic device 10 responds to the operation of folding the inner screen from the unfolded state to the closed state by turning off the inner screen while keeping the outer screen locked. In this scenario, in order to view and operate the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40 displayed on the outer screen, the user first needs to unlock the external screen of the electronic device 10.
  • for example, the electronic device 10 can unlock its external screen by recognizing the user's face or fingerprint.
  • in other embodiments, the user may operate the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40 displayed on the external screen without being restricted by the lock screen of the external screen. That is, even when the external screen is locked, the electronic device 10 displays the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40, and the user can perform corresponding operations on these identifications without unlocking the external screen.
  • the electronic device 10 also displays a virtual button for controlling the cancellation of the screen projection on the external screen.
  • the user can click or touch this virtual button to make the electronic device 10 cancel the screen projection.
  • optionally, the electronic device 10 also displays a prompt on the external screen indicating that, if no screen projection device is selected within a preset time period, the screen projection is automatically cancelled.
  • the preset duration can be 10s, 15s, etc., which can be set accordingly according to user needs.
  • the electronic device 10 may automatically black out the screen or display the default user interface after canceling the screen projection.
  • identification of the electronic device in the embodiment of the present application may include the icon of the electronic device, the name of the electronic device, etc., which is not limited.
  • when the external screen is blocked, for example, when the source device is placed in a bag or with the external screen facing the desktop, the source device no longer casts the screen.
  • the source device can also refer to device capability information (such as whether the device includes a display screen, whether it includes a speaker, and whether it supports touch), device attribute information (such as the resolution and sound effects of the device's display screen), and the current operating status of the device (for example, whether it is playing video or audio, or whether it is communicating with other devices) to determine the target device identification from the acquired identifications of surrounding electronic devices that support the screen projection function.
  • the device capability information, device attribute information, and current operating status of the device may be obtained by the source device in the process of obtaining the identification of the electronic device that supports the screen projection function.
  • the electronic device identified by the target device identifier may be an electronic device that does not currently play a video, includes a display screen, and whose display screen resolution is greater than a first threshold. Wherein, the first threshold may be set according to actual needs.
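  • A minimal sketch of such capability- and status-based filtering, with assumed field names and an assumed value for the first threshold, might look as follows:

```kotlin
data class DeviceInfo(
    val id: String,
    val hasDisplay: Boolean,
    val displayWidthPx: Int,
    val displayHeightPx: Int,
    val currentlyPlayingVideo: Boolean
)

// First threshold on resolution, expressed here as a minimum pixel count; the value is illustrative.
const val FIRST_THRESHOLD_PIXELS = 1280 * 720

// Keeps only devices that have a display, are not already playing video,
// and whose display resolution exceeds the first threshold.
fun filterByCapability(devices: List<DeviceInfo>): List<DeviceInfo> =
    devices.filter {
        it.hasDisplay &&
        !it.currentlyPlayingVideo &&
        it.displayWidthPx * it.displayHeightPx > FIRST_THRESHOLD_PIXELS
    }
```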
  • the source device may scan and acquire nearby connectable electronic devices based on Bluetooth, and/or Wi-Fi.
  • taking Bluetooth as an example, when the source device obtains the identifications of multiple electronic devices by Bluetooth scanning, it can determine one or more device identifications from them, establish a connection with the target device(s) identified by the determined identification(s), and send the screencast content to those target device(s).
  • the source device may determine a device identifier from the device identifiers of multiple target devices, and send the screencast content to the target device identified by the determined device identifier.
  • in some embodiments, the source device determines whether the smart screen projection function is enabled; when the smart screen projection function is enabled, step 803 and step 804 are executed, and when it is not enabled, step 803 and step 804 are not executed.
  • the smart screen projection function is turned on or off by the source device in response to the user's operation. For example, a virtual button for controlling the smart screen projection function is set on the system settings interface; when the user switches this virtual button from OFF to ON, the source device turns on the smart screen projection function, and when the user switches it from ON to OFF, the source device turns off the smart screen projection function.
  • the system setting interface may be a user interface 1100 as shown in FIG. 11A.
  • the user interface 1100 includes a virtual button 1101, where the virtual button 1101 is used to control turning on or off the smart screen projection function.
  • the virtual button used to control turning on or off the smart screen projection function can also be set on other user interfaces.
  • for example, the virtual button used to control turning the smart screen projection function on or off is set on a pull-up interface or a drop-down interface, such as a notification bar, a system toolbar, or a control bar.
  • for example, the pull-up interface may be displayed by the source device in response to the user's operation of sliding up on the external screen or the internal screen, and the drop-down interface may be displayed by the source device in response to the user's operation of sliding down on the external screen or the internal screen.
  • the source device prompts the user whether to cast the screen after performing step 802 or after determining that the smart screen projection function is enabled.
  • for example, the source device can display a prompt box 1110 on the external screen, where the prompt box 1110 includes prompt information, a confirmation option "Yes", and a denial option "No".
  • the prompt information is used to ask the user whether to cast the screen.
  • the prompt message can be "Please confirm whether to cast the screen?” etc.
  • the source device may continue to perform step 803 and step 804 in response to the user selecting the confirmation option "Yes", and the source device may not continue to perform step 803 and step 804 in response to the user selecting the denial option "No".
  • after the source device displays the prompt box 1110 on the external screen, if the user does not operate the prompt box within a preset duration, the source device may by default treat this as the user consenting to the screen cast, or by default treat it as the user rejecting the screen cast.
  • the preset duration can be 10s, 15s, etc., which is not limited.
  • when the default is to treat the lack of operation as consent, the source device can continue to perform step 803 and step 804.
  • in some embodiments, the source device displays the prompt box 1110 while the external screen is locked. In order to improve security, the user needs to unlock the external screen by fingerprint, password, or facial recognition before operating the prompt box 1110 displayed by the source device.
  • the source device may also display the user interface shown in FIG. 11C or FIG. 11D on the external screen, so that the user can cancel the screen cast at any time. It should also be noted that, in the embodiment of the present application, the order of judging whether the smart screen projection function has been turned on and prompting the user whether to cast the screen may not be limited.
  • the method may further include:
  • Step 805 After the source device locks the internal screen in response to the operation of folding the internal screen from the unfolded state to the closed state, it determines whether the smart screen projection function has been turned on. If the smart screen projection function is turned on, step 806 is executed. If the smart screen projection function is not enabled, this process ends.
  • step 806 the source device determines whether the external screen is blocked. If the external screen is not blocked, step 807 is executed. If the external screen is blocked, the process ends.
  • the source device can judge whether the external screen is blocked by means of a distance sensor, a camera, etc. For example, when the source device is placed in a bag or pocket, or is placed on a table with the external screen facing down, the source device can detect through a distance sensor or camera that the external screen is blocked.
  • when the source device is placed on a table with the external screen facing up, it can determine, through a camera on the same side as the external screen, that the external screen is not blocked.
  • the source device can also determine whether the external screen is blocked by other means, for example, by artificial intelligence (AI).
  • the purpose of judging whether the external screen is blocked is to detect whether the user intends to cast the screen. If it is recognized that the source device has been put in a pocket or bag, the source device considers that the user is no longer using the device and does not trigger a screen cast; a simple sketch of this check is given below.
  • the embodiments of the present application can also combine other parameters (such as time and location) acquired by other sensors (such as positioning sensors) to learn the user's habits of using the device, so as to determine more accurately whether the user intends to cast a screen and to improve the reliability with which the source device triggers screen projection.
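  • Purely as an illustration of the occlusion check described above (the sensor abstraction and the way the readings are combined are assumptions of this sketch):

```kotlin
// Hypothetical sensor readings made available to the projection logic.
data class ScreenSensors(
    val proximityNear: Boolean,   // distance sensor reports an object very close to the screen
    val cameraSeesScene: Boolean  // camera on the same side as the screen detects an open scene
)

// The external screen is treated as blocked when the proximity sensor is covered
// and the camera cannot see anything, e.g. the device is in a bag or lying face down.
fun externalScreenBlocked(s: ScreenSensors): Boolean =
    s.proximityNear && !s.cameraSeesScene

// Projection is only triggered when the user plausibly still intends to use the device.
fun shouldTriggerProjection(smartProjectionEnabled: Boolean, sensors: ScreenSensors): Boolean =
    smartProjectionEnabled && !externalScreenBlocked(sensors)
```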
  • the source device may prompt the user whether to cast the screen after performing step 806.
  • the prompt box 1110 shown in FIG. 11B is displayed on the external screen.
  • while the source device is obtaining the identification of at least one currently running application that supports the screen projection function and the identification of at least one electronic device used to receive screencast content, it can prompt the user about the current progress on the external screen.
  • for example, while the source device is obtaining the identifications of the currently running applications that support the screen projection function, a content-acquisition prompt can be displayed on the external screen, for example, as shown in Figure 11C; while the source device is obtaining the identifications of the electronic devices used to receive the screencast content, a device-acquisition prompt can be displayed on the external screen, for example, as shown in FIG. 11D. After the source device has acquired the identification of at least one currently running application that supports the screen projection function and the identification of at least one electronic device used to receive screencast content, step 807 is performed.
  • the source device displays on the external screen the identification of at least one currently running application supporting the screen projection function; and displays on the external screen the identification of at least one electronic device for receiving screen projection content.
  • the identification of the at least one currently running application that supports the screen projection function may be an identification, among the identifications of the currently running applications, that belongs to the whitelist, where the whitelist includes the identification of at least one application that supports the projection function.
  • the identification of the at least one electronic device used to receive the projected content may be obtained by the source device based on a communication technology such as Bluetooth and/or Wi-Fi.
  • for example, the identifications of the currently running applications that support the screen projection function include the identification of iQiyi, the identification of Kugou Music, and the identification of Douyin, and the identifications of the electronic devices for receiving screen projection content are the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40.
  • the source device displays the identification of iQiyi, the identification of Kugou Music, and the identification of Douyin on the external screen, and displays the identification of the electronic device 20, the identification of the electronic device 30, and the identification of the electronic device 40; these may be displayed on the same user interface, for example, as shown in FIG. 12A.
  • that is, the identification of at least one currently running application supporting the screen projection function and the identification of at least one electronic device for receiving screen projection content can be displayed on the same user interface or on different user interfaces, for example, as shown in FIG. 12B.
  • in some embodiments, in the case that at least two currently running applications support the screen projection function, step 807 is performed to display, on the external screen, the identification of one or more of the at least two applications that support the screen projection function.
  • when only one currently running application supports the screen projection function, the screencast content may be obtained from that application without prompting the user with the identification of the application.
  • alternatively, the user can still be prompted with the identification of that application, but the user does not need to operate to select it, and the source device obtains the screencast content from that application.
  • similarly, in the case that the source device obtains the identifications of at least two electronic devices for receiving the projected content, step 807 is performed to display, on the external screen, the identification of at least one of the at least two electronic devices for receiving the projected content.
  • for example, if the source device only obtains the identification of one electronic device for receiving screencast content, there is no need to display the identification of that electronic device on the external screen; or the identification may still be displayed on the external screen, but the user does not need to operate to select it.
  • Step 808 After the source device receives the user's selection of an application identification from the identifications of at least one currently running application that supports the screen projection function, it obtains the screencast content from the application identified by the identification selected by the user, and sends the screencast content to the electronic device identified by the identification of the electronic device selected by the user from the identifications of at least one electronic device for receiving the screencast content.
  • the application identified by the identifier of the application selected by the user may be referred to as a target application.
  • in some embodiments, after step 807, when the source device receives the identification of an application selected by the user from the identifications of at least one currently running application supporting the screen projection function and the identification of an electronic device selected by the user from the identifications of at least one electronic device for receiving screencast content, a user interface indicating that projection is in progress is displayed on the external screen, for example, the user interface shown in FIG. 13.
  • this embodiment of the application does not limit the sequence in which the source device obtains the identification of the application that supports the screen projection function and the identification of the electronic device used to receive the screencast content.
  • the source device can obtain the identification of the application that supports the screen projection function and the identification of the electronic device used to receive the screencast content simultaneously; or it can first obtain the identification of the application that supports the screen projection function and then the identification of the electronic device used to receive the screencast content; or it can first obtain the identification of the electronic device used to receive the screencast content and then the identification of the application that supports the screen projection function.
  • for example, the source device may display a content-acquisition prompt on the external screen while acquiring the identification of at least one currently running application that supports the projection function (for example, as shown in FIG. 11C), and after the acquisition, display the identification of the at least one application that supports the projection function on the external screen. In response to the user selecting an application identification, the external screen displays a device-acquisition prompt (for example, as shown in FIG. 11D); after the identification of at least one electronic device for receiving screencast content is acquired, the identification of the at least one electronic device is displayed on the external screen. In response to the user selecting a certain electronic device identification, the external screen displays a projection-in-progress interface (for example, as shown in Figure 13).
  • the source device can simultaneously obtain the screencast content and the identification of at least one electronic device for receiving the screencast content.
  • when the source device does not obtain the identification of any application that supports the screen projection function, it can prompt the user that the screen projection has failed, and further, it can also prompt the user with the reason for the failure. As another example, when the source device fails to obtain the identification of any electronic device used to receive the screencast content, it can prompt the user that the screen projection has failed and, further, prompt the user with the reason for the failure.
  • step 806 can be located before step 805, after step 807, or after step 808, or step 806 can be executed simultaneously with step 807 or step 808, etc., which is not limited.
  • when the source device detects that the external screen is blocked, it terminates the screen projection process.
  • the source device displays the control interface on the external screen after successfully sending the projected content to the target device.
  • the control interface includes virtual buttons with touch function.
  • for different target applications, the virtual buttons with touch functions included in the control interface are different.
  • when the source device receives the user's operation on the control interface, it responds to that operation to control the target application, so as to achieve the purpose of controlling the projected content.
  • the user's operation on the control interface can be an operation of a virtual button on the control interface for controlling a certain function, or can be a shortcut gesture operation on the control interface, etc., which is not limited.
  • the control interface can be as shown in Figure 14A, including a progress bar, a pause button, a fast-forward button, a selection button, a definition button, etc.
  • when the source device receives the user's operation on the definition button that switches the definition from SD to HD, the projected content presented on the target device is switched from SD to HD.
  • the control interface may be as shown in Figure 14B, including a progress bar, a pause button, a fast-forward button, a menu button, etc.
  • when the source device receives the user's operation of clicking the pause button, the audio played on the target device is paused.
  • the control interface may be as shown in Figure 14C, including a touch area and function buttons such as favorites, comments, and sharing.
  • the user can switch videos by sliding up and down in the touch area, or click to pause or start playing the video in the touch area.
  • the control interface may be as shown in FIG. 14D, including up, down, left, and right function buttons. Among them, the user can control the movement direction of the snake through the up, down, left, and right function buttons.
  • the control interface can also be as shown in Figure 14E, that is, the control interface is presented as a gamepad that provides the user with virtual buttons for controlling the game, which is more intuitive and vivid.
  • the control interface may be as shown in FIG. 14F or FIG. 14G, including an input method.
  • for example, when text input is adopted on the user interface of WeChat, the control interface may be as shown in FIG. 14F.
  • when voice input is adopted on the user interface of WeChat, the control interface may be as shown in FIG. 14G.
  • after performing step 803, or, in step 808, when the source device receives the identification of an application selected by the user from the identifications of at least one currently running application supporting the screen projection function and takes the application identified by that identification as the target application, the source device can display a corresponding control interface on the external screen according to the target application.
  • Example 1 The source device can determine the control interface corresponding to the type of the target application from preset control interfaces corresponding to different types of applications, and then display the control interface corresponding to the target application on the external screen.
  • the target application is iQiyi
  • the type of the application to which the target application belongs is the video type.
  • the source device may determine the control interface corresponding to video-type applications from the preset control interfaces corresponding to different types of applications, and then display the control interface corresponding to the type of application to which iQiyi belongs on the external screen. This helps simplify the implementation.
  • the preset control interfaces for different types of applications may be different.
  • for example, for video-type applications, the preset control interface may be as shown in FIG. 14A.
  • for music-type applications, the preset control interface may be different, for example, as shown in Figure 14B.
  • the control interface corresponding to different types of applications may be pre-set in the electronic device before the device leaves the factory, or the electronic device may obtain it from the server in advance according to the application installed by itself.
  • the manner in which control interfaces corresponding to different types of application programs are preset in the electronic device is not limited.
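  • As an illustration only of the per-type lookup described in Example 1 (the type names and interface identifiers below are assumptions of this sketch, not preset values defined by the embodiments):

```kotlin
enum class AppCategory { VIDEO, MUSIC, SHORT_VIDEO, GAME, IM, OTHER }

// Mapping from application type to a preset control-interface layout,
// e.g. configured before the device leaves the factory or fetched from a server in advance.
val presetControlInterfaces = mapOf(
    AppCategory.VIDEO to "control_layout_video",        // progress bar, pause, fast-forward, definition
    AppCategory.MUSIC to "control_layout_music",        // progress bar, pause, fast-forward, menu
    AppCategory.SHORT_VIDEO to "control_layout_swipe",  // touch area, favorites, comments, share
    AppCategory.GAME to "control_layout_gamepad"        // directional buttons / gamepad
)

fun controlInterfaceFor(targetAppCategory: AppCategory): String =
    presetControlInterfaces[targetAppCategory] ?: "control_layout_generic"
```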
  • this embodiment of the present application also provides another method for displaying a control interface on the external screen.
  • Example 2 The source device recognizes virtual buttons with touch functions from the user interface of the target application (a virtual button may also be called a user interface (UI) element or a control), where a virtual button with a touch function can be clicked, touched, or pressed by the user to realize the corresponding function (such as pausing playback or fast-forwarding). Then, the electronic device rearranges, crops, and/or zooms the recognized virtual buttons with touch functions, generates a control interface corresponding to the target application, and displays the control interface on the external screen.
  • the control interface includes at least one virtual button, which is used to enable the user to implement shortcut operations on the target application.
  • the at least one virtual button may include buttons with the same functions as all of the virtual buttons recognized by the source device from the user interface of the target application, or may include buttons with the same functions as only some of the virtual buttons recognized by the source device from the user interface of the target application.
  • having the control interface include a virtual button with the same function as a virtual button recognized on the user interface of the target application can be implemented in the following way:
  • the virtual buttons on the control interface are mapped to the user interface of the target application. For example, the position coordinates of a virtual button on the control interface are mapped to the position coordinates of the corresponding virtual button recognized on the user interface of the target application, so that the function of the virtual button on the control interface is the same as that of the recognized virtual button; a coordinate-mapping sketch is given below.
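  • A minimal sketch of such a coordinate mapping (all types and the hit-test scheme are assumptions of this illustration):

```kotlin
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val centerX get() = (left + right) / 2f
    val centerY get() = (top + bottom) / 2f
}

// A button shown on the external-screen control interface, linked to the bounds of the
// corresponding touch-capable button recognized on the target application's user interface.
data class MappedButton(val label: String, val controlBounds: Rect, val appBounds: Rect)

// Translates a tap on the control interface into the coordinates of the mapped
// button on the target application's user interface, so the same function is triggered.
fun mapTapToApp(buttons: List<MappedButton>, x: Float, y: Float): Pair<Float, Float>? {
    val hit = buttons.firstOrNull {
        x in it.controlBounds.left..it.controlBounds.right &&
        y in it.controlBounds.top..it.controlBounds.bottom
    } ?: return null
    return hit.appBounds.centerX to hit.appBounds.centerY
}
```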
  • the at least one virtual button may also include function buttons other than the buttons whose functions are the same as the virtual buttons recognized on the user interface of the target application, such as a button for cancelling the projection, a button for switching the target device that receives the projected content, and/or a button for switching the target application being projected.
  • for example, the electronic device receives the user's operation of clicking the cancel-projection button and, in response, ends the screen projection.
  • the user interface of the target application can be as shown in A in Figure 15.
  • the source device recognizes the virtual button 1501 in A in Figure 15, and then re-crops and re-lays out the icon of the virtual button 1501 to obtain the virtual button 1502 shown in B in Figure 15, where the virtual button 1502 is a virtual button included on the control interface corresponding to Honor of Kings, and its function is the same as that of the virtual button 1501.
  • the position coordinates of the virtual button 1502 are associated with the position coordinates of the virtual button 1501.
  • the source device can identify the virtual button with touch function based on the related records of the historical usage of the target application (for example, the historical record of the user's click on the screen, etc.).
  • the source device can identify a virtual button with a touch function according to a software development kit (SDK) interface provided by the target application.
  • the source device may also identify the virtual buttons with touch functions based on a predefined location area, in the user interface of the target application, that is used for laying out virtual buttons with touch functions.
  • the embodiments of the present application may also identify virtual buttons with touch functions in other ways.
  • for example, the source device may perform semantic analysis on the user interface of the target application (for example, perform semantic analysis on the registration information of virtual buttons with touch functions), or perform image recognition on the user interface of the target application, to recognize the virtual buttons with touch functions.
  • in this way, the user interface displayed on the external screen can meet the needs of users. If the source device does not recognize any virtual button with a touch function on the user interface of the target application, it can determine the control interface corresponding to the type of the target application according to the control interfaces preconfigured for different application types, and display that control interface on the external screen. For the specific implementation, refer to the related introduction in Example 1, which will not be repeated here.
  • the user interface of the target application can be as shown in A in Figure 15.
  • if the source device does not recognize any virtual button with a touch function on the user interface shown in A in Figure 15, and the preset control interface corresponding to game-type applications is as shown in C in Fig. 15, the source device displays the control interface shown in C in Fig. 15 on the external screen.
  • the control interface includes a touch area. Users can swipe up, down, left, and right in the touch area to control the game.
  • in addition, based on steps 801 to 803, or based on steps 801, 802, and steps 805 to 807, after the source device sends the screencast content to the target device, if it receives the user's operation of unfolding the inner screen from the closed state to the unfolded state, then in response to that operation it no longer sends screencast content to the target device and displays, on the inner screen, the user interface of the application to which the screencast content belongs.
  • for example, the video on the user interface of iQiyi is being sent to the smart TV.
  • when the user unfolds the inner screen of the electronic device to the unfolded state, the electronic device responds to the operation of unfolding the inner screen from the closed state to the unfolded state by stopping the screen projection and mapping the iQiyi user interface onto the inner screen for display. Furthermore, the external screen no longer displays the control interface corresponding to iQiyi; for example, the external screen can be turned off.
  • it should be noted that, after the target device receives the projected content sent by the source device, it can crop or re-lay out the projected content and then present it on the target device.
  • alternatively, before sending the projected content, the source device can crop or re-lay out the content obtained from the target application according to the device attributes of the target device (such as resolution and touch capability), and then send the projected content to the target device. This helps enable the target device to display the projected content normally; a scaling sketch is given below.
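  • As a purely illustrative sketch of adapting content to the target device's resolution (aspect-ratio-preserving scaling with letterboxing; all types and the example sizes are assumptions):

```kotlin
data class Size(val width: Int, val height: Int)
data class Placement(val x: Int, val y: Int, val width: Int, val height: Int)

// Scales the source frame to fit the target display while preserving aspect ratio,
// centering it so the remaining area can be letterboxed.
fun fitToTarget(source: Size, target: Size): Placement {
    val scale = minOf(
        target.width.toDouble() / source.width,
        target.height.toDouble() / source.height
    )
    val w = (source.width * scale).toInt()
    val h = (source.height * scale).toInt()
    return Placement((target.width - w) / 2, (target.height - h) / 2, w, h)
}

// Example: a 2200x2480 inner-screen frame projected onto a 1920x1080 display.
val placement = fitToTarget(Size(2200, 2480), Size(1920, 1080))
```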
  • it should be noted that the method in which the source device displays the control interface corresponding to the target application on the external screen in the embodiments of this application can also be applied to screen projection methods other than those of the embodiments of this application, which is not limited.
  • the source device is an electronic device that only includes the first display screen, such as a tablet computer, a mobile phone, etc.
  • the first display screen of the source device may be the first display screen 141 shown in FIG. 4A, which is located on the front of the source device, and the back of the source device does not include a display screen.
  • FIG. 16 it is a schematic flowchart of another screen projection method according to an embodiment of this application, which specifically includes the following steps.
  • Step 1601 The first display screen of the source device is in use, and a screen-off operation on the first display screen is received.
  • for a specific introduction to the first display screen of the source device being in use, refer to the related introduction about the inner screen of the source device being in use in step 801, which will not be repeated here.
  • the screen-off operation of the first display screen may be an operation of the user pressing the power button, or a voice instruction of the user, or an operation of the user clicking a virtual button for controlling the screen-off, etc., which is not limited.
  • step 1602 the source device turns off the first display screen in response to the turn-off operation on the first display screen.
  • Step 1603 After turning off the first display screen, the source device judges whether the smart screen projection function is turned on. If the smart screen projection function is turned on, step 1604 is executed. If the smart screen projection function is not turned on, the process ends .
  • Step 1604 The source device judges whether the first display screen is blocked. If the first display screen is blocked, the flow ends. If the first display screen is not blocked, step 1605 is executed.
  • Step 1605 The source device determines the target application from the currently running applications. For the specific implementation manner in which the source device determines the target application from the currently running applications, refer to the related introduction in step 803, which will not be repeated here.
  • Step 1606 The source device obtains the screencast content from the target application, and sends the screencast content to the target device.
  • for the specific implementation of step 1606, refer to the relevant implementation of step 804, which will not be repeated here.
  • the source device may skip step 1603 and step 1604, and perform step 1605 and step 1606.
  • the method may further include:
  • Step 1607 The source device displays the identification of at least one currently running application supporting the screen projection function and the identification of at least one electronic device for receiving screen projection content on a partial area of the first display screen.
  • the size and position of the area on the first display screen that is used to display the identification of at least one currently running application supporting the projection function and the identification of at least one electronic device for receiving projection content can be set by the user as needed, or can be set by the source device before leaving the factory, and there is no limitation on this.
  • for example, the source device displays, on the area 1700 of the first display screen 141, the identification of at least one currently running application supporting the projection function and the identification of at least one electronic device used to receive screencast content.
  • for the manner of displaying, in a partial area of the first display screen, the identification of at least one currently running application supporting the screen projection function and the identification of at least one electronic device for receiving screen projection content, refer to the related introduction in Example 1, which will not be repeated here.
  • when displaying, on a partial area of the first display screen, the identification of at least one currently running application supporting the screen projection function and the identification of at least one electronic device for receiving screen projection content, the source device may also display the time and date on the first display screen.
  • Step 1608 After the source device receives the user's selection of an application identification from the identifications of at least one currently running application that supports the screen projection function, it obtains the screencast content from the application identified by the identification selected by the user, and sends the screencast content to the electronic device identified by the identification of the electronic device selected by the user from the identifications of at least one electronic device for receiving the screencast content.
  • the user can perform corresponding operations on the identification of the application and the identification of the electronic device displayed on a partial area of the first display screen while the first display screen is turned off, which helps reduce the number of user operation steps.
  • in some embodiments, in step 1608, when the source device receives the identification of an application selected by the user from the identifications of at least one currently running application supporting the screen projection function and the identification of an electronic device selected by the user from the identifications of at least one electronic device used to receive screencast content, it displays a user interface indicating that projection is in progress on a partial area of the first display screen, for example, the user interface shown in Figure 13.
  • after the screencast content is successfully sent to the target device, the control interface is displayed on a partial area of the first display screen.
  • for a specific introduction to displaying the control interface on a partial area of the first display screen, refer to the related introduction to displaying the control interface on the external screen in Example 1.
  • the area used to display the control interface on the first display screen and the area used to display the identification of the application program and the identification of the electronic device in step 1607 may be the same or different, which is not limited.
  • in addition, after the source device sends the screencast content to the target device, if it receives the user's operation of reusing the first display screen (for example, an unlocking operation), it no longer sends screencast content to the target device and displays, on the first display screen, the user interface of the application to which the screencast content belongs.
  • the unlocking operation of the user on the first display screen may be an operation of inputting a fingerprint, an operation of inputting a password, etc., which is not limited.
  • for example, the video on the user interface of iQiyi is being sent to the smart TV.
  • when the source device receives the user's unlocking operation on the first display screen, the screen projection is stopped and the user interface of iQiyi is mapped onto the first display screen for display.
  • it should be noted that, after the target device receives the screencast content sent by the source device, it can crop or re-lay out the screencast content and then present it on the target device.
  • alternatively, before sending the projected content, the source device can crop or re-lay out the content obtained from the target application according to the device attributes of the target device (such as resolution and touch capability), and then send the projected content to the target device. This helps enable the target device to display the projected content normally.
  • the source device is an electronic device including a first display screen and a second display screen.
  • the first display screen of the source device may be the first display screen 141 shown in FIG. 4A, which is located on the front of the source device, and the second display screen of the source device may be the second display screen 142 shown in FIG. 4B. Located on the back of the source device.
  • FIG. 18 is a schematic flowchart of another screen projection method according to an embodiment of this application, which specifically includes the foregoing steps 1601 and 1602, and after step 1602 is performed, the following steps are also performed:
  • Step 1803 After turning off the first display screen, the source device judges whether the smart screen projection function is turned on. If the smart screen projection function is turned on, step 1804 is executed. If the smart screen projection function is not turned on, the process ends .
  • Step 1804 The source device judges whether the first display screen and the second display screen are blocked. If both the first display screen and the second display screen are blocked, the flow ends. If one of the first display screen or the second display screen is not blocked, step 1805 is executed.
  • the source device first determines whether the second display screen is blocked. If the second display screen is not blocked, then step 1805 is executed. If the second display screen is blocked, then it determines whether the first display screen is blocked. If the first display screen is not blocked, step 1805 is executed. If the first display screen is blocked, the process ends.
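  • A small sketch of this screen-selection order (second screen preferred, then the first; the names are assumptions of this illustration):

```kotlin
enum class PromptTarget { SECOND_SCREEN, FIRST_SCREEN, NONE }

// Decides where to show the projection prompts and identifications, preferring the
// second display screen and falling back to the first; if both are blocked,
// the flow ends and no projection is triggered.
fun choosePromptScreen(firstBlocked: Boolean, secondBlocked: Boolean): PromptTarget = when {
    !secondBlocked -> PromptTarget.SECOND_SCREEN
    !firstBlocked -> PromptTarget.FIRST_SCREEN
    else -> PromptTarget.NONE
}
```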
  • Step 1805 The source device determines the target application from the currently running applications. For the specific implementation manner in which the source device determines the target application from the currently running applications, refer to the related introduction in step 803, which will not be repeated here.
  • Step 1806 The source device obtains the screencast content from the target application, and sends the screencast content to the target device.
  • for the specific implementation of step 1806, refer to the relevant implementation of step 804, which will not be repeated here.
  • the source device may skip step 1803 and step 1804, and perform step 1805 and step 1806.
  • step 1805 and step 1806, after step 1803 or step 1804, the method may further include:
  • Step 1807 When the second display screen is not blocked, the source device displays, on the second display screen, the identification of at least one currently running application supporting the projection function and the identification of at least one electronic device for receiving projection content.
  • when the second display screen of the source device is blocked and the first display screen is not blocked, the source device displays, on a partial area of the first display screen, the identification of at least one currently running application that supports the projection function and the identification of at least one electronic device for receiving the projected content.
  • for the specific implementation of displaying, on the second display screen, the identification of at least one currently running application supporting the screen projection function and the identification of at least one electronic device for receiving screen projection content, refer to the related description in Example 1, which will not be repeated here.
  • for the way of displaying, on a partial area of the first display screen, the identification of at least one currently running application that supports the screen projection function and the identification of at least one electronic device for receiving screen projection content, refer to the related introduction in Example 2, which will not be repeated here.
  • Step 1808 After the source device receives the user's selection of an application identification from the identifications of at least one currently running application that supports the screen projection function, it obtains the screencast content from the application identified by the identification selected by the user, and sends the screencast content to the electronic device identified by the identification of the electronic device selected by the user from the identifications of at least one electronic device for receiving the screencast content.
  • in some embodiments, in step 1808, when the source device receives the identification of an application selected by the user from the identifications of at least one currently running application supporting the screen projection function and the identification of an electronic device selected by the user from the identifications of at least one electronic device for receiving screencast content, it displays a user interface indicating that projection is in progress on a partial area of the first display screen, for example, the user interface shown in Figure 13.
  • the control interface is displayed on a partial area of the first display screen or on the second display screen.
  • the control interface includes virtual buttons with touch function. It should be noted that, for the specific implementation of displaying the control interface on a partial area of the first display screen or on the second display screen, refer to the related introduction in Example 1.
  • the area used to display the control interface on the first display screen and the area used to display the identification of the application and the identification of the electronic device in step 1807 may be the same or different, which is not limited.
  • the source device is an electronic device that includes a flip cover, such as a flip phone. As shown in Figure 19, it includes an inner screen and an outer screen. The inner screen is located inside the mobile phone cover, not shown in the figure, and the outer screen is located outside the mobile phone cover.
  • the following takes the source device being a flip phone as an example. In this scenario, compared with the foldable-screen electronic device in Example 1, the only difference in the projection method is that, in the flip-phone scenario, the source device receives the user's operation of closing the phone cover and turns off the internal screen.
  • the source device can unlock the internal screen in response to the user outputting a password or long pressing a preset key (for example, # key or * key or combination key, etc.). That is, in the scenario of a flip phone, when the source device receives the user's operation to close the phone cover when using the internal screen normally, it will respond to the user closing the phone cover to lock the internal screen, and then trigger the screen projection. Refer to Example 1, where the source device executes the steps after step 802, which will not be repeated here.
  • a preset key for example, # key or * key or combination key, etc.
  • Optionally, the source device may also be an electronic device fitted with a smart protective cover. The electronic device 10 includes a first display screen 141, and the electronic device 10 is fitted into the smart protective cover 20, where the smart protective cover 20 includes a visualization area 18. When the user closes the cover 16 of the smart protective cover 20, the device may be as shown in C of FIG. 20; when the user opens the cover 16 of the smart protective cover 20, the device may be as shown in B of FIG. 20.
  • In this scenario, compared with the electronic device with a foldable screen in Example 1, the only difference in the screen projection method is that the source device turns off the first display screen when it receives the user's operation of closing the cover of the smart protective cover. After the cover of the smart protective cover is opened, the source device can unlock the first display screen in response to the user entering a password, a fingerprint, or the like. That is, in the scenario of an electronic device fitted with a smart protective cover, when the source device receives the user's operation of closing the cover of the smart protective cover while the first display screen is in normal use, it locks the first display screen in response to the user closing the cover, and then triggers the screen projection. For details, refer to Example 1, where the source device executes the steps after step 602; these are not repeated here.
  • The identification of the application can be displayed in the visualization area 17 of the smart protective cover 20. For the specific display manner, refer to the related introduction in Example 1; details are not repeated here.
  • When the source device does not need to prompt the user with the identification of the application, the identification of the electronic device, or the control interface, the smart protective cover may not include the visualization area.
  • The source device can also be another electronic device with a foldable screen. The source device includes a first display screen, and the first display screen is a foldable screen. When the first display screen 141 of the source device is in the unfolded state, it may be as shown in FIG. 5A; when the first display screen of the source device is in the closed state, it may be as shown in FIG. 5B. The source device can present a corresponding interface to the user through the area 500 of the first display screen 141.
  • When the first display screen of the source device is in the unfolded state and in use, and an operation of folding the first display screen from the unfolded state to the closed state is received, screen projection to the target device is triggered.
  • For example, after the source device receives the operation of folding the first display screen 141 from the unfolded state to the closed state: if the projection fails, the content that was displayed on the first display screen 141 before the operation was received can be mapped to and displayed in the area 500 of the folded first display screen 141; if the projection succeeds, the control interface can be displayed in the area 1500 of the first display screen 141. For the manner of displaying the control interface in the area 1500 of the first display screen 141, see the related introduction in Example 1.
  • When the first display screen of the source device is in the closed state and an operation of unfolding the first display screen from the closed state to the unfolded state is received, the screen projection is stopped in response to the operation. For example, after the source device stops the screen projection, it can automatically display, on the first display screen, the user interface where the screen projection content is located. A small sketch of this fold-driven trigger logic is given below.
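  • To make the trigger logic concrete, here is a small, hedged sketch of the fold-driven transitions described for the foldable-screen scenario, again in plain Kotlin. The hooks `lockScreen`, `startProjection`, `showControlInterface`, `showPreviousContent`, `stopProjection`, and `restoreUi` are hypothetical placeholders, not APIs defined by this application; a real device would wire them to its lock-screen, projection, and window-management subsystems.

```kotlin
// Hypothetical fold states of the first (foldable) display screen.
enum class FoldState { UNFOLDED, CLOSED }

class FoldDrivenProjection(
    private val lockScreen: () -> Unit,
    private val startProjection: () -> Boolean,   // returns true if projection to the target device succeeds
    private val showControlInterface: () -> Unit, // e.g. shown in area 1500 when projection succeeds
    private val showPreviousContent: () -> Unit,  // e.g. map prior content into area 500 if projection fails
    private val stopProjection: () -> Unit,
    private val restoreUi: () -> Unit             // redisplay the projected user interface on the first screen
) {
    private var state = FoldState.UNFOLDED

    fun onFoldStateChanged(newState: FoldState) {
        if (newState == state) return
        state = newState
        when (newState) {
            // Folding from the unfolded state to the closed state while in use:
            // lock the screen, then trigger projection to the target device.
            FoldState.CLOSED -> {
                lockScreen()
                if (startProjection()) showControlInterface() else showPreviousContent()
            }
            // Unfolding from the closed state back to the unfolded state: stop the
            // projection and restore the projected user interface on the first screen.
            FoldState.UNFOLDED -> {
                stopProjection()
                restoreUi()
            }
        }
    }
}
```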
  • Optionally, the source device may also be an electronic device with a retractable screen, including a first display screen, where the first display screen is a retractable display screen. The state of the first display screen of the source device after being extended may be as shown in FIG. 6A, and the state of the first display screen after being retracted may be as shown in FIG. 6B.
  • When the first display screen of the source device is in the extended state and in use, and a retraction operation on the first display screen is received, screen projection to the target device is triggered. For the specific projection method, see the related introduction in Example 1.
  • For example, if the projection fails, the source device may display, on the retracted first display screen 141, the content that was displayed on the first display screen 141 before the operation was received; if the projection succeeds, the control interface can be displayed in the area 600 of the first display screen 141. For the manner of displaying the control interface in the area 600 of the first display screen 141, see the related introduction in Example 1.
  • When the first display screen of the source device is in the retracted state and an extension operation on the first display screen is received, the screen projection is stopped in response to the operation. For example, after the source device stops the screen projection, it can automatically display, on the first display screen, the user interface where the screen projection content is located.
  • The method provided in the embodiments of the present application is introduced above from the perspective of an electronic device serving as the execution subject. The electronic device may include a hardware structure and/or a software module, and implement the above functions in the form of a hardware structure, a software module, or a combination of a hardware structure and a software module. Whether one of the above functions is executed by a hardware structure, a software module, or a combination of the two depends on the specific application and the design constraints of the technical solution.
  • FIG. 21 shows a device 2100 provided by this application for executing the screen projection method shown in FIG. 8, FIG. 16, or FIG. 18.
  • the device 2100 includes a processing module 2101 and a transceiver module 2102.
  • The processing module 2101 is configured to detect a user operation and, in response to the user operation, trigger the transceiver module 2102 to send the screen projection content to the target device. A schematic sketch of this module split follows.
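  • The split between the processing module and the transceiver module can be rendered schematically as two cooperating components. This is only an illustrative sketch of the structure shown in FIG. 21, using hypothetical method names; as noted above, the actual modules may be realized as a hardware structure, a software module, or a combination of both.

```kotlin
// Hypothetical rendering of the module structure of device 2100; names are illustrative only.
interface TransceiverModule {
    fun sendProjectionContent(targetDevice: String, content: ByteArray)
}

class ProcessingModule(private val transceiver: TransceiverModule) {
    // Detects a user operation (for example, folding the screen) and, in response,
    // triggers the transceiver module to send the projection content to the target device.
    fun onUserOperation(targetDevice: String, content: ByteArray) {
        transceiver.sendProjectionContent(targetDevice, content)
    }
}

fun main() {
    val transceiver2102 = object : TransceiverModule {
        override fun sendProjectionContent(targetDevice: String, content: ByteArray) =
            println("Transceiver module: ${content.size} bytes -> $targetDevice")
    }
    val processing2101 = ProcessingModule(transceiver2102)
    processing2101.onUserOperation("Living-room TV", "projection-frame".toByteArray())
}
```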
  • FIG. 22 shows a device 2200 provided in this application.
  • the device 2200 includes at least one processor 2210, a memory 2220, and a transceiver 2230.
  • the processor 2210 is coupled with the memory 2220 and the transceiver 2230.
  • The coupling in the embodiments of the present application is an indirect coupling or a communication connection between devices, units, or modules, which may be in an electrical, mechanical, or other form, and is used for information exchange between the devices, units, or modules.
  • the embodiment of the present application does not limit the connection medium between the transceiver 2230, the processor 2210, and the memory 2220.
  • the memory 2220, the processor 2210, and the transceiver 2230 may be connected by a bus, and the bus may be divided into an address bus, a data bus, and a control bus.
  • the memory 2220 is used to store program instructions.
  • The transceiver 2230 is used to send the screen projection content, control instructions, and the like to the target device.
  • the processor 2210 is configured to call program instructions stored in the memory 2220, so that the device 2200 executes the screen projection method shown in FIG. 8, FIG. 16, or FIG. 18.
  • The processor 2210 may be a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field programmable gate array or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • the general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed and completed by a hardware processor, or executed and completed by a combination of hardware and software modules in the processor.
  • The memory 2220 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or may be a volatile memory, for example, a random-access memory (RAM).
  • The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory in the embodiments of the present application may also be a circuit or any other device capable of realizing a storage function, for storing program instructions and/or data.
  • The device 2100 and the device 2200 may be used to implement the method shown in FIG. 8, FIG. 16, or FIG. 18 in the embodiments of the present application.
  • the embodiments of the present application can be implemented by hardware, firmware, or a combination of them.
  • the above functions can be stored in a computer-readable medium or transmitted as one or more instructions or codes on the computer-readable medium.
  • the computer-readable medium includes a computer storage medium and a communication medium, where the communication medium includes any medium that facilitates the transfer of a computer program from one place to another.
  • the storage medium may be any available medium that can be accessed by a computer.
  • Computer-readable media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other media that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • Any connection may properly be termed a computer-readable medium.
  • Disks and discs include compact discs (CDs), laser discs, optical discs, digital video discs (DVDs), floppy disks, and Blu-ray discs. Disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the protection scope of computer-readable media.

Abstract

A screen projection method and an electronic device are provided, which relate to the technical field of terminals, are applied to a device having a foldable screen, and can implement intelligent screen projection. The method is applied to a first electronic device that comprises an inner screen and an outer screen, the inner screen being a foldable screen. The method comprises: after receiving an operation of folding the inner screen from an unfolded state to a closed state, the first electronic device locks the inner screen in response to the operation of folding the inner screen from the unfolded state to the closed state, acquires screen projection content of a target application program among at least one currently running application program, and then sends the screen projection content to a second electronic device. The described technical solution enables an electronic device to be triggered to perform screen projection by means of a single user operation, which simplifies the screen projection operation and improves the efficiency of screen projection.
PCT/CN2020/106096 2019-07-31 2020-07-31 Procédé de projection d'écran et dispositif électronique WO2021018274A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910704758.6A CN112394891B (zh) 2019-07-31 2019-07-31 一种投屏方法及电子设备
CN201910704758.6 2019-07-31

Publications (1)

Publication Number Publication Date
WO2021018274A1 true WO2021018274A1 (fr) 2021-02-04

Family

ID=74230363

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/106096 WO2021018274A1 (fr) 2019-07-31 2020-07-31 Procédé de projection d'écran et dispositif électronique

Country Status (2)

Country Link
CN (2) CN112394891B (fr)
WO (1) WO2021018274A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114089940A (zh) * 2021-11-18 2022-02-25 佛吉亚歌乐电子(丰城)有限公司 一种投屏方法、装置、设备及存储介质
CN114428599A (zh) * 2022-01-30 2022-05-03 深圳创维-Rgb电子有限公司 投屏亮度控制方法、装置、存储介质及投屏器
CN115964011A (zh) * 2023-03-16 2023-04-14 深圳市湘凡科技有限公司 基于多屏协同的显示应用界面的方法及相关装置
CN116048350A (zh) * 2022-07-08 2023-05-02 荣耀终端有限公司 一种截屏方法及电子设备

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115131547A (zh) * 2021-03-25 2022-09-30 华为技术有限公司 Vr/ar设备截取图像的方法、装置及系统
CN113259757A (zh) * 2021-04-08 2021-08-13 读书郎教育科技有限公司 一种便捷兼容多应用进行视频投屏的方法
CN113138737B (zh) * 2021-04-16 2023-11-03 阿波罗智联(北京)科技有限公司 投屏场景的显示控制方法、装置、设备、介质及程序产品
CN113268211B (zh) * 2021-05-13 2023-05-12 维沃移动通信(杭州)有限公司 图像获取方法、装置、电子设备及存储介质
CN115373558A (zh) * 2021-05-18 2022-11-22 广州视源电子科技股份有限公司 投屏方法、装置、设备及存储介质
WO2023036082A1 (fr) * 2021-09-09 2023-03-16 华为技术有限公司 Système et procédé d'affichage et de commande d'une tâche de dispositif distant
CN114063951B (zh) * 2021-09-26 2022-12-02 荣耀终端有限公司 投屏异常处理方法及电子设备
CN114786058B (zh) * 2022-04-27 2024-02-06 南京欧珀软件科技有限公司 多媒体数据展示方法、装置、终端及存储介质
CN117850644A (zh) * 2022-09-30 2024-04-09 华为技术有限公司 切换窗口的方法和电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140359493A1 (en) * 2013-05-30 2014-12-04 Samsung Electronics Co., Ltd. Method, storage medium, and electronic device for mirroring screen data
CN107659712A (zh) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 一种投屏的方法、装置及存储介质
CN108713185A (zh) * 2016-03-02 2018-10-26 三星电子株式会社 电子装置及其显示和发送图像的方法
CN109992231A (zh) * 2019-03-28 2019-07-09 维沃移动通信有限公司 投屏方法及终端
CN110058828A (zh) * 2019-04-01 2019-07-26 Oppo广东移动通信有限公司 应用程序显示方法、装置、电子设备及存储介质
CN110308885A (zh) * 2019-06-25 2019-10-08 维沃移动通信有限公司 一种投屏方法及移动终端

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9805198D0 (en) * 1998-03-11 1998-05-06 Maddock Alan Portable visual display device
JP2003101909A (ja) * 2001-09-25 2003-04-04 Matsushita Electric Ind Co Ltd 携帯型電子装置及び画像表示装置
CN103369070A (zh) * 2012-04-04 2013-10-23 朱洪来 三屏翻盖智能手机
US20140372896A1 (en) * 2013-06-14 2014-12-18 Microsoft Corporation User-defined shortcuts for actions above the lock screen
CN103399643A (zh) * 2013-08-23 2013-11-20 深圳市金立通信设备有限公司 一种柔性终端的应用程序启动方法及柔性终端
CN107589973A (zh) * 2017-08-29 2018-01-16 珠海格力电器股份有限公司 一种启动应用的方法、装置及电子设备
CN109871147B (zh) * 2019-02-22 2020-12-01 华为技术有限公司 一种触摸屏的响应方法及电子设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140359493A1 (en) * 2013-05-30 2014-12-04 Samsung Electronics Co., Ltd. Method, storage medium, and electronic device for mirroring screen data
CN108713185A (zh) * 2016-03-02 2018-10-26 三星电子株式会社 电子装置及其显示和发送图像的方法
CN107659712A (zh) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 一种投屏的方法、装置及存储介质
CN109992231A (zh) * 2019-03-28 2019-07-09 维沃移动通信有限公司 投屏方法及终端
CN110058828A (zh) * 2019-04-01 2019-07-26 Oppo广东移动通信有限公司 应用程序显示方法、装置、电子设备及存储介质
CN110308885A (zh) * 2019-06-25 2019-10-08 维沃移动通信有限公司 一种投屏方法及移动终端

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114089940A (zh) * 2021-11-18 2022-02-25 佛吉亚歌乐电子(丰城)有限公司 一种投屏方法、装置、设备及存储介质
CN114089940B (zh) * 2021-11-18 2023-11-17 佛吉亚歌乐电子(丰城)有限公司 一种投屏方法、装置、设备及存储介质
CN114428599A (zh) * 2022-01-30 2022-05-03 深圳创维-Rgb电子有限公司 投屏亮度控制方法、装置、存储介质及投屏器
CN116048350A (zh) * 2022-07-08 2023-05-02 荣耀终端有限公司 一种截屏方法及电子设备
CN116048350B (zh) * 2022-07-08 2023-09-08 荣耀终端有限公司 一种截屏方法及电子设备
CN115964011A (zh) * 2023-03-16 2023-04-14 深圳市湘凡科技有限公司 基于多屏协同的显示应用界面的方法及相关装置
CN115964011B (zh) * 2023-03-16 2023-06-06 深圳市湘凡科技有限公司 基于多屏协同的显示应用界面的方法及相关装置

Also Published As

Publication number Publication date
CN112394891A (zh) 2021-02-23
CN116185324A (zh) 2023-05-30
CN112394891B (zh) 2023-02-03

Similar Documents

Publication Publication Date Title
WO2021018274A1 (fr) Procédé de projection d'écran et dispositif électronique
KR101757870B1 (ko) 이동 단말기 및 그 제어방법
KR101668138B1 (ko) 자동으로 동작 모드를 결정하는 이동 장치
US11635873B2 (en) Information display method, graphical user interface, and terminal for displaying media interface information in a floating window
JP6385459B2 (ja) オーディオの再生のための制御方法及び装置
US11747953B2 (en) Display method and electronic device
US11435975B2 (en) Preview display method based on multi-angle and communication system
KR101901720B1 (ko) 더미 장치와의 연동 방법 및 그 전자 장치
WO2022078061A1 (fr) Procédé et appareil de communication vidéo, dispositif électronique et support de stockage lisible par ordinateur
KR20120062136A (ko) 이동 단말기 및 그 제어방법
CN104838352A (zh) 在多表面装置中的动作初始化
JP2017530493A (ja) 外付け機器の接続方法および装置、プログラム及び記録媒体
US11051147B2 (en) Electronic apparatus and method of outputting content by the electronic apparatus
WO2016015403A1 (fr) Procédé et appareil pour accéder à un réseau wi-fi
WO2021036659A1 (fr) Procédé d'enregistrement vidéo et appareil électronique
KR20150130188A (ko) 지문 인식을 이용한 휴대 단말장치의 제어 방법 및 그 휴대 단말 장치
US20230138804A1 (en) Enhanced video call method and system, and electronic device
EP4325338A1 (fr) Procédé de commande d'affichage, dispositif électronique, et support de stockage informatique
WO2020238448A1 (fr) Procédé de gestion d'autorisation et terminal
KR20130001826A (ko) 이동 단말기 및 그 제어방법
CN110263525B (zh) 设备配置方法及装置
CN114065706A (zh) 一种多设备数据协作的方法及电子设备
WO2021083313A1 (fr) Procédé de déverrouillage et dispositif électronique
WO2024002137A1 (fr) Procédé de communication, système de communication et dispositif électronique
JP2024517756A (ja) 表示制御方法、電子デバイス、およびコンピュータ記憶媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20848279

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20848279

Country of ref document: EP

Kind code of ref document: A1