CN112394891B - Screen projection method and electronic equipment - Google Patents

Screen projection method and electronic equipment


Publication number
CN112394891B
CN112394891B (application CN201910704758.6A)
Authority
CN
China
Prior art keywords
screen
electronic device
user
identifier
application program
Prior art date
Legal status
Active
Application number
CN201910704758.6A
Other languages
Chinese (zh)
Other versions
CN112394891A
Inventor
周星辰
范振华
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910704758.6A
Priority to CN202310208263.0A (published as CN116185324A)
Priority to PCT/CN2020/106096 (published as WO2021018274A1)
Publication of CN112394891A
Application granted
Publication of CN112394891B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 — Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 — Arrangements for executing specific programs
    • G06F 9/451 — Execution arrangements for user interfaces
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M 1/00 — Substation equipment, e.g. for use by subscribers
    • H04M 1/72 — Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 — User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 — User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 — User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

A screen projection method and an electronic device relate to the field of terminal technologies, can be applied to foldable-screen devices, and help achieve intelligent screen projection. The method is applied to a first electronic device that includes an inner screen and an outer screen, the inner screen being a foldable screen, and includes the following steps: after receiving an operation of folding the inner screen from an unfolded state to a closed state, the first electronic device, in response to that operation, locks the inner screen, obtains screen projection content from a target application among at least one currently running application, and then sends the screen projection content to a second electronic device. This technical solution allows a user to start screen projection with a single operation on the electronic device, which helps simplify the screen projection procedure and improve its efficiency.

Description

Screen projection method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a screen projection method and an electronic device.
Background
Screen projection technology projects the content on an electronic device A to an electronic device B by means of a wireless communication technology, so that the electronic device B can display the content of the electronic device A. For example, through screen projection, content on an electronic device with a smaller display screen (e.g., a mobile phone or a tablet computer) can be projected onto an electronic device with a larger display screen (e.g., a television or a projector), so that the user can watch the content of the smaller-screen device on the larger screen for a better viewing effect.
As screen projection technology continues to gain popularity, research on intelligent screen projection has important practical value.
Disclosure of Invention
The embodiments of the present application provide a screen projection method and an electronic device, which help reduce the complexity of the operation that triggers the electronic device to initiate screen projection and improve the efficiency of triggering the source device to initiate screen projection.
In a first aspect, a screen projection method according to an embodiment of the present application is applied to a first electronic device, where the first electronic device includes a first display screen, and the method includes:
the first electronic device receives a first operation;
in response to the first operation, the first electronic device turns off the first display screen and obtains screen projection content from a target application among at least one currently running application;
the first electronic device sends the screen projection content to the second electronic device.
In this embodiment, the first electronic device can initiate screen projection in response to the first operation, so the user needs to operate the first electronic device only once to initiate screen projection. This reduces the complexity of the operation that triggers the electronic device to initiate screen projection and improves the efficiency of triggering the first electronic device to do so.
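As a sketch of this first-aspect flow, the illustrative fragment below models a single first operation that both turns off the display and starts projection. All class, method, and field names are invented for illustration; they are not taken from the patent or from any real device API.

```python
# Hypothetical model of the first aspect: one "first operation" turns off the
# first display screen, obtains content from a target application among the
# currently running applications, and sends it to the second device.

class FirstElectronicDevice:
    def __init__(self, running_apps, sender):
        self.running_apps = running_apps  # currently running applications
        self.display_on = True            # state of the first display screen
        self.sender = sender              # callable that transmits to device 2

    def on_first_operation(self, target_app_name):
        """Handle the first operation: screen off, then project."""
        self.display_on = False           # turn off the first display screen
        app = next(a for a in self.running_apps
                   if a["name"] == target_app_name)
        content = app["content"]          # screen projection content
        self.sender(content)              # send to the second electronic device
        return content


received = []
device = FirstElectronicDevice(
    running_apps=[{"name": "video", "content": "frame-stream"}],
    sender=received.append,
)
device.on_first_operation("video")
assert device.display_on is False
assert received == ["frame-stream"]
```

The single entry point mirrors the claim's point: one user operation triggers both the screen-off and the projection, with no separate casting menu.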
In one possible design, the first operation is a screen-off operation, for example, pressing the power key or closing the cover of a smart protective case. For another example, the first electronic device is a flip-type device, and the first operation is closing its cover. For another example, if the first display screen is a foldable display screen, the first operation may be folding the first display screen from an unfolded state to a closed state; for an outward-folding display screen, the first electronic device may, in response to this folding operation, turn off a partial area of the first display screen or the entire first display screen. For another example, if the first display screen is a retractable display screen, the first operation may be retracting the first display screen; similarly to the outward-folding case, the first electronic device may turn off a partial area of the first display screen (i.e., the retracted portion) or the entire first display screen in response to the retracting operation.
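The variety of first operations above can be sketched as a simple event classifier. The event and form-factor names below are assumptions made purely for illustration; a real device would receive such events from its platform's sensor or hinge framework.

```python
# Illustrative mapping from form-factor-specific events to the "first
# operation" that triggers screen-off plus projection.

def is_first_operation(event, form_factor):
    """Return True if `event` should trigger screen-off and projection."""
    triggers = {
        "bar":      {"power_key", "smart_cover_closed"},
        "flip":     {"flip_closed"},            # flip cover closed
        "foldable": {"folded_to_closed"},       # inner screen folded shut
        "rollable": {"display_retracted"},      # retractable screen pulled in
    }
    return event in triggers.get(form_factor, set())


assert is_first_operation("folded_to_closed", "foldable")
assert not is_first_operation("power_key", "flip")
```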
In one possible design, after the first display screen is turned off in response to the first operation and before screen projection content is acquired from a target application program in the currently running at least one application program, the method further includes:
determining that the intelligent screen projection function is enabled; and/or,
prompting the user whether to allow screen projection, and receiving an operation by which the user allows screen projection.
This technical solution improves interaction with the user.
In another possible design, before the first electronic device sends the screen-shot content to the second electronic device, the method further includes:
assessing the environment of the first electronic device to identify whether the user intends to project a screen. For example, the first electronic device may determine whether a display screen is occluded: when the first display screen is a foldable inner screen and the device also includes an outer screen, the first electronic device may determine whether the outer screen is occluded; when the device includes only the first display screen, it may determine whether the first display screen is occluded. When the display screen of the first electronic device is not occluded, the first electronic device sends the screen projection content to the second electronic device. Checking for occlusion helps identify whether the user has placed the first electronic device in a bag or pocket; generally, when the device is in a bag or pocket, it is not in use and screen projection is probably not needed. Furthermore, the first electronic device can learn the user's habits from the device's usage history, and decide accordingly whether to send the screen projection content to the second electronic device.
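The occlusion check described above might be sketched as follows. The function and parameter names are hypothetical; a real device would read a proximity or ambient-light sensor rather than receive booleans directly.

```python
# Hypothetical decision: project only when the relevant screen is not
# occluded (device likely not in a bag or pocket) and, optionally, when the
# user's learned habits do not argue against projecting.

def should_project(screen_occluded, habit_allows=True):
    if screen_occluded:
        # Occluded screen suggests the device is stowed away and unused.
        return False
    return habit_allows


assert should_project(False)                      # unoccluded: project
assert not should_project(True)                   # in a pocket: do not project
assert not should_project(False, habit_allows=False)
```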
In one possible design, after the first electronic device sends the screen projection content to the second electronic device, a control interface is displayed on a second display screen, the control interface being used for quick control of the target application. The second display screen may be part or all of the display area of the first display screen, or a display screen different from the first display screen. This makes it convenient for the user to control the screen projection content.
In one possible design, the first electronic device determines a control interface corresponding to the type of the target application program from preset control interfaces corresponding to the types of the application programs, and displays the control interface corresponding to the type of the target application program on the second display screen. Helping to simplify the implementation.
In one possible design, the first electronic device identifies a virtual button with a touch function in the target application, and displays the control interface for controlling the target application on the second display screen according to the virtual button with the touch function. The reliability of the displayed control interface is improved.
In a possible design, when the first electronic device does not recognize a virtual button with a touch function in the target application, the control interface corresponding to the type of the target application is determined from preset control interfaces corresponding to the types of the applications, and the control interface corresponding to the type of the target application is displayed on the external screen. Helping to simplify the implementation.
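The fallback between a recognized in-app control interface and a preset per-type interface, described in the designs above, can be sketched as below. The preset table entries are invented examples, not taken from the patent.

```python
# Illustrative preset control interfaces keyed by application type; in the
# design above these would be configured at the factory or fetched from a
# server.
PRESET_INTERFACES = {
    "video": ["play/pause", "previous", "next", "progress"],
    "music": ["play/pause", "previous", "next"],
}

def control_interface(recognized_buttons, app_type):
    """Prefer buttons recognized in the app; otherwise fall back to preset."""
    if recognized_buttons:
        return recognized_buttons
    return PRESET_INTERFACES.get(app_type, [])


# Recognition failed: use the preset interface for the app's type.
assert control_interface([], "music") == ["play/pause", "previous", "next"]
# Recognition succeeded: the recognized buttons win.
assert control_interface(["like", "pause"], "video") == ["like", "pause"]
```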
In one possible design, the determining, by the first electronic device, a target application from at least one currently running application includes:
the first electronic device displays, on the second display screen, an identifier of at least one application that supports the screen projection function among the at least one currently running application; and after receiving an operation in which the user selects an identifier of an application displayed on the second display screen, determines, in response to that operation, that the target application is the application identified by the selected identifier. This helps improve interaction between the device and the user.
In one possible design, after the first electronic device turns off the first display screen and before the first electronic device sends the screen shot content to the second electronic device, the method further includes:
the first electronic equipment acquires an identifier of at least one piece of electronic equipment;
the first electronic device determines the identification of a target electronic device from the identifications of the at least one electronic device, wherein the identification of the target electronic device is used for identifying the second electronic device.
In one possible design, the first electronic device determines an identification of a target electronic device from the identifications of the at least one electronic device, including:
determining, by the first electronic device, from the identifiers of the at least one electronic device, an identifier that identifies a private electronic device as the identifier of the target electronic device. This helps reduce user operations.
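A minimal sketch of preferring a private device follows, under the assumption that each discovered identifier carries a private/common flag; this data model is illustrative, not the patent's. When no private device is found, the companion design above falls back to letting the user choose among the displayed identifiers.

```python
# Hypothetical target-device selection: prefer a device marked as private
# (e.g. the user's own TV) over common (shared) devices.

def pick_target(devices):
    for d in devices:
        if d.get("private"):
            return d["id"]
    return None  # no private device discovered: prompt the user instead


devices = [{"id": "office-projector", "private": False},
           {"id": "living-room-tv", "private": True}]
assert pick_target(devices) == "living-room-tv"
assert pick_target([{"id": "office-projector", "private": False}]) is None
```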
In one possible design, the first electronic device determining an identity of a target electronic device from the identities of the at least one electronic device includes:
the first electronic equipment displays the identification of at least one piece of electronic equipment on a second display screen; and after receiving an operation of selecting the identifier of the electronic equipment displayed on the second display screen by the user, responding to the operation, and taking the identifier of the electronic equipment selected by the user as the identifier of the target electronic equipment.
In one possible design, each of the identities of the at least one electronic device is used to identify a common electronic device.
In one possible design, after the first electronic device sends the screen-shot content to a second electronic device, the method further comprises:
receiving, by the first electronic device, a second operation, and stopping screen projection in response to the second operation. For example, after stopping screen projection, the first electronic device presents the user interface where the screen projection content is located on the first display screen. The second operation may be an unlocking operation; alternatively, when the first display screen is a foldable display screen, the second operation may be an operation of unfolding the first display screen from the closed state to the unfolded state; or, when the first display screen is a retractable display screen, the second operation may be an operation of extending the first display screen, or the like.
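The second-operation handling can be sketched as follows. The names are illustrative assumptions; a real implementation would hook the platform's unlock or unfold events rather than call a method directly.

```python
# Hypothetical session object: the second operation (unlock, unfold, extend)
# stops projection and restores the projected content's UI on the device.

class ProjectionSession:
    def __init__(self):
        self.active = True              # currently projecting
        self.local_ui_restored = False

    def on_second_operation(self):
        self.active = False             # stop sending screen content
        self.local_ui_restored = True   # show the content locally again


s = ProjectionSession()
s.on_second_operation()
assert s.active is False
assert s.local_ui_restored is True
```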
In a second aspect, an embodiment of the present application further provides a method for screen projection control, which is applied to a first electronic device, where the first electronic device includes a first application program, and the method includes:
the first electronic equipment acquires screen projection content from a first application program;
the first electronic equipment sends the screen projection content to second electronic equipment;
after the first electronic device successfully casts a screen, determining a control interface corresponding to the type of the first application program from preset control interfaces corresponding to the types of the application programs;
the first electronic device displays the determined control interface on a display screen, the control interface including virtual buttons with a touch function. After receiving an operation on a virtual button of the control interface, the first electronic device responds to the operation to control the screen projection content presented on the second electronic device.
According to the embodiment of the application, the control interface corresponding to the type of the application program is preset, so that the method for displaying the control interface is facilitated to be simplified.
In one possible design, the control interfaces corresponding to the types of applications may be configured in the first electronic device before it leaves the factory; or the first electronic device obtains, from a server, the control interfaces corresponding to the types of the applications installed in it.
In a third aspect, an embodiment of the present application further provides another method for controlling screen projection, where the method is applied to a first electronic device, where the first electronic device includes a first application program, and the method includes:
the first electronic equipment acquires screen projection content from a first application program;
the first electronic equipment sends the screen projection content to second electronic equipment;
after the screen projection of the first electronic device is successful, a virtual button with a touch function is identified from the first application program, and a control interface is displayed on the display screen according to the identified virtual button.
For example, the icons of the virtual buttons included on the control interface may be obtained by the first electronic device by re-laying out, cropping, and/or scaling the virtual buttons recognized from the first application.
The control interface comprises virtual buttons with the same functions as the virtual buttons recognized by the first electronic equipment from the first application program.
It should be noted that, after receiving an operation on a virtual button on the control interface, the first electronic device responds to the operation to control the screen projection content presented on the second electronic device.
Through the technical scheme, the control interface displayed by the first electronic device is more accurate, and the user experience is improved.
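Forwarding presses on the re-laid-out control interface to the projected content might look like the following sketch; the class and its behavior are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical controller: presses on the locally displayed control interface
# are forwarded so the content presented on the second device reacts.

class CastController:
    def __init__(self, buttons):
        self.buttons = buttons          # recognized buttons, re-laid-out
        self.log = []                   # record of forwarded actions

    def press(self, name):
        if name not in self.buttons:
            raise ValueError(f"no such control: {name}")
        # Forward the action to the projection session (simulated here).
        self.log.append(("forwarded", name))


c = CastController({"play/pause", "next"})
c.press("next")
assert c.log == [("forwarded", "next")]
```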
In one possible design, the first electronic device identifies a virtual button with touch functionality from a first application, including:
the first electronic device identifies the virtual button with the touch function from the first application according to a record of the user's historical operations on the first application, a software development kit (SDK) interface provided by the first application, or the position coordinates, on the user interface, of virtual buttons predefined in the first application; or the first electronic device identifies the virtual button with the touch function by analyzing the first application. This helps simplify the implementation.
In one possible design, when the virtual button with the touch function is not recognized from the first application program, the first electronic device determines a control interface corresponding to the type of the first application program from preset control interfaces corresponding to the types of the application programs, and displays the determined control interface on the display screen. The method is beneficial to simplifying the implementation mode and simultaneously meets the user requirements.
In one possible design, the control interface further includes an identifier of at least one alternative application and an identifier of at least one alternative screen projection device. This allows the user to switch the projected application and/or the screen projection device as needed.
In one possible design, the control interface further includes a virtual button for stopping screen projection. This helps the user actively stop projection as needed and improves interaction between the device and the user.
In a fourth aspect, a chip provided in this embodiment of the present application is coupled to a memory in a device, so that the chip invokes program instructions stored in the memory when running to implement the above-mentioned aspects of the embodiments of the present application and any method that may be designed according to the aspects.
In a fifth aspect, a computer storage medium of the embodiments of the present application stores program instructions that, when executed on an electronic device, cause the device to perform the various aspects of the embodiments of the present application and any of the designed methods related to the various aspects.
In a sixth aspect, a computer program product according to embodiments of the present application, when run on an electronic device, causes the electronic device to perform a method that implements the above aspects of embodiments of the present application and any of the possible designs to which the aspects relate.
In addition, the technical effects brought by any one of the possible design manners in the fourth aspect to the sixth aspect may refer to the technical effects brought by different design manners in the related part of the method, and are not described herein again.
Drawings
FIG. 1 is a schematic view of a scenario applied in an embodiment of the present application;
fig. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
fig. 3A is a schematic diagram of a physical form of an electronic device according to an embodiment of the present application;
FIG. 3B is a diagram of another physical form of the electronic device according to the embodiment of the present application;
FIG. 4A is a diagram illustrating another physical form of an electronic device according to an embodiment of the disclosure;
FIG. 4B is a diagram of another physical form of the electronic device according to the embodiment of the present application;
FIG. 4C is a diagram of another physical form of the electronic device according to the embodiment of the disclosure;
FIG. 5A is a diagram illustrating another physical form of an electronic device according to an embodiment of the present application;
FIG. 5B is a diagram of another physical form of the electronic device according to the embodiment of the present application;
FIG. 5C is a diagram of another physical form of the electronic device according to the embodiment of the disclosure;
FIG. 6A is a diagram illustrating another physical form of an electronic device according to an embodiment of the disclosure;
FIG. 6B is a diagram of another physical form of the electronic device according to the embodiment of the present application;
fig. 7 is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic flowchart of a screen projection method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another scenario applied in the embodiment of the present application;
FIG. 10 is a schematic view of a user interface according to an embodiment of the present application;
FIG. 11A is a schematic view of another user interface according to an embodiment of the present application;
FIG. 11B is a schematic view of another user interface according to an embodiment of the present application;
FIG. 11C is a schematic view of another user interface according to an embodiment of the present application;
FIG. 11D is a schematic view of another user interface according to an embodiment of the present application;
FIG. 12A is a schematic view of another user interface according to an embodiment of the present application;
FIG. 12B is a schematic view of another user interface according to an embodiment of the present application;
FIG. 13 is a schematic view of another user interface of an embodiment of the present application;
FIG. 14A is a schematic view of a control interface according to an embodiment of the present disclosure;
FIG. 14B is a schematic view of another control interface according to an embodiment of the present application;
FIG. 14C is a schematic view of another control interface according to an embodiment of the present application;
FIG. 14D is a schematic view of another control interface according to an embodiment of the present application;
FIG. 14E is a schematic view of another control interface according to an embodiment of the present application;
FIG. 14F is a schematic view of another control interface according to an embodiment of the present application;
FIG. 14G is a schematic view of another control interface according to an embodiment of the present application;
FIG. 15 is a schematic view of another user interface of an embodiment of the present application;
FIG. 16 is a flowchart illustrating another screen projection method according to an embodiment of the present application;
fig. 17 is a schematic physical diagram of another electronic device according to an embodiment of the present application;
FIG. 18 is a flowchart illustrating another screen projection method according to an embodiment of the present application;
FIG. 19 is a schematic diagram of another electronic device according to an embodiment of the disclosure;
fig. 20 is a physical diagram of another electronic device according to an embodiment of the disclosure;
fig. 21 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
fig. 22 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
It is to be understood that in this application, "/" indicates an "or" relationship; for example, A/B may indicate either A or B. "And/or" describes an association between associated objects and indicates that three relationships are possible; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. "At least one" means one or more, and "a plurality" means two or more.
In this application, "exemplary," "in some embodiments," "in other embodiments," and the like are used to mean serving as an example, illustration, or description. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the term using examples is intended to present concepts in a concrete fashion.
Furthermore, the terms "first", "second", and the like are used herein for descriptive purposes only, and are not to be understood as indicating or implying relative importance, nor as implicitly indicating the number or order of the technical features concerned.
It should be understood that the embodiments of the present application involve at least two electronic devices: a source device and a target device. The source device, which may also be called a sending-end device, is the electronic device that initiates screen projection; illustratively, the source device sends the screen projection content. For example, the screen projection content may be, but is not limited to, video, audio, images, documents, or games. The target device, which may also be called a client device or peer device, is the electronic device that receives the screen projection content. For example, after receiving the content, the target device may present or display it in a corresponding layout. It should be noted that, in the embodiments of the present application, the layout of the screen projection content on the target device may be the same as or different from that on the source device.
In some embodiments, the source device may be a portable electronic device, such as a cell phone, a tablet, a laptop, or a wearable device (e.g., a smart watch). Exemplary embodiments of the above-described portable electronic device include, but are not limited to, portable electronic devices running any of several operating systems (named in an image in the original, placeholder BDA0002151752660000061) or other operating systems. The physical form of the portable electronic device is not limited in the embodiments of the present application; for example, it may be a foldable device, a tablet device, a flip device, or the like. It should also be noted that the portable electronic device in the embodiments of the present application may further be fitted with a smart protective case. In other embodiments of the present application, the source device may also be a kiosk, a desktop computer, or the like.
In some embodiments, the target device may be a tablet, an all-in-one machine, a desktop computer, a television, a display, a projector, a stereo, or the like, which can receive and then present or display the screen-projection content.
By way of example, fig. 1 illustrates an application scenario of an embodiment of the present application. As shown in fig. 1, the electronic device 10 is the source device and the electronic device 20 is the target device. The electronic device 10 may transmit the screen-projection content to the electronic device 20 so that the content can be presented or displayed by the electronic device 20 for better viewing. The electronic device 10 and the electronic device 20 may establish a connection in a wired manner (e.g., via a cable) and/or a wireless manner (e.g., wireless fidelity (Wi-Fi), Bluetooth, etc.). It should be noted that fig. 1 is only an example of an application scenario; the embodiments of the present application do not limit the number of target devices that receive the screen-projection content sent by the source device. Taking the electronic device 10 shown in fig. 1 as the source device, the electronic device 10 may send the screen-projection content to two or more electronic devices including the electronic device 20, or only to the electronic device 20.
However, because the operations currently required to trigger a source device to initiate screen projection are cumbersome and the user experience is poor, an embodiment of the present application provides a screen projection method in which the source device initiates screen projection in response to a first operation. With a single operation on the source device, the user can trigger screen projection, which reduces the complexity of the triggering operation and improves the efficiency with which the source device initiates screen projection. In some embodiments, the first operation may be used to control the electronic device to turn off its screen, in which case the first operation may be referred to as a screen-off operation. It should be noted that, in the embodiments of the present application, when the screen is turned off, the electronic device may show a blank screen without being locked, show a blank screen while locked, display a default user interface without being locked, display a default user interface while locked, or show a partially blank screen together with part of the default user interface. The default user interface may include date and time information, and/or icons of commonly used applications, and the like. The content of the default user interface may be set according to the user's needs, or may be set before the electronic device leaves the factory. When the electronic device is in a screen-locked state while the screen is off, the first operation may also be referred to as a screen-lock operation.
When the screen of the source device is turned off, the source device can actively initiate screen projection and send the screen-projection content to the target device for presentation or display, so that the user can continue to view the corresponding content on the target device, improving the user experience. In addition, in other embodiments of the present application, the first operation may be another operation; in particular, the implementation of the first operation may depend on the physical form of the electronic device.
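The single-operation trigger described above can be sketched as a minimal state machine: one "first operation" (here, a screen-off operation) both turns the screen off and initiates projection. All names below (SourceDevice, on_first_operation) are hypothetical illustrations, not APIs from this patent.

```python
# Minimal sketch: a single screen-off operation both turns the screen off
# and initiates screen projection toward the target device.

SCREEN_OFF = "screen_off"

class SourceDevice:
    def __init__(self):
        self.screen_on = True
        self.projection_started = False
        self.sent = []  # content handed to the target device

    def on_first_operation(self, operation, content):
        """One user operation triggers both screen-off and projection."""
        if operation == SCREEN_OFF:
            self.screen_on = False          # turn the screen off
            self.projection_started = True  # initiate projection in response
            self.sent.append(content)       # "send" content to the target
        return self.projection_started

device = SourceDevice()
started = device.on_first_operation(SCREEN_OFF, "video-frame-0")
```

Note that the user performs only one operation; the device derives both effects (screen off and projection) from it, which is the efficiency gain the embodiment claims.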
Source devices and target devices, and embodiments for using such devices, are described below.
The source device is taken as the electronic device 10 in the application scenario shown in fig. 1 as an example. For example, fig. 2 shows a hardware structure diagram of an electronic device 10 according to an embodiment of the present application. As shown in fig. 2, the electronic device 10 includes a processor 110, an internal memory 121, an external memory interface 122, a camera 131, a first display 141, a sensor module 150, an audio module 160, a speaker 161, a receiver 162, a microphone 163, an earphone interface 164, a button 170, a Subscriber Identification Module (SIM) card interface 171, a Universal Serial Bus (USB) interface 172, a charging management module 180, a power management module 181, a battery 182, a mobile communication module 191, and a wireless communication module 192. In other embodiments, the electronic device 10 further includes a second display screen 142. The first display screen 141 and the second display screen 142 may be located on different sides of the electronic device 10, for example, the first display screen 141 is located on a first side of the electronic device 10 (e.g., a front side of the electronic device 10), and the second display screen 142 is located on a second side of the electronic device 10 (e.g., a back side of the electronic device 10). In addition, the electronic device 10 in the embodiment of the present application may further include a motor, an indicator, a mechanical shaft, and the like.
It should be understood that the hardware configuration shown in fig. 2 is only one example and does not specifically limit the electronic device 10. In other embodiments of the present application, the source device may have more or fewer components than the electronic device 10 shown in the figure, two or more components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural Network Processor (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
In some embodiments, a cache may also be provided in the processor 110 for storing a portion of the programs and/or data. As an example, the cache in the processor 110 may be a cache memory, used to hold programs and/or data that have just been used, generated, or recycled by the processor 110. If the processor 110 needs to use such a program and/or data again, it can be called directly from the cache. This reduces the time for the processor 110 to acquire programs and/or data, thereby helping to improve the efficiency of the system.
The internal memory 121 may be used to store programs and/or data. It should be noted that, in the embodiments of the present application, programs may also be referred to as program instructions. In some embodiments, the internal memory 121 includes a program storage area and a data storage area. The program storage area may store an operating system (e.g., Android or iOS), a computer program required by at least one function (e.g., screen locking or screen projection), and the like. The data storage area may store data created and/or obtained during the use of the electronic device (such as an identifier of the target device or an image), and the like. The processor 110 may implement one or more functions by calling programs and/or data stored in the internal memory 121, causing the electronic device 10 to execute the corresponding method. For example, the processor 110 may call certain programs and/or data in the internal memory 121 so that the electronic device 10 executes the screen projection method provided in the embodiments of the present application, thereby improving the efficiency with which the source device initiates screen projection and improving the user experience. The internal memory 121 may be a high-speed random access memory, a nonvolatile memory, or the like. For example, the nonvolatile memory may include at least one of one or more magnetic disk storage devices, flash memory devices, and/or universal flash storage (UFS), among others.
The external memory interface 122 may be used to connect an external memory card (e.g., a Micro SD card) to extend the memory capability of the electronic device 10. The external memory card communicates with the processor 110 through the external memory interface 122 to implement a data storage function. For example, the electronic device 10 may save files such as images, music, videos, etc. in an external memory card through the external memory interface 122.
The camera 131 may be used to capture moving or still images. Typically, the camera 131 includes a lens and an image sensor. An optical image of an object is generated through the lens and projected onto the image sensor, which converts the optical signal into an electrical signal and transmits it to the ISP to be converted into a digital image signal. For example, the image sensor may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. It should be noted that, in the embodiments of the present application, the camera 131 may include one or more cameras.
The first display screen 141 may include a display panel for displaying a graphical user interface (GUI). For convenience of description, the graphical user interface is simply referred to as a user interface. By displaying a user interface on the first display screen 141, the electronic device 10 presents or displays corresponding content to the user, such as videos, text, images, and virtual buttons (or virtual keys) that enable the user to interact with the electronic device 10. In some embodiments, the display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a quantum dot light-emitting diode (QLED), or the like. For example, the electronic device 10 may implement the display function via the GPU, the first display screen 141, the application processor, and the like. It should be understood that the first display screen 141 in the embodiments of the present application may be a foldable screen or a non-foldable screen, which is not limited herein. It should be noted that, for a specific implementation of the second display screen 142, reference may be made to that of the first display screen 141, and details are not described herein again.
As an example, the electronic device 10 is a foldable electronic device including a first display screen 141 and a second display screen 142, where the first display screen 141 is a foldable screen and the second display screen 142 is a non-foldable screen. The first display screen 141 is located on a first side of the electronic device 10 and the second display screen 142 on a second side, the first side and the second side being different. For example, the first side may also be referred to as the front of the electronic device 10; as shown in fig. 3A, the first display screen 141 is located on the first side, and its included angle in fig. 3A is β. The second side may also be referred to as the back of the electronic device 10; as shown in fig. 3B, the second display screen 142 is located on the second side, and the included angle of the first display screen 141 in fig. 3B is α. It should be noted that the included angle of the first display screen 141 may range over [0°, 180°]. When the included angle is 0°, the first display screen 141 is in the folded (closed) state; when the included angle is 180°, it is in the unfolded state. The first display screen 141 may also be referred to as the inner screen or main screen, and the second display screen 142 as the outer screen or auxiliary screen.
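The fold-angle states just described can be sketched as a small mapping: the included angle lies in [0°, 180°], 0° means folded (closed), and 180° means fully unfolded. The intermediate "half-folded" label for other angles is an assumption for illustration; the patent only names the two endpoint states.

```python
# Sketch of the fold-angle states of the foldable first display: the included
# angle is constrained to [0, 180] degrees, with 0 = folded and 180 = unfolded.

def fold_state(angle_deg: float) -> str:
    if not 0 <= angle_deg <= 180:
        raise ValueError("included angle must be within [0, 180] degrees")
    if angle_deg == 0:
        return "folded"       # closed state
    if angle_deg == 180:
        return "unfolded"     # fully expanded state
    return "half-folded"      # intermediate angle, e.g. the angle β in fig. 5C
```

A fold from 180° down to 0° is exactly the "fold the inner screen from the expanded state to the closed state" operation used later as the projection trigger.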
As another example, the electronic device 10 is an electronic device with an unfoldable screen and includes a first display screen 141. Illustratively, the electronic device 10 includes a first side and a second side, and the first display screen 141 is located on the first side of the electronic device. For example, a first side of the electronic device 10 is shown in FIG. 4A and a second side of the electronic device 10 is shown in FIG. 4C. When the first side of the electronic device 10 is shown in FIG. 4A and the second side of the electronic device 10 is shown in FIG. 4C, the electronic device 10 includes only the first display 141. In other embodiments, the electronic device 10 may also include a second display screen 142. The first display screen 141 is located on a first side of the electronic device 10, and the second display screen 142 is located on a second side of the electronic device 10. For example, a first side of the electronic device 10 is shown in FIG. 4A. The second side of the electronic device 10 may be as shown in fig. 4B. It should be understood that when the first side of the electronic device 10 is shown in FIG. 4A and the second side of the electronic device 10 is shown in FIG. 4B, the first display screen 141 may be referred to as the primary screen and the second display screen 142 may be referred to as the secondary screen.
As another example, the electronic device 10 is an electronic device with a foldable screen and includes a first display screen 141, where the first display screen 141 is a foldable screen. The first display screen 141 in the unfolded state may be as shown in fig. 5A, and in the closed or folded state as shown in fig. 5B. Fig. 5C is a schematic diagram of the first display screen 141 folded at an included angle β.
As another example, the electronic device 10 is a screen-retractable electronic device, and includes a first display screen 141. In which the extended state of the first display 141 can be as shown in fig. 6A, and the retracted state of the first display 141 can be as shown in fig. 6B.
The sensor module 150 may include one or more sensors. For example, a touch sensor 150A, a pressure sensor 150B, a distance sensor 150C, etc. In other embodiments, the sensor module 150 may also include a gyroscope, an acceleration sensor, a fingerprint sensor, an ambient light sensor, a proximity light sensor, a bone conduction sensor, a temperature sensor, and the like.
The touch sensor 150A may also be referred to as a "touch panel". It may be disposed on the first display screen 141 and/or the second display screen 142. For example, when the touch sensor 150A is disposed on the first display screen 141, the touch sensor 150A and the first display screen 141 form a first touchscreen. The touch sensor 150A is used to detect a touch operation applied to or near it and may pass the detected touch operation to the application processor to determine the touch event type. The electronic device 10 may provide visual output related to the touch operation through the first display screen 141. In other embodiments, the touch sensor 150A may be disposed on the surface of the electronic device 10 at a position different from that of the first display screen 141.
The pressure sensor 150B is used to sense a pressure signal and can convert it into an electrical signal. For example, the pressure sensor 150B may be disposed on the first display screen 141 and/or the second display screen 142. Touch operations that act on the same touch position but with different intensities may correspond to different operation instructions.
The distance sensor 150C is used to measure distance. For example, in a shooting scenario, the electronic device 10 may use the distance sensor 150C to measure distance for fast focusing. For another example, after the first display screen 141 is locked, the distance sensor 150C may further be used to determine whether the first display screen 141 and/or the second display screen 142 are blocked. For example, when the first side of the electronic device 10 is as shown in fig. 3A and the second side as shown in fig. 3B, if the first display screen 141 is in the closed or folded state, the distance sensor 150C may be used to determine whether the second display screen 142 is blocked. For another example, when the first side is as shown in fig. 4A and the second side as shown in fig. 4B, the distance sensor 150C is used to determine whether the first display screen 141 and the second display screen 142 are blocked. For another example, when the first side is as shown in fig. 4A and the second side as shown in fig. 4C, the distance sensor 150C is used to determine whether the first display screen 141 is blocked.
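The three occlusion-check cases above amount to a mapping from the device's physical form to the set of displays the distance sensor should test. The sketch below encodes that mapping; the form names, and the choice to check the first display when the foldable form is unfolded, are assumptions for illustration rather than rules stated in the patent.

```python
# Sketch: which display(s) the distance sensor 150C should check for occlusion,
# depending on the device's physical form (the three cases in the text above).

def screens_to_check(form: str, inner_screen_folded: bool = False):
    """Return the displays whose occlusion state should be measured."""
    if form == "foldable_with_outer":          # figs. 3A/3B
        # when the inner (first) screen is closed, only the outer screen matters;
        # checking the first screen otherwise is an assumed default
        return ["second"] if inner_screen_folded else ["first"]
    if form == "dual_screen":                  # figs. 4A/4B: displays on both sides
        return ["first", "second"]
    if form == "single_screen":                # figs. 4A/4C: first display only
        return ["first"]
    raise ValueError("unknown device form")
```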
The electronic device 10 may implement audio functions via the audio module 160, speaker 161, microphone 163, headphone jack 164, and application processor, among others. Such as an audio play function, a recording function, a voice wake-up function, etc.
The audio module 160 may be used to perform digital-to-analog conversion, and/or analog-to-digital conversion on the audio data, and may also be used to encode and/or decode the audio data. For example, the audio module 160 may be disposed in the processor 110, or some functional modules of the audio module 160 may be disposed in the processor 110.
The speaker 161, also called a "horn", is used to convert audio data into sound and play it. For example, the electronic device 10 may play music, use the speakerphone, or issue a voice prompt via the speaker 161.
The receiver 162, also called an "earpiece", is used to convert audio data into sound and play it. For example, when the electronic device 10 answers a call, the call can be heard by placing the receiver 162 close to the ear.
The microphone 163, also referred to as a "mike" or a "mic", is used to collect sound (e.g., ambient sound, including sound made by people or by devices) and convert it into audio electrical data. When making a call or sending voice, the user may speak close to the microphone 163, which collects the sound. It should be noted that the electronic device may be provided with at least one microphone 163. For example, with two microphones 163, the electronic device can implement a noise-reduction function in addition to collecting sound; with three, four, or more microphones 163, it can further implement sound-source recognition, directional recording, and the like on top of sound collection and noise reduction.
The earphone interface 164 is used to connect wired earphones. The earphone interface 164 may be the USB interface 172, a 3.5 mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
Keys 170 may include a power key, a volume key, and the like. Keys 170 may be mechanical keys or may be virtual keys. The electronic device 10 may generate signal inputs related to user settings and function control of the electronic device 10 in response to operation of the keys. For example, the electronic device 10 may lock the first display screen 141 in response to a pressing operation of the power key, and trigger execution of the screen projection method according to the embodiment of the present application. It should be noted that, in the embodiment of the present application, the power key may also be referred to as a power-on key, a side key, and the like, and the name of the power key is not limited.
The SIM card interface 171 is for connecting a SIM card. The SIM card can be attached to and detached from the electronic device 10 by being inserted into the SIM card interface 171 or being pulled out from the SIM card interface 171. The electronic device 10 may support 1 or K SIM card interfaces 171, K being a positive integer greater than 1. The SIM card interface 171 may support a Nano SIM card, a Micro SIM card, and/or a SIM card, etc. The same SIM card interface 171 can be inserted with multiple SIM cards at the same time. The types of the plurality of SIM cards can be the same or different. The SIM card interface 171 may also be compatible with different types of SIM cards. The SIM card interface 171 may also be compatible with an external memory card. The electronic device 10 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the electronic device 10 may also employ an eSIM card, namely: an embedded SIM card. The eSIM card can be embedded in the electronic device 10 and cannot be separated from the electronic device 10.
The USB interface 172 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 172 may be used to connect a charger to charge the electronic device 10, or may be used to connect the electronic device 10 to a headset to play sound through the headset. When the USB interface 172 can connect a headset, it can be understood that: the USB interface 172 is used as a headset interface. For example, the USB interface 172 may be used to connect other electronic devices, such as an AR device, a computer, and the like, besides being used as a headset interface.
The charging management module 180 is configured to receive charging input from a charger. The charger may be a wireless or a wired charger. In some wired charging embodiments, the charging management module 180 may receive charging input from a wired charger via the USB interface 172. In some wireless charging embodiments, the charging management module 180 may receive wireless charging input through a wireless charging coil of the electronic device 10. While the charging management module 180 charges the battery 182, the electronic device 10 may also be powered via the power management module 181.
The power management module 181 is used to connect the battery 182, the charging management module 180, and the processor 110. The power management module 181 receives input from the battery 182 and/or the charging management module 180 and supplies power to the processor 110, the internal memory 121, the camera 131, the first display screen 141, and the like. The power management module 181 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery state of health (leakage, impedance). In some other embodiments, the power management module 181 may be disposed in the processor 110. In other embodiments, the power management module 181 and the charging management module 180 may be disposed in the same device.
The mobile communication module 191 may provide a solution including 2G/3G/4G/5G wireless communication, etc. applied to the electronic device 10. For example, the mobile communication module 191 may include a filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like.
The wireless communication module 192 may provide solutions for wireless communication applied to the electronic device 10, including WLAN (e.g., a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 192 may be one or more devices integrating at least one communication processing module. For example, the electronic device 10 may send the screen-projection content and/or a screen-projection instruction to the target device through the wireless communication module 192.
In some embodiments, antenna 1 of the electronic device 10 is coupled to the mobile communication module 191 and antenna 2 is coupled to the wireless communication module 192, so that the electronic device 10 can communicate with other devices. Specifically, the mobile communication module 191 may communicate with other devices through antenna 1, and the wireless communication module 192 may communicate with other devices through antenna 2.
It should be noted that, in the embodiments of the present application, for the hardware architecture of the target device, reference may be made to the related description of the hardware architecture of the electronic device 10 in fig. 2, and details are not described herein again.
For example, fig. 7 shows a schematic software architecture diagram of a source device and a target device according to an embodiment of the present application. As shown in fig. 7, the source device includes an input module 710A, a processing module 720A, and an output module 730A.
The input module 710A is configured to detect a user operation and report it to the processing module 720A. In the embodiments of the present application, the user operation may be a touch operation or a non-touch operation, for example, an operation of displaying a certain user interface on the first display screen 141 or the second display screen 142, an operation of folding the first display screen 141, an operation of pressing the power key, an operation of closing an outer cover of the device, an operation of closing a smart protective cover, and the like. Specifically, the input module 710A may detect the user operation through a mechanical rotating shaft, a touch sensor, a key, and the like, which is not limited thereto.
The processing module 720A is configured to receive the user operation reported by the input module 710A and identify its operation type. For example, when the operation type is a screen-off operation, the first display screen 141 is turned off, and after the first display screen 141 is turned off, screen projection is triggered. Illustratively, the processing module 720A includes an operation identification module 721A, a screen projection judgment module 722A, a content acquisition module 723A, a device acquisition module 724A, and the like. The operation identification module 721A is configured to identify the operation type of the user operation reported by the input module 710A. The screen projection judgment module 722A is configured to judge whether the smart screen projection function is turned on, or whether the first display screen 141 or the second display screen 142 is blocked. The content acquisition module 723A is used to acquire the screen-projection content. The device acquisition module 724A is configured to determine the target device that will receive the screen-projection content.
The output module 730A is configured to establish a connection with the target device and send the screen-projection content to it. The output module 730A is further configured to control the second display screen 142 or the first display screen 141 to display related information; for example, to display a control interface.
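The source-side flow through these modules (operation identification → projection judgment → content acquisition → device acquisition → output) can be sketched as a single pipeline function. The operation names, the rule for classifying a screen-off operation, and the "first listed target" selection policy are illustrative assumptions, not details fixed by the patent.

```python
# Illustrative pipeline for the source device's modules: identify the reported
# operation's type, check that the smart projection function is enabled, then
# acquire the content and a target device, and hand both to the output module.

def run_source_pipeline(operation, smart_projection_on, content, targets):
    # operation identification module (721A): classify the reported operation
    screen_off_ops = ("press_power_key", "fold_inner_screen", "close_smart_cover")
    if operation not in screen_off_ops:
        return None                       # not a screen-off operation: no projection
    # screen projection judgment module (722A): is smart projection turned on?
    if not smart_projection_on:
        return None
    # content acquisition (723A) + device acquisition (724A) modules
    target = targets[0] if targets else None  # assumed policy: first known target
    if target is None:
        return None
    # output module (730A): deliver the content to the chosen target device
    return {"target": target, "content": content}

result = run_source_pipeline("fold_inner_screen", True, "playing-video",
                             ["living-room-tv"])
```

Each early `return None` corresponds to one module in the chain declining to proceed, so projection happens only when every stage succeeds.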
Illustratively, as shown in fig. 7, the target device includes an input module 710B, a processing module 720B, and an output module 730B.
The input module 710B is configured to establish a connection with the source device and receive the screen-projection content or a screen-projection instruction sent by the source device. Illustratively, the input module 710B includes a device connection module 711B, a content interaction module 712B, and an instruction interaction module 713B. The device connection module 711B is configured to establish a connection with the source device. The content interaction module 712B is configured to receive the screen-projection content sent by the source device and pass it to the processing module 720B. The instruction interaction module 713B is configured to receive a screen-projection instruction sent by the source device, for example, an instruction to cancel screen projection.
The processing module 720B is configured to, after receiving the screen-projection content from the content interaction module 712B, re-lay out or clip the content and send the result to the output module 730B.
The output module 730B is configured to present or display the re-laid-out screen-projection content after receiving it from the processing module 720B.
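One concrete form the target-side "re-lay out or clip" step can take is scaling the received content to the target display while preserving its aspect ratio (letterboxing). The patent does not prescribe this exact policy; the sketch below is one plausible computation.

```python
# Sketch of an aspect-preserving (letterbox) fit the target device's processing
# module could apply before handing content to the output module for display.

def fit_content(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Scale (src_w, src_h) to fit inside (dst_w, dst_h), preserving aspect."""
    scale = min(dst_w / src_w, dst_h / src_h)  # never overflow either axis
    return round(src_w * scale), round(src_h * scale)

# e.g. a tall phone screen projected onto a 1920x1080 television
fitted = fit_content(1080, 2340, 1920, 1080)
```

This is also why, as noted earlier, the layout on the target device may differ from the layout on the source device: the two displays generally have different sizes and aspect ratios.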
The following embodiments may be implemented in an electronic device having the above-described hardware configuration and/or software configuration. The screen projection method according to the embodiment of the present application is specifically described below with reference to source end devices of different physical forms.
Example one:
The source device is an electronic device with a foldable screen. The source device includes an inner screen and an outer screen, where the inner screen is a foldable screen and the outer screen is a non-foldable screen. For example, the inner screen of the source device may be the first display screen 141 shown in fig. 3A or fig. 3B, and the outer screen may be the second display screen 142 shown in fig. 3B.
Illustratively, as shown in fig. 8, a schematic flow chart of a screen projection method according to an embodiment of the present application specifically includes the following steps.
Step 801: while the inner screen of the source device is in use, receive an operation of folding the inner screen from the expanded state to the closed state.
Here, "the inner screen of the source device is in use" can be understood as follows:
When the inner screen is in the expanded state and not locked, the source device displays a corresponding user interface on it, such as the home screen, the minus-one screen (-1 screen), or the user interface of an application. While the inner screen displays a user interface, the user can perform the corresponding operations as needed, and the source device responds by updating the display on the inner screen. That is, when a user interface is displayed while the inner screen of the source device is in the expanded state and not locked, the user can operate the source device and control what the inner screen displays, thereby meeting the user's needs.
For example, when the inner screen of the source device is in the unfolded state and unlocked and displays a desktop (the desktop may also be referred to as a home screen, and includes icons of one or more applications), and the source device detects that the user taps the icon of an application on the desktop (for example, the icon of iQiyi), the source device displays the user interface of iQiyi on the inner screen in response to the tap. For another example, when the inner screen is in the unfolded state and unlocked and displays the user interface of an application, the user may operate virtual buttons (or virtual keys) included in that user interface to implement corresponding control. For example, when the source device displays the user interface of iQiyi and detects an operation of the user on that interface (for example, tapping a virtual button that switches a video to full-screen display), the source device responds by displaying the corresponding video full-screen on the inner screen and playing it, so that the user can watch the video. For another example, when the inner screen displays the user interface of Baidu Maps and the source device detects that the user searches for a route on that interface, the source device displays the route search result on the inner screen in response, which helps the user reach the corresponding destination.
In general, the user interface displayed on the inner screen is the user interface of an application the source device is running in the foreground. The source device can run one or more applications in the foreground. In some embodiments, the source device may also run one or more applications in the background while running one or more applications in the foreground. For example, if the application running in the foreground is iQiyi, the user interface of iQiyi is displayed on the inner screen; while the inner screen displays the user interface of iQiyi, the source device may run other applications, such as Alipay and WeChat, in the background.
For example, while the inner screen of the source device is in use, the outer screen is locked and blank, or the outer screen is locked but displays a default user interface. The default user interface may include information such as the time and date, and the user may set the information displayed on the default user interface as needed.
Taking the source device as the electronic device 10 in the application scenario shown in fig. 1 as an example, the first display screen 141 is the inner screen of the electronic device 10 and the second display screen 142 is the outer screen. The processor 110 of the source device may determine whether an operation of folding the inner screen from the unfolded state to the closed state is received by detecting the change in the rotation angle of the mechanical rotating shaft of the first display screen 141. For example, when the mechanical rotating shaft rotates so that the included angle of the first display screen 141 changes from 180° to 0°, an event reporting this change is sent to the processor 110; after receiving the event, the processor 110 determines that an operation of folding the inner screen from the unfolded state to the closed state has been received. For another example, the processor 110 may determine whether such an operation is received by collecting data from other sensors that sense the angle change of the first display screen 141. It should be noted that the embodiment of the present application does not limit the specific manner in which the source device determines whether an operation of folding the inner screen from the unfolded state to the closed state is received.
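The angle-based fold detection described above can be sketched as follows. This is a minimal illustration only; the class and event names are assumptions for illustration, not part of the embodiment:

```python
UNFOLDED_ANGLE = 180  # hinge fully open
CLOSED_ANGLE = 0      # hinge fully closed

class FoldDetector:
    """Tracks the included angle of the inner screen's mechanical rotating
    shaft and reports a fold-to-closed event, mirroring the 180-degree to
    0-degree transition described in the embodiment."""

    def __init__(self):
        self.last_angle = UNFOLDED_ANGLE

    def on_angle_changed(self, angle):
        # True only when the screen goes from unfolded directly to closed.
        folded = self.last_angle == UNFOLDED_ANGLE and angle == CLOSED_ANGLE
        self.last_angle = angle
        return folded
```

On such a device, the sensor framework would report each angle change to this handler, and a True result would trigger the screen-off step described next.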
In step 802, the source device turns off the inner screen in response to receiving an operation of folding the inner screen from the unfolded state to the closed state.
It should be noted that, when the inner screen is folded into the closed state, it can no longer display a user interface to the user; therefore, to save power, the source device turns off the inner screen. Turning off the inner screen may be understood as the inner screen being black and locked, or black but not locked. In addition, when the inner screen is folded into the closed state, the user cannot normally operate the inner screen and thereby control the source device. Taking the source device as the electronic device 10 in the application scenario shown in fig. 1 as an example, where the first display screen 141 is the inner screen and the second display screen 142 is the outer screen, when the processor 110 of the electronic device 10 determines that an operation of folding the first display screen 141 from the unfolded state to the closed state is received, it classifies the operation as a screen-off operation and turns off the first display screen 141.
It should be noted that, in response to the operation of folding the inner screen from the unfolded state to the closed state, the source device may also automatically unlock the outer screen and automatically map the user interface displayed on the inner screen to the outer screen for display. For example, when the user interface of iQiyi is displayed on the inner screen, the source device unlocks the outer screen in response to the folding operation and automatically maps the iQiyi user interface to the outer screen. Alternatively, in response to the folding operation, the outer screen may remain blank and/or locked, or remain locked while displaying a default user interface, which is not limited herein.
In step 803, after responding to the operation of folding the inner screen from the unfolded state to the closed state, the source device determines a target application program from at least one currently running application program.
It should be noted that the at least one application currently running may include an application currently running in the foreground and/or the background. For example, the target application may be an application satisfying a first preset condition among at least one currently running application. The first preset condition may be set according to an actual situation, which is not limited herein.
For example, the target application is an application, among the at least one currently running application, whose identifier is in a white list. The white list includes identifiers of applications that support the screen projection function, and may be set by the user as needed, preset by the source device before delivery, or generated by the source device according to a preset policy. For example, if the preset policy specifies that audio/video, map, reading, and Instant Messaging (IM) applications support the screen projection function, the source device may generate the white list from the identifiers of its installed applications that match the types specified by the preset policy. For example, if iQiyi, WeChat, and Alipay are installed on the source device, the white list generated according to the preset policy includes the identifier of iQiyi and the identifier of WeChat. It should be understood that when the source device detects that a new application is installed, it may determine whether the application matches a type specified by the preset policy, and if so, add the identifier of the application to the white list. Alternatively, in other embodiments, the white list includes identifiers of applications that do not support the screen projection function; in this case, the target application may be an application, among the at least one currently running application, whose identifier is not in the white list.
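The policy-driven white list generation described above can be sketched as follows. This is an illustrative sketch; the type labels and function names are assumptions, not taken from the embodiment:

```python
# Application types that the preset policy designates as supporting
# the screen projection function (illustrative labels).
POLICY_TYPES = {"video", "map", "reading", "instant_messaging"}

def build_whitelist(installed_apps):
    """installed_apps: dict mapping an application identifier (e.g. a
    package name) to its type. Returns the identifiers of installed
    applications whose type matches the preset policy."""
    return {app_id for app_id, app_type in installed_apps.items()
            if app_type in POLICY_TYPES}

def on_app_installed(whitelist, app_id, app_type):
    # When a new application is installed, add its identifier to the
    # white list only if it matches a type specified by the policy.
    if app_type in POLICY_TYPES:
        whitelist.add(app_id)
    return whitelist
```

With iQiyi (video), WeChat (instant messaging), and Alipay (payment) installed, this sketch would put only the first two on the white list, matching the example in the text.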
In this embodiment, the identifier of an application program may be the package name of the application, the icon of the application, or a custom identifier, which is not limited herein.
For another example, the target application is a currently running application whose service is in progress. Specifically, for video applications such as iQiyi, Youku, and Tencent Video, a service in progress can be understood as a video playing; for music applications such as Xiami Music and NetEase Cloud Music, a service in progress can be understood as music playing; for map applications such as Baidu Maps and Amap, a service in progress can be understood as navigation or a search being underway; for game applications such as Honor of Kings and Tetris, a service in progress can be understood as a game being underway; for instant messaging applications such as WeChat and QQ, a service in progress can be understood as a chat, a voice call, a video call, or a file transfer being underway. The application whose service is in progress may be running in the foreground or in the background.
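The per-type "service in progress" conditions above can be expressed as a simple lookup. The state names below are illustrative assumptions; the embodiment does not specify them:

```python
# Per-type conditions under which a service counts as "in progress".
SERVICE_STATES = {
    "video": {"playing"},
    "music": {"playing"},
    "map": {"navigating", "searching"},
    "game": {"in_game"},
    "instant_messaging": {"chatting", "voice_call", "video_call",
                          "file_transfer"},
}

def service_in_progress(app_type, state):
    """Returns True when the given application state counts as an
    in-progress service for that application type."""
    return state in SERVICE_STATES.get(app_type, set())
```

A source device could apply this predicate to each running application, foreground or background, when selecting the target application.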
As another example, the target application is a currently running application whose identifier is in the white list and whose service is in progress, where the white list includes identifiers of applications that support the screen projection function.
As another example, the target application is an application currently running in the foreground.
In some embodiments, the source device does not project the screen when none of the at least one currently running application satisfies the first preset condition, or when no application is currently running.
In step 804, the source device obtains the screen-casting content from the target application program, and sends the screen-casting content to the target device.
For example, the source device may send the screen-casting content to the target device based on technologies such as Miracast, AirPlay, DLNA, or HiCast.
The source device and the target device may establish a connection before the operation of folding the inner screen from the unfolded state to the closed state is received, or after it is received. For example, after turning off the inner screen, the source device determines the target device for receiving the screen-projected content and then initiates a connection establishment procedure toward the target device. After establishing the connection, the source device sends the screen-projected content to the target device once it obtains the content from the target application.
For example, the source device may determine the target device that receives the screen-projected content in the following manner:
after the source device turns off the inner screen, it may obtain, based on communication technologies such as Bluetooth and/or Wi-Fi, the identifiers of surrounding electronic devices that support the screen projection function, and determine the target device identifier from the obtained identifiers. The electronic device identified by the target device identifier is the target device for receiving the screen-projected content. In some embodiments, to simplify the determination of the target device identifier, the target device identifier may be an identifier, among the identifiers of the surrounding electronic devices supporting the screen projection function, that satisfies a second preset condition. The second preset condition may be set according to actual needs, which is not limited herein.
For example, the target device identifier is an identifier, among the identifiers of the surrounding electronic devices supporting the screen projection function, that is in a trusted list. The trusted list includes the identifier of at least one electronic device, which may be added by the user as needed or may be the identifier of an electronic device that has previously connected to the source device. Further, the identifiers included in the trusted list may be identifiers of private electronic devices added by the user, or of private electronic devices that have previously connected to the source device. In the embodiment of the present application, a private electronic device refers to an electronic device in a non-public place, for example, a television at home or a desktop computer in a dormitory, whereas an electronic device in a public place may be, for example, a display in a conference room.
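The trusted-list selection above can be sketched in a few lines. The function name and tie-breaking rule (first trusted device found) are assumptions for illustration:

```python
def pick_target_device(discovered_ids, trusted_list):
    """discovered_ids: identifiers of surrounding devices supporting the
    screen projection function, in discovery order.
    trusted_list: identifiers the user added or of previously connected
    private devices. Returns the first trusted identifier, or None when
    no discovered device is in the trusted list."""
    for device_id in discovered_ids:
        if device_id in trusted_list:
            return device_id
    return None
```

A None result would lead to the fallback behaviour described next, where the source device checks whether its current location is a public place.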
In some embodiments, when none of the obtained identifiers of the surrounding electronic devices supporting the screen projection function is in the trusted list, the source device may obtain its own geographic location information, determine from it whether the current location is a public place, and, when the current location is not a public place, select one of the obtained identifiers as the target device identifier.
For example, when the geographic location of the source device is a public place, the source device further determines whether its outer screen is blocked; if the outer screen is not blocked, the identifiers of the surrounding electronic devices supporting the screen projection function are displayed on the outer screen, and the user can select one of them as the target device identifier as needed. Taking the source device as the electronic device 10 as an example, as shown in fig. 9, the identifiers of the surrounding electronic devices supporting the screen projection function obtained by the electronic device 10 include the identifier of the electronic device 20, the identifier of the electronic device 30, and the identifier of the electronic device 40. The electronic device 10 then displays these three identifiers on the outer screen, for example as shown in fig. 10. When the user selects the identifier of the electronic device 30, the electronic device 10 determines that the target device identifier is the identifier of the electronic device 30.
For example, in response to the operation of folding the inner screen from the unfolded state to the closed state, the electronic device 10 turns off the inner screen and the outer screen remains locked. In this scenario, in order to operate the identifiers of the electronic device 20, the electronic device 30, and the electronic device 40 displayed on the outer screen, the user first needs to unlock the outer screen of the electronic device 10; for example, the electronic device 10 may unlock the outer screen by recognizing the user's face or fingerprint. Alternatively, operating the displayed identifiers may not be restricted by the outer screen lock; that is, the electronic device 10 displays the identifiers of the electronic device 20, the electronic device 30, and the electronic device 40 while the outer screen is locked, and the user may operate on them without unlocking the outer screen.
In other embodiments of the present application, the electronic device 10 further displays, on the outer screen, a virtual button for cancelling the screen projection, and the user may click or touch this virtual button to make the electronic device 10 cancel the screen projection as needed. Alternatively, the electronic device 10 further displays a prompt message on the outer screen, and if the user does not select a screen-casting device within a preset time period, the screen projection is cancelled automatically. The preset time period may be, for example, 10 s or 15 s, and may be set according to user requirements. For example, when the screen projection is cancelled, the electronic device 10 may automatically blank the outer screen or display a default user interface.
It should be noted that, in the embodiment of the present application, the identifier of the electronic device may include an icon of the electronic device, a name of the electronic device, and the like, which is not limited herein.
In other embodiments, when the outer screen is occluded, for example when the source device is placed in a bag or the outer screen is laid against the desktop, the source device does not project the screen.
As another example, the source device may further determine the target device identifier from the obtained identifiers of the surrounding electronic devices supporting the screen projection function with reference to at least one of device capability information (for example, whether a display screen, a speaker, or touch control is included), device attribute information (for example, the resolution or sound effects of the device's display screen), and the current operating state of the device (for example, whether video or audio is being played, or whether the device is communicating with other devices). The device capability information, device attribute information, and current operating state may be obtained by the source device in the process of obtaining the identifiers of the electronic devices supporting the screen projection function. For example, the electronic device identified by the target device identifier may be an electronic device that is not currently playing a video, includes a display screen, and has a display screen resolution greater than a first threshold, where the first threshold can be set according to actual needs.
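The capability-based filtering in the example above can be sketched as follows; the field names and the threshold value are illustrative assumptions:

```python
FIRST_THRESHOLD = 1080  # hypothetical resolution threshold, settable as needed

def filter_candidates(devices, min_resolution=FIRST_THRESHOLD):
    """devices: list of dicts with 'id', 'has_display', 'resolution',
    and 'playing_video' keys, as gathered during discovery.
    Keeps devices that include a display screen, are not currently
    playing a video, and exceed the resolution threshold."""
    return [d["id"] for d in devices
            if d["has_display"]
            and not d["playing_video"]
            and d["resolution"] > min_resolution]
```

The surviving identifiers would then be ranked or offered to the user as candidate target device identifiers.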
In the embodiment of the application, the source device can discover connectable nearby electronic devices based on Bluetooth and/or Wi-Fi scanning. Taking Bluetooth as an example, when the source device obtains the identifiers of multiple electronic devices through Bluetooth scanning, it may determine one or more device identifiers among them, establish connections with the target devices identified by those identifiers, and send the screen-casting content to them. Specifically, the source device may determine one device identifier from the device identifiers of the target devices and send the screen-casting content to the target device identified by that identifier.
In order to improve the interaction between the user and the electronic device, for example, after the source device performs step 802, it determines whether the intelligent screen projection function is enabled; steps 803 and 804 are performed only when the intelligent screen projection function is enabled, and are not performed when it is disabled. The intelligent screen projection function is enabled or disabled by the source device in response to a user operation. For example, a virtual button for turning the intelligent screen projection function ON or OFF is provided on the system settings interface: when the user switches the virtual button from OFF to ON, the source device enables the function, and when the user switches it from ON to OFF, the source device disables it. The system settings interface may be the user interface 1100 shown in fig. 11A, which comprises a virtual button 1101 for turning the intelligent screen projection function on or off. As another example, this virtual button may be provided on another user interface, for example in a notification bar, a system toolbar, or the control bar of a pull-up or pull-down interface. For example, the pull-up interface may be displayed by the source device in response to the user sliding up on the outer screen or the inner screen, and the pull-down interface may be displayed in response to the user sliding down on the outer screen or the inner screen.
In other embodiments, after performing step 802 or after determining that the intelligent screen projection function is enabled, the source device prompts the user whether to project the screen. For example, as shown in fig. 11B, the source device may display a prompt box 1110 on the outer screen, where the prompt box 1110 includes prompt information, a confirmation option "yes", and a denial option "no". The prompt information is used to ask the user whether to project the screen, and may be, for example, "please confirm whether to project the screen?". The source device continues with steps 803 and 804 in response to the user selecting the confirmation option "yes", and does not continue with steps 803 and 804 in response to the user selecting the denial option "no". In addition, in some examples, after the source device displays the prompt box 1110 on the outer screen, if no user operation is detected within a preset time period, the source device may by default treat the user as agreeing to the screen projection, or as refusing it. The preset time period may be, for example, 10 s or 15 s, which is not limited. Where the default is that the user agrees, the source device proceeds with steps 803 and 804. It should also be noted that the prompt box 1110 is displayed by the source device while the outer screen is locked; to improve security, before operating on the prompt box 1110, the user needs to unlock the outer screen by fingerprint, password, or facial recognition. Further, after the user agrees to the screen projection, the source device may display the user interface shown in fig. 11C or fig. 11D on the outer screen, so that the user can cancel the screen projection at any time.
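The prompt-with-timeout behaviour above can be sketched as a small decision function. The names and the default value are assumptions; the embodiment allows either default (agree or refuse):

```python
PROMPT_TIMEOUT_S = 10  # e.g. 10 s or 15 s, settable per user requirements

def resolve_prompt(user_choice, elapsed_s, default_agree=True):
    """user_choice: 'yes', 'no', or None when the user has not responded.
    elapsed_s: seconds since the prompt box was displayed.
    Returns True (proceed with steps 803/804), False (do not proceed),
    or None (keep waiting for user input)."""
    if user_choice in ("yes", "no"):
        return user_choice == "yes"
    if elapsed_s >= PROMPT_TIMEOUT_S:
        # Timeout: fall back to the configured default behaviour.
        return default_agree
    return None
```

Either default could be wired to the embodiment's "default agree" or "default refuse" variants.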
It should be further noted that, in the embodiment of the present application, the order between determining whether the intelligent screen projection function is enabled and prompting the user whether to project the screen is not limited.
In some embodiments, when there is no connectable electronic device in the surroundings or nearby, that is, when the source device does not obtain the identifier of at least one surrounding electronic device supporting the screen projection function, the source device does not project the screen.
In another embodiment, as an alternative step to step 803 and step 804, after step 802, the method may further comprise:
step 805, after the source device turns off the inner screen in response to the operation of folding the inner screen from the unfolded state to the closed state, it determines whether the intelligent screen projection function is enabled; if so, step 806 is performed, and if not, the procedure ends.
In step 806, the source device determines whether the outer screen is blocked; if not, step 807 is performed, and if the outer screen is blocked, the procedure ends.
For example, the source device may determine whether the outer screen is blocked by means of a distance sensor, a camera, or the like. For example, when the source device is placed in a bag or a pocket, or is placed on a table with the outer screen facing downward, the source device may detect through the distance sensor or the camera that the outer screen is blocked. For another example, when the source device is placed on a desk with the outer screen facing upward, it may determine through the camera located on the same side as the outer screen that the outer screen is not blocked. It should be noted that the above is only an exemplary description; the source device may also determine whether the outer screen is blocked in other ways, for example by means of Artificial Intelligence (AI). The purpose of determining whether the outer screen is blocked is to detect whether the user intends to project the screen: if the source device is placed in a pocket or a bag, the source device considers that the user is no longer using the device and does not trigger screen projection. In addition, the embodiment of the present application may also learn the user's habits of using the device in combination with other parameters (such as time and place) collected by other sensors (such as a positioning sensor), so as to judge more accurately whether the user intends to project the screen and improve the reliability of the screen projection triggered by the source device.
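The sensor-based occlusion check above can be sketched as follows; the parameter names and the way the two sensor readings are combined are illustrative assumptions:

```python
def outer_screen_occluded(proximity_near, camera_sees_light=None):
    """proximity_near: True when the distance sensor reports an object
    close to the outer screen (e.g. device in a bag or pocket, or lying
    face-down on a table).
    camera_sees_light: optional reading from the camera on the same side
    as the outer screen; None when that reading is unavailable.
    Returns True when the outer screen is considered blocked."""
    if proximity_near:
        return True
    if camera_sees_light is not None:
        # A camera facing upward into open space sees light; a covered
        # camera does not.
        return not camera_sees_light
    return False
```

A True result here would end the procedure at step 806, on the reasoning that the user does not intend to project the screen.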
In some embodiments, the source device may prompt the user whether to project the screen, for example by displaying the prompt box 1110 shown in fig. 11B on the outer screen. After the user agrees to the screen projection, the source device obtains the identifier of at least one currently running application supporting the screen projection function and the identifier of at least one electronic device for receiving the screen-projected content, and may indicate the current progress on the outer screen. For example, while the source device is obtaining the identifier of a currently running application supporting the screen projection function, it may display this on the outer screen as shown in fig. 11C; while obtaining the identifier of an electronic device for receiving the screen-projected content, it may display this as shown in fig. 11D. After the source device has obtained the identifier of at least one currently running application supporting the screen projection function and the identifier of at least one electronic device for receiving the screen-projected content, step 807 is performed.
Step 807, the source device displays, on the outer screen, the identifier of at least one currently running application supporting the screen projection function, and displays, on the outer screen, the identifier of at least one electronic device for receiving the screen-projected content.
The identifier of the at least one currently running application supporting the screen projection function may be an identifier, among the currently running applications, that is in a white list, where the white list includes the identifier of at least one application supporting the screen projection function. The identifier of the at least one electronic device for receiving the screen-projected content can be obtained by the source device based on communication technologies such as Bluetooth and/or Wi-Fi.
For example, the identifiers of the currently running applications supporting the screen projection function include the identifier of iQiyi, the identifier of Kugou Music, and the identifier of TikTok, and the identifiers of the electronic devices for receiving the screen-projected content are the identifiers of the electronic device 20, the electronic device 30, and the electronic device 40. The source device then displays the identifiers of iQiyi, Kugou Music, and TikTok on the outer screen, and displays the identifiers of the electronic device 20, the electronic device 30, and the electronic device 40 on the outer screen. The application identifiers and the device identifiers may be displayed on the same user interface, for example as shown in fig. 12A, or on different user interfaces, for example as shown in fig. 12B.
In some embodiments, when the source device currently runs at least two applications supporting the screen projection function, step 807 displays, on the outer screen, the identifiers of one or more of those applications. When the source device currently runs only one application supporting the screen projection function, the source device may obtain the screen-projected content from that application without prompting the user with its identifier. Alternatively, in that case, the source device may still prompt the user with the identifier of the application, but obtain the screen-projected content from it without requiring the user to select the identifier.
In still other embodiments, step 807 of displaying, on the external screen, the identifier of at least one electronic device for receiving the screen projection content is executed only when the source device obtains the identifiers of at least two such electronic devices. For example, if the source device obtains the identifier of only one electronic device for receiving the screen projection content, that identifier may not be displayed on the external screen; alternatively, it may be displayed, but the source device may select it without requiring any user operation.
It should be noted that, when none of the currently running applications of the source device supports the screen projection function, or no application is currently running, and/or no identifier of an electronic device for receiving the screen projection content is acquired, the source device does not perform screen projection.
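The precondition above can be summarized in a short sketch: projection proceeds only if at least one running application supports screen projection and at least one receiving device was discovered. The names (`App`, `should_project`) are illustrative, not from the patent.

```python
# Hypothetical sketch of the "do not project" rule: the source device
# proceeds only when a projectable application AND a receiver both exist.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    supports_projection: bool

def should_project(running_apps, receiver_ids):
    """Return True only when screen projection can proceed."""
    projectable = [a for a in running_apps if a.supports_projection]
    return bool(projectable) and bool(receiver_ids)

apps = [App("iQIYI", True), App("Calculator", False)]
print(should_project(apps, ["TV-20"]))   # True: projectable app + receiver
print(should_project(apps, []))          # False: no receiver discovered
print(should_project([], ["TV-20"]))     # False: nothing running
```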
Step 808: after receiving the identifier of an application selected by the user from the identifiers of the at least one currently running application supporting the screen projection function, the source device obtains the screen projection content from the application identified by that identifier, and sends the screen projection content to the electronic device identified by the identifier the user selected from the identifiers of the at least one electronic device for receiving the screen projection content. It should be noted that the application identified by the identifier selected by the user may be referred to as the target application.
Further, following step 807, after receiving the identifier of an application selected by the user from the identifiers of the at least one currently running application supporting the screen projection function and the identifier of an electronic device selected from the identifiers of the at least one electronic device for receiving the screen projection content, or after determining the target device for receiving the screen projection content, the source device displays a screen-projection-in-progress user interface on the external screen, for example, the user interface shown in fig. 13.
It should be noted that, in the embodiments of the present application, the order in which the source device acquires the identifier of the application supporting the screen projection function and the identifier of the electronic device for receiving the screen projection content is not limited: the source device may acquire both at the same time, may acquire the identifier of the application first and then the identifier of the electronic device, or may acquire the identifier of the electronic device first and then the identifier of the application.
For example, suppose the identifier of the application supporting the screen projection function is obtained first, and the identifier of the electronic device for receiving the screen projection content afterwards. While acquiring the identifier of the at least one currently running application supporting the screen projection function, the source device may display a content-acquisition prompt on the external screen (for example, as shown in fig. 11C); after the acquisition completes, it displays the identifier of the at least one application on the external screen. After the user selects the identifier of a certain application, the source device may display a device-discovery prompt on the external screen (for example, as shown in fig. 11D); after acquiring the identifier of at least one electronic device for receiving the screen projection content, it displays the identifier of the at least one electronic device on the external screen; and in response to the user selecting the identifier of a certain electronic device, it displays a screen-projection-in-progress prompt on the external screen (for example, as shown in fig. 13). In this case, the source device may obtain the screen projection content and the identifier of the at least one electronic device for receiving the screen projection content at the same time.
For example, when the source device does not acquire the identifier of any application supporting the screen projection function, the source device may prompt the user that screen projection has failed, and may further prompt the user with the reason for the failure. For another example, when the source device does not obtain the identifier of any electronic device for receiving the screen projection content, the source device may likewise prompt the user that screen projection has failed and, further, the reason for the failure.
It should be noted that, in the embodiments of the present application, step 806 may be located before step 805, after step 807, or after step 808, or step 806 may be executed simultaneously with step 807 or step 808, and so on, without limitation; however, once the source device detects that the external screen is occluded, it terminates the screen projection flow.
Further, based on steps 801 to 803, or based on steps 801, 802, and 805 to 807, the source device displays a control interface on the external screen after successfully sending the screen projection content to the target device. The control interface comprises virtual buttons with a touch function, and these virtual buttons differ between applications. After receiving a user operation on the control interface, the source device responds to it by controlling the target application, thereby controlling the screen projection content. The user operation on the control interface may be an operation on a virtual button that controls a certain function, or a shortcut gesture operation on the control interface, without limitation.
For example, for a video-type application such as iQIYI, when the screen projection content is the content on the user interface of iQIYI, the control interface may include a progress bar, a pause button, a fast-forward button, an episode-selection button, a definition button, and the like, as shown in fig. 14A. For example, if the source device receives an operation of the definition button by the user and switches the definition from standard definition to high definition, the screen projection content presented on the target device switches from standard definition to high definition.
For another example, for an audio-type application such as Kuwo Music, when the screen projection content is the content on the user interface of Kuwo Music, the control interface may include a progress bar, a pause button, a fast-forward button, a menu button, and the like, as shown in fig. 14B. For example, when the source device receives an operation of the user clicking the pause button, audio playback is paused on the target device.
For another example, for a social-video application such as Douyin, when the screen projection content is the content on the user interface of Douyin, the control interface may include a touch area and function buttons such as like, comment, and share, as shown in fig. 14C. The user can switch videos by sliding up and down in the touch area, or tap in the touch area to pause or resume video playback, and the like.
For another example, for a game-type application such as Snake, when the screen projection content is the content on the user interface of Snake, the control interface may include up, down, left, and right function buttons, and the like, as shown in fig. 14D, with which the user can control the snake's direction of movement. Alternatively, when the screen projection content is the content on the user interface of Snake, the control interface may be more visual and animated, as shown in fig. 14E; that is, the control interface is a gamepad that presents virtual buttons to the user for controlling the game.
For another example, for a messaging application such as WeChat or QQ, when the screen projection content is the content on the user interface of WeChat, the control interface may include an input method, as shown in fig. 14F or fig. 14G. When text input is used on the WeChat user interface, the control interface may be as shown in fig. 14F; when voice input is used, the control interface may be as shown in fig. 14G.
Specifically, after step 803 is executed, or after the source device receives, in step 808, the identifier of an application selected by the user from the identifiers of the at least one currently running application supporting the screen projection function and takes the application identified by that identifier as the target application, the source device may display the corresponding control interface on the external screen according to the target application.
Example 1: the source device may determine, from preset control interfaces corresponding to different types of applications, the control interface corresponding to the type to which the target application belongs, and then display that control interface on the external screen. For example, if the target application is iQIYI, the type of the application is the video type. The source device determines the control interface corresponding to video-type applications from the preset control interfaces and displays it on the external screen. This helps simplify the implementation. It should be noted that the preset control interface may differ by application type; for example, for a video-type application it may be as shown in fig. 14A, and for an audio-type application as shown in fig. 14B. Specifically, the control interfaces corresponding to the different types of applications may be preset in the electronic device before it leaves the factory, or the electronic device may obtain them in advance from a server according to the applications installed on it.
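Example 1 amounts to a lookup from application type to a preset control interface. The sketch below assumes a simple in-memory table; the patent only says such a table may be preconfigured before shipping or fetched from a server, so the concrete entries and names here are illustrative.

```python
# Illustrative type→control-interface lookup for Example 1.
# Button lists loosely mirror figs. 14A/14B and the fig. 15C fallback.
PRESET_CONTROLS = {
    "video": ["progress_bar", "pause", "fast_forward", "episodes", "definition"],
    "audio": ["progress_bar", "pause", "fast_forward", "menu"],
    "game":  ["touch_area"],
}

# Hypothetical mapping from installed application to its type.
APP_TYPES = {"iQIYI": "video", "Kuwo Music": "audio", "Snake": "game"}

def control_interface_for(app_name):
    """Pick the preset control interface for the target application's type."""
    app_type = APP_TYPES.get(app_name)
    return PRESET_CONTROLS.get(app_type, ["touch_area"])  # generic fallback

print(control_interface_for("iQIYI"))
```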
Because some applications, for example some game-type applications, have their own distinctive control buttons, a preset control interface may not meet the user's requirements. The embodiments of the present application therefore further provide another method for displaying the control interface on the external screen.
Example 2: the source device identifies, from the user interface of the target application, virtual buttons with a touch function (a virtual button may also be referred to as a user interface (UI) element, a virtual key, a control, or the like); such a button can be clicked, touched, or pressed by the user to trigger the corresponding function (such as pausing playback or fast-forwarding). The electronic device then re-lays out, crops, and/or scales the identified virtual buttons, generates the control interface corresponding to the target application, and displays it on the external screen. The control interface comprises at least one virtual button that enables the user to perform shortcut operations on the target application.
It should be noted that the at least one virtual button may include buttons having the same functions as all of the virtual buttons recognized by the source device from the user interface of the target application, or only as some of them.
The control interface includes virtual buttons having the same functions as the virtual buttons identified on the user interface of the target application, which can be implemented as follows:
1. Map the virtual buttons on the control interface onto the user interface of the target application. For example, the position coordinates of a virtual button on the control interface are mapped to the position coordinates of the virtual button with the same function identified on the user interface of the target application.
2. Generate the control interface by directly changing the layout of the touch-capable virtual buttons on the user interface of the target application, so that the control interface includes virtual buttons with the same functions as those identified on the user interface of the target application.
3. Fit the virtual buttons on the control interface to the universal service interface of the target application, so that they have the same functions as the virtual buttons identified on the user interface of the target application.
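Approach 1 can be sketched as a coordinate mapping: a tap inside a button's rectangle on the control interface is translated to the corresponding point inside the matching button's rectangle on the application's user interface. The linear mapping below is an assumption for illustration; the patent only requires that the position coordinates be associated.

```python
# Minimal sketch of mapping a control-interface tap back to the app UI.
# Rectangles are (x, y, width, height); all values are hypothetical.
def map_tap(tap, src_rect, dst_rect):
    """Map a point inside src_rect (button on the control interface)
    to the corresponding point inside dst_rect (button on the app UI)."""
    sx, sy, sw, sh = src_rect
    dx, dy, dw, dh = dst_rect
    rel_x = (tap[0] - sx) / sw      # relative position within the source button
    rel_y = (tap[1] - sy) / sh
    return (dx + rel_x * dw, dy + rel_y * dh)

# Tapping the centre of a 100x40 pause button on the outer screen lands on
# the centre of the 60x24 pause button in the app UI.
print(map_tap((50, 20), (0, 0, 100, 40), (300, 500, 60, 24)))  # (330.0, 512.0)
```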
In other embodiments, the at least one virtual button may also include function buttons other than those matching the virtual buttons identified on the user interface of the target application, such as a cancel-screen-projection button, a button for switching the target device that receives the screen projection content, and/or a button for switching the target application being projected. Taking the cancel-screen-projection button as an example, the electronic device receives an operation of the user clicking the cancel-screen-projection button and, in response, ends the screen projection.
Taking Honor of Kings as the target application, its user interface may be as shown in A in fig. 15. The source device recognizes the virtual button 1501 in A in fig. 15, and then re-crops and re-lays out the icon of the virtual button 1501 to obtain the virtual button 1502 shown in B in fig. 15. The virtual button 1502 is included on the control interface corresponding to Honor of Kings and has the same function as the virtual button 1501. So that operating the virtual button 1502 has the same effect as operating the virtual button 1501, the position coordinates of the virtual button 1502 are, for example, associated with the position coordinates of the virtual button 1501.
In some embodiments, the source device may identify the touch-capable virtual buttons based on historical usage records of the target application (for example, a history of the user's on-screen click operations). Alternatively, the source device may identify them through a software development kit (SDK) interface provided by the target application, or from a predefined location area in the user interface of the target application that is used for laying out touch-capable virtual buttons. In addition, the embodiments of the present application allow other identification manners; for example, the source device may identify the touch-capable virtual buttons by performing semantic analysis on the user interface of the target application (for example, semantic analysis of the registration information of the virtual buttons), by performing image recognition on the user interface of the target application, and the like.
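One of the identification manners above, inferring touchable regions from the user's historical click operations, can be hedged into a small sketch: bucket past taps into a coarse grid and treat cells tapped repeatedly as likely virtual buttons. The grid size and hit threshold are assumptions, not parameters from the patent.

```python
# Hypothetical click-history heuristic for locating touch-capable buttons.
from collections import Counter

def likely_buttons(click_history, cell=100, min_hits=3):
    """Return grid cells (cell-sized squares) tapped at least min_hits times."""
    cells = Counter((x // cell, y // cell) for x, y in click_history)
    return sorted(c for c, hits in cells.items() if hits >= min_hits)

history = [(30, 40), (35, 45), (32, 41),   # repeated taps: likely a button
           (520, 900)]                      # a stray tap: ignored
print(likely_buttons(history))  # [(0, 0)]
```

In practice a real implementation would cluster raw touch events and intersect them with the view hierarchy, but the thresholding idea is the same.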
Further, limited by the capability of the electronic device, the touch-capable virtual buttons on the user interface may not be identifiable for some applications. Therefore, to simplify displaying the control interface on the external screen while still meeting the user's requirements, when no touch-capable virtual button is identified from the user interface of the target application, the source device may determine the control interface corresponding to the type of the target application from the preconfigured control interfaces corresponding to different application types and display it on the external screen. For the specific implementation, refer to the related description of example 1, which is not repeated here.
Take Honor of Kings as the target application. Its user interface may be as shown in A in fig. 15. When the source device does not recognize any touch-capable virtual button on the user interface shown in A in fig. 15, and the preset control interface corresponding to the game type is as shown in C in fig. 15, the source device displays the control interface shown in C in fig. 15 on the external screen. As shown in C in fig. 15, this control interface includes a touch area in which the user can slide up, down, left, and right, and so on, thereby controlling the game.
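The fallback just described composes Example 2 with Example 1: try to recognize buttons, and fall back to the preset control interface for the application's type if none are found. In this sketch, `identify_buttons` stands in for any of the recognition methods; all names and the preset table are illustrative.

```python
# Sketch of the Example 2 → Example 1 fallback for building the control interface.
def build_control_interface(app_type, identify_buttons):
    preset = {"game": ["touch_area"], "video": ["pause", "progress_bar"]}
    buttons = identify_buttons()              # Example 2: recognition attempt
    if buttons:
        return buttons                        # re-lay out recognized buttons
    return preset.get(app_type, ["touch_area"])  # Example 1: preset fallback

print(build_control_interface("game", lambda: []))           # ['touch_area']
print(build_control_interface("game", lambda: ["skill_1"]))  # ['skill_1']
```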
The above is merely an example of a method for the source device to display the control interface corresponding to the target application on the external screen; the specific implementation of this method in the embodiments of the present application is not limited.
In some embodiments, after the source device sends the screen projection content to the target device based on steps 801 to 803, or based on steps 801, 802, and 805 to 807, if the user unfolds the inner screen from the closed state to the expanded state, then in response to this operation the source device no longer sends the screen projection content to the target device, and displays the user interface of the application providing the screen projection content on the inner screen. For example, when the inner screen of the electronic device is in the closed state, a video on the user interface of iQIYI is projected to the smart television; when the user unfolds the inner screen to the expanded state, the projection stops in response to that operation, and the user interface of iQIYI is mapped to the inner screen for display. Furthermore, the outer screen no longer displays the control interface corresponding to iQIYI; for example, the outer screen may be turned off.
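The unfold behavior above is a small state transition: projection stops, the application's user interface moves to the inner screen, and the outer screen goes off. The `Projector` class below is a hypothetical model of that transition, not an API from the patent.

```python
# Illustrative model of the fold-out transition on a foldable source device.
class Projector:
    def __init__(self):
        self.casting = True                  # currently sending content
        self.inner_screen = None             # nothing shown while folded
        self.outer_screen = "control_interface"

    def on_inner_screen_unfolded(self, app_ui):
        self.casting = False                 # stop sending screen-cast content
        self.inner_screen = app_ui           # map the app UI to the inner screen
        self.outer_screen = "off"            # outer screen no longer needed

p = Projector()
p.on_inner_screen_unfolded("iQIYI_player")
print(p.casting, p.inner_screen, p.outer_screen)  # False iQIYI_player off
```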
Further, based on steps 801 to 803, or based on steps 801, 802, and 805 to 807, after receiving the screen projection content sent by the source device, the target device may crop or re-lay out the screen projection content and then present it. Alternatively, before sending the screen projection content, the source device may crop or re-lay out the content acquired from the target application according to the device attributes of the target device (e.g., resolution, touch capability) so that it suits presentation on the target device, and then send it. This helps the target device present the projected content normally.
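One concrete reading of the resolution-adaptation step is aspect-fit scaling: scale the captured frame to the target device's resolution while preserving aspect ratio (letterboxing). This is an assumption for illustration; the patent does not fix a particular cropping or re-layout algorithm.

```python
# Sketch of aspect-fit scaling of source content to a target resolution.
def fit_to_target(src_w, src_h, dst_w, dst_h):
    """Return the largest (w, h) with the source aspect ratio
    that fits inside the target resolution."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# A 1080x2340 phone frame fitted onto a 1920x1080 television:
print(fit_to_target(1080, 2340, 1920, 1080))  # (498, 1080)
```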
It should be noted that the method for displaying, by the source device, the control interface corresponding to the target application on the external screen in the embodiment of the present application may also be applied to other screen projection methods besides the embodiment of the present application, and is not limited to this.
Example two:
the source device is an electronic device including only the first display screen, such as a tablet computer, a mobile phone, and the like. For example, the first display screen of the source device may be the first display screen 141 shown in fig. 4A, and is located on the front side of the source device, and the back side of the source device does not include the display screen.
Illustratively, as shown in fig. 16, a schematic flow chart of another screen projection method according to an embodiment of the present application specifically includes the following steps.
Step 1601: when the first display screen of the source device is in use, a screen-off operation on the first display screen is received. For a description of the first display screen of the source device being in use, refer to the related description of the inner screen of the source device being in use in step 801, which is not repeated here.
The screen-off operation on the first display screen may be, but is not limited to, an operation of pressing a power key by a user, a voice instruction of the user, or an operation of clicking a virtual key for controlling screen-off by the user.
In step 1602, the source device turns off the first display screen in response to the screen-off operation on the first display screen.
In step 1603, the source device determines, after the first display screen is turned off, whether the smart screen projection function is enabled; if it is enabled, step 1604 is executed, and if not, the process ends.
In step 1604, the source device determines whether the first display screen is blocked, and if the first display screen is blocked, the current flow is ended. If the first display screen is not shielded, go to step 1605.
In step 1605, the source device determines the target application program from the currently running application program. The specific implementation manner of determining the target application from the currently running application by the source device may refer to related description in step 803, and is not described herein again.
In step 1606, the source device obtains the screen-casting content from the target application program, and sends the screen-casting content to the target device. The relevant implementation manner of step 1606 may refer to the relevant implementation manner of step 804, and is not described herein again.
It should be noted that the source device may skip step 1603 and step 1604 and execute step 1605 and step 1606 after executing step 1602.
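Steps 1601 to 1606 form a single decision flow: screen-off triggers projection when the smart screen-projection switch is on and the display is not occluded, with the option of skipping the checks as noted above. The sketch below models that flow; the flag `check_conditions` represents skipping steps 1603 and 1604, and all names are illustrative.

```python
# End-to-end sketch of the single-screen flow (steps 1601-1606).
def on_screen_off(smart_cast_on, screen_occluded, running_apps,
                  check_conditions=True):
    """Return the target application to project from, or None to end the flow."""
    if check_conditions:
        if not smart_cast_on or screen_occluded:   # steps 1603 / 1604
            return None                            # end the current flow
    # Step 1605: determine the target application from the running applications
    # (selection criteria are described in step 803 and omitted here).
    return running_apps[0] if running_apps else None  # step 1606 would send it

print(on_screen_off(True, False, ["iQIYI"]))    # projection proceeds
print(on_screen_off(False, False, ["iQIYI"]))   # switch off: no projection
```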
In other embodiments, as an alternative to step 1605 and step 1606, after step 1602 or step 1604, the method may further comprise:
step 1607, the source device displays an identifier of at least one application program supporting the screen projection function currently running on a partial area of the first display screen, and displays an identifier of at least one electronic device for receiving the screen projection content.
It should be noted that, in this embodiment, the size and position of the area on the first display screen for displaying the identifier of the at least one currently running application supporting the screen projection function and the identifier of the at least one electronic device for receiving the screen projection content may be set by the user as needed, or may be set by the source device before shipping, which is not limited herein. Illustratively, as shown in fig. 17, the source device displays the identifier of the at least one currently running application supporting the screen projection function, and the identifier of the at least one electronic device for receiving the screen projection content, in the area 1700 of the first display screen 141. Specifically, for the manner of displaying these identifiers in a partial area of the first display screen, refer to the related description in example one, which is not repeated here.
Further, when displaying, on a partial area of the first display screen, the identifier of the at least one currently running application supporting the screen projection function and the identifier of the at least one electronic device for receiving the screen projection content, the source device may also display the time, the date, and the like on the first display screen.
In step 1608, after receiving the identifier of one application selected by the user from the identifiers of the at least one application supporting the screen projection function currently running, the source device obtains the screen projection content from the application identified by the identifier of the application selected by the user, and sends the screen projection content to the electronic device identified by the identifier of the electronic device selected by the user from the identifiers of the at least one electronic device for receiving the screen projection content.
It should be noted that, in this embodiment of the application, when the first display screen is turned off, the user may perform corresponding operations on the identifier of the application program and the identifier of the electronic device displayed on the partial area of the first display screen, which is beneficial to reducing steps of the user operations.
Further, following step 1607, after receiving the identifier of an application selected by the user from the identifiers of the at least one currently running application supporting the screen projection function and the identifier of an electronic device selected from the identifiers of the at least one electronic device for receiving the screen projection content, or after determining the target device for receiving the screen projection content, the source device displays a screen-projection-in-progress user interface on a partial area of the first display screen, for example, the user interface shown in fig. 13.
Further, based on steps 1601 to 1606, or based on steps 1601, 1602, 1603, 1604, and 1607 to 1608, the source device displays the control interface on the partial area of the first display screen after successfully sending the screen projection content to the target device. For a related description of displaying the control interface on the partial area of the first display screen, reference may be made to a related description of displaying the control interface on the external screen in example one.
It should be noted that, an area for displaying the control interface on the first display screen and an area for displaying the identifier of the application program and the identifier of the electronic device in step 1607 may be the same or different, and are not limited thereto.
In some embodiments, after the source device sends the screen-casting content to the target device based on steps 1601 to 1606 or based on steps 1601, 1602, 1603, 1604, and 1607 to 1608, if an operation (e.g., an unlocking operation) of reusing the first display screen by the user is received, in response to receiving the unlocking operation of the user on the first display screen, the screen-casting content is no longer sent to the target device, and the user interface of the application program where the screen-casting content is located is displayed on the first display screen. The operation of unlocking the first display screen by the user may be an operation of inputting a fingerprint, an operation of inputting a password, and the like, which is not limited.
For example, after the first display screen of the electronic device is turned off, the video on the user interface of iQIYI is projected to the smart television; when the first display screen of the electronic device is unlocked, the screen projection stops, and the user interface of iQIYI is mapped to the first display screen for display.
Further, based on steps 1601 to 1606, or based on steps 1601, 1602, 1603, 1604, and 1607 to 1608, after receiving the screen projection content sent by the source device, the target device may crop or re-lay out the screen projection content and then present it. Alternatively, before sending the screen projection content, the source device may crop or re-lay out the content acquired from the target application according to the device attributes of the target device (e.g., resolution, touch capability) so that it suits presentation on the target device, and then send it. This helps the target device present the projected content normally.
Example three:
compared with example two, the source device is an electronic device comprising a first display screen and a second display screen. For example, the first display screen of the source device may be the first display screen 141 shown in fig. 4A, located on the front side of the source device, and the second display screen may be the second display screen 142 shown in fig. 4B, located on the back side of the source device.
For example, as shown in fig. 18, a flowchart of another screen projection method according to an embodiment of the present application specifically includes the following steps 1601 and 1602, and after the step 1602 is executed, the following steps are further executed:
step 1803: after the first display screen of the source device is turned off, the source device determines whether the smart screen projection function is enabled; if it is enabled, step 1804 is executed, and if not, the process ends.
In step 1804, the source device determines whether the first display screen and the second display screen are blocked, and if both the first display screen and the second display screen are blocked, the current flow is ended. If one of the first display screen and the second display screen is not blocked, step 1805 is executed.
For example, the source device first determines whether the second display screen is blocked, if the second display screen is not blocked, step 1805 is executed, if the second display screen is blocked, then determines whether the first display screen is blocked, if the first display screen is not blocked, step 1805 is executed, and if the first display screen is blocked, the process is ended.
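The check order just described prefers the second (rear) display for prompts, falls back to the first display, and ends the flow when both are occluded. A minimal sketch, with illustrative return values:

```python
# Sketch of the occlusion check order in step 1804 for a dual-screen device.
def pick_prompt_display(first_occluded, second_occluded):
    if not second_occluded:
        return "second_display"    # step 1807, first branch
    if not first_occluded:
        return "first_display"     # step 1807, second branch
    return None                    # both occluded: end the flow

print(pick_prompt_display(False, False))  # second_display
print(pick_prompt_display(False, True))   # first_display
print(pick_prompt_display(True, True))    # None
```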
Step 1805: the source device determines a target application program from the currently running application programs. For a specific implementation in which the source device determines the target application program, refer to the related description of step 803; details are not described herein again.
Step 1806: the source device acquires the screen projection content from the target application program and sends the screen projection content to the target device. For an implementation of step 1806, refer to the implementation of step 604; details are not described herein again.
It should be noted that the source device may alternatively skip step 1803 and step 1804, and perform step 1805 and step 1806 directly after performing step 1602.
In other embodiments, as an alternative to step 1805 and step 1806, after step 1803 or step 1804, the method may further include:
step 1807, when the second display screen is not blocked, the source device displays, on the second display screen, an identifier of at least one currently running application program supporting the screen projection function, and an identifier of at least one electronic device for receiving the screen projection content. When the second display screen is shielded and the first display screen is not shielded, the source end device displays an identifier of at least one application program supporting a screen projection function which is currently running and an identifier of at least one electronic device for receiving screen projection content on a partial area of the first display screen.
It should be noted that, for a specific implementation manner of displaying, on the second display screen, the identifier of the currently running at least one application program supporting the screen projecting function and displaying the identifier of the at least one electronic device for receiving the screen projecting content, reference may be made to the relevant description in example one, and details are not repeated here. For a way of displaying, in a partial area of the first display screen, an identifier of at least one currently running application program that supports a screen projection function, and an identifier of at least one electronic device for receiving screen projection content, reference may be made to the related description in example two, and details are not described here again.
Step 1808: after receiving an identifier of an application program that the user selects from the identifiers of the at least one currently running application program supporting the screen projection function, the source device acquires the screen projection content from the application program identified by the selected identifier, and sends the screen projection content to the electronic device identified by the identifier that the user selects from the identifiers of the at least one electronic device for receiving the screen projection content.
Further, in step 1808, after receiving the identifier of the application program selected by the user and the identifier of the electronic device selected by the user, or after determining the target device for receiving the screen projection content, the source device displays a user interface indicating that delivery is in progress, for example, the user interface shown in fig. 13, in a partial area of the first display screen.
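The selection flow of steps 1807 and 1808 can be sketched as below. All names are assumptions for illustration, not the patent's API: the device filters the running applications to those that support projection, validates the user's choices, and sends stand-in content to the chosen receiver.

```python
def handle_selection(running_apps, receivers, chosen_app, chosen_device, send):
    """Return True if content was sent, False if the selection was invalid."""
    projectable = [a["name"] for a in running_apps
                   if a.get("supports_projection")]
    if chosen_app not in projectable or chosen_device not in receivers:
        return False
    content = f"content-from-{chosen_app}"   # stand-in for captured content
    send(chosen_device, content)             # step 1808: send to the receiver
    return True
```

A non-projectable application (or an unknown receiver) is rejected before any content is captured or sent.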
Still further, based on steps 1601, 1602 and steps 1803 to 1806, or based on steps 1601, 1602, 1803, 1804 and steps 1807 to 1808, after the source device successfully sends the screen projection content to the target device, the source device displays a control interface in a partial area of the first display screen or on the second display screen. The control interface comprises a virtual button with a touch function. It should be noted that, for a specific implementation manner of displaying the control interface on the partial area of the first display screen or the second display screen, reference may be made to the related description in example one.
It should be further noted that the area of the first display screen used for displaying the control interface and the area used for displaying the identifier of the application program and the identifier of the electronic device in step 1807 may be the same or different, which is not limited herein.
Example four:
The source device is an electronic device including a flip cover, for example, the flip phone shown in fig. 19, which includes an inner screen and an outer screen; the inner screen is located on the inner side of the phone cover (not shown in the figure), and the outer screen is located on the outer side of the phone cover. Compared with example one, in which the source device is an electronic device with a foldable screen, the difference in the screen projection method is that when the source device is a flip phone, the source device receives an operation of the user closing the phone cover and turns off the inner screen. After the phone cover is opened, the source device can unlock the inner screen in response to the user entering a password or long-pressing a preset key (for example, the # key, the * key, or a key combination). That is, in the flip phone scenario, when the inner screen is in normal use and the source device receives the operation of the user closing the phone cover, the inner screen is locked in response to that operation, and screen projection is then triggered. For details, refer to example one; the source device performs the steps after step 802, which are not described herein again.
Example five:
The source device may also be an electronic device fitted with a smart protective cover. Taking the electronic device 10 as an example, as shown in fig. 20, the electronic device 10 includes a first display screen 141 and is snapped into the smart protective cover 20, where the smart protective cover 20 includes a visualization area 18. After the user closes the cover 16 of the smart protective cover 20, the device is as shown in C in fig. 20; after the user opens the cover 16 of the smart protective cover 20, the device is as shown in B in fig. 20.
In this scenario, compared with example one, in which the source device is an electronic device with a foldable screen, the difference in the screen projection method is that the source device receives an operation of the user closing the cover of the smart protective cover and turns off the first display screen. After the cover of the smart protective cover is opened, the source device can unlock the first display screen in response to the user entering a password, a fingerprint, or the like. That is, in this scenario, when the first display screen is in normal use and the source device receives the operation of the user closing the cover, the first display screen is locked in response to that operation, and screen projection is then triggered. For details, refer to example one; the source device performs the steps after step 602, which are not described herein again.
It should be noted that, in this scenario, the identifier of the application program, the identifier of the electronic device, and the control interface may be displayed in the visualization area 18 of the smart protective cover 20. For specific display manners, refer to the related descriptions in example one; details are not described herein again.
Further, in other embodiments, when the source device does not need to prompt the user with the identifier of the application program, the identifier of the electronic device, or the control interface, the smart protective cover may not include a visualization area.
Example six:
The source device may also be another electronic device with a foldable screen. The source device includes a first display screen, and the first display screen is foldable. For example, the first display screen of the source device may be as shown in fig. 5A in the unfolded state and as shown in fig. 5B in the closed state; in the closed state shown in fig. 5B, the source device may present a corresponding interface to the user through an area 500 of the first display screen 141. In this scenario, when the first display screen is in use in the unfolded state and the source device receives an operation of folding the first display screen from the unfolded state to the closed state, screen projection to the target device is triggered; for the specific screen projection method, refer to the related description in example one. For example, after the source device receives the operation of folding the first display screen 141 from the unfolded state to the closed state, if the screen projection fails, the content displayed on the first display screen 141 before the operation was received may be mapped to the area 500 of the folded first display screen 141 for display; if the screen projection succeeds, the control interface may be displayed in the area 500 of the first display screen 141. For a manner of displaying the control interface in the area 500, refer to the related description in example one.
Further, in other embodiments, when the first display screen of the source device is in the closed state and the source device receives an operation of unfolding the first display screen from the closed state to the unfolded state, the source device stops the screen projection in response to the operation. For example, after the source device stops the screen projection, the user interface in which the screen projection content is located may be automatically displayed on the first display screen.
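The fold/unfold lifecycle described above can be modelled with a small sketch; the class, state names, and UI strings are assumptions for illustration only. Folding triggers projection, and unfolding stops it and restores the projected content's user interface on the first display screen.

```python
class FoldableSource:
    """Toy model of the foldable source device in this example."""

    def __init__(self):
        self.state = "unfolded"
        self.projecting = False
        self.first_screen_ui = "app-ui"

    def fold(self):
        self.state = "closed"
        self.projecting = True           # folding triggers screen projection

    def unfold(self):
        self.state = "unfolded"
        if self.projecting:
            self.projecting = False      # unfolding stops the projection
            # restore the UI of the projected content on the first screen
            self.first_screen_ui = "projected-content-ui"
```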
Example seven:
The source device may also be an electronic device with a retractable screen, including a first display screen that is retractable. For example, the extended state of the first display screen of the source device may be as shown in fig. 6A, and the retracted state as shown in fig. 6B. In this scenario, when the first display screen is in use in the extended state and the source device receives a retraction operation on the first display screen, screen projection to the target device is triggered; for the specific screen projection method, refer to the related description in example one. For example, after the source device receives the retraction operation on the first display screen 141, if the screen projection fails, the content displayed on the first display screen 141 before the operation was received may be mapped to the area 600 of the retracted first display screen 141 for display; if the screen projection succeeds, the control interface may be displayed in the area 600 of the first display screen 141. For a manner of displaying the control interface in the area 600, refer to the related description in example one.
Further, in other embodiments, when the first display screen of the source device is in the retracted state and the source device receives an extension operation on the first display screen, the source device stops the screen projection in response to the operation. For example, after the source device stops the screen projection, the user interface in which the screen projection content is located may be automatically displayed on the first display screen.
The above embodiments can be used alone or in combination with each other to achieve different functions.
In the embodiments provided in this application, the method provided in the embodiments of this application is described from the perspective of an electronic device serving as the execution body. To implement the functions in the method provided in the embodiments of this application, the electronic device may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a combination of a hardware structure and a software module. Whether a function is implemented as a hardware structure, a software module, or a combination of the two depends on the particular application and the design constraints of the technical solution.
Based on the same concept, fig. 21 shows an apparatus 2100 provided by this application for performing the screen projection method shown in fig. 8, 16 or 18. Illustratively, the apparatus 2100 includes a processing module 2101 and a transceiver module 2102.
Illustratively, the processing module 2101 is configured to detect a user operation, and trigger the transceiver module 2102 to send the screen-shot content to the target device in response to the user operation.
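The division into a processing module and a transceiver module can be sketched structurally as below. The class names, the operation string, and the payload are illustrative assumptions: the processing module detects a user operation and, in response, triggers the transceiver module to send the screen projection content.

```python
class TransceiverModule:
    """Records what would be transmitted to the target device."""

    def __init__(self):
        self.sent = []

    def send(self, target, content):
        self.sent.append((target, content))


class ProcessingModule:
    """Detects user operations and triggers the transceiver module."""

    def __init__(self, transceiver):
        self.transceiver = transceiver

    def on_user_operation(self, operation, target, content):
        # e.g. folding the screen from the unfolded state to the closed state
        if operation == "fold_closed":
            self.transceiver.send(target, content)
```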
Based on the same concept, fig. 22 shows an apparatus 2200 provided by this application. The apparatus 2200 includes at least one processor 2210, a memory 2220, and a transceiver 2230. The processor 2210 is coupled to the memory 2220 and the transceiver 2230; the coupling in this embodiment is an indirect coupling or a communication connection between apparatuses, units, or modules, which may be electrical, mechanical, or in another form, and is used for information exchange between the apparatuses, units, or modules. The connection medium among the transceiver 2230, the processor 2210, and the memory 2220 is not limited in the embodiments of this application. For example, in fig. 22, the memory 2220, the processor 2210, and the transceiver 2230 may be connected through a bus, and the bus may be divided into an address bus, a data bus, a control bus, and the like.
In particular, memory 2220 is used to store program instructions.
The transceiver 2230 is configured to send screen projection content, control instructions, and the like to the target device.
Processor 2210 is used to call program instructions stored in memory 2220 to cause device 2200 to perform the screen projection method shown in fig. 8, 16, or 18.
In the embodiments of the present application, the processor 2210 may be a general purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
In this embodiment, the memory 2220 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or may be a volatile memory such as a random-access memory (RAM). The memory may also be, but is not limited to, any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory in the embodiments of this application may also be a circuit or any other apparatus capable of implementing a storage function, configured to store program instructions and/or data.
It should be understood that the apparatus 2100 and the apparatus 2200 may be used to implement the method shown in fig. 8, fig. 16, or fig. 18 in the embodiments of this application. For related features, refer to the foregoing description; details are not described herein again.
It is clear to those skilled in the art that the embodiments of this application can be implemented in hardware, software, firmware, or a combination thereof. When implemented in software, the functions described above may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation, the computer-readable medium may include a RAM, a ROM, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection is properly termed a computer-readable medium. For example, if software is transmitted from a website, a server, or another remote source using a coaxial cable, a fiber optic cable, a twisted pair, a digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used in the embodiments of this application, include a compact disc (CD), a laser disc, an optical disc, a digital versatile disc (DVD), a floppy disk, and a Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
In short, the above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modifications, equivalents, improvements and the like made in accordance with the disclosure of the present application are intended to be included within the scope of the present application.

Claims (14)

1. A screen projection method is applied to a first electronic device, the first electronic device comprises an inner screen and an outer screen, the inner screen is a foldable screen, and the method comprises the following steps:
the method comprises the steps that a first electronic device receives operation of folding the inner screen from an unfolded state to a closed state;
in response to the operation of folding the inner screen from the unfolded state to the closed state, the first electronic device turns off the inner screen and acquires screen projection content from a target application program in at least one currently running application program;
and the first electronic equipment sends the screen projection content to second electronic equipment.
2. The method of claim 1, wherein after the inner screen of the first electronic device is turned off in response to the operation of folding the inner screen from the unfolded state to the closed state and before screen projection content is obtained from a target application of the currently running at least one application, the method further comprises:
determining that an intelligent screen projection function is started; and/or
prompting the user whether screen projection is allowed, and receiving an operation of the user allowing the screen projection; and/or
determining that the outer screen is not occluded.
3. The method of claim 2, wherein the method further comprises:
if the outer screen is blocked, the first electronic device no longer performs screen projection.
4. The method of any of claims 1-3, wherein after the first electronic device sends the projected content to the second electronic device, the method further comprises:
and the first electronic equipment displays a control interface on the outer screen, wherein the control interface is used for realizing quick operation of the target application program.
5. The method of claim 4, wherein the first electronic device displays a control interface on the outer screen, comprising:
the first electronic device identifies a virtual button with a touch function in the target application program, and displays the control interface on the outer screen according to the virtual button with the touch function; and/or
the first electronic device determines a control interface corresponding to the type of the target application program from preset control interfaces corresponding to the types of the application programs, and displays the control interface corresponding to the type of the target application program on the outer screen.
6. The method of any one of claims 1 to 3, wherein before the first electronic device obtains the screen-shot content from the target application of the currently running at least one application, the method further comprises:
the first electronic equipment displays an identifier of at least one application program supporting a screen projection function in at least one currently running application program on the outer screen;
the first electronic equipment receives an operation that a user selects an identifier of an application program displayed on an outer screen;
and responding to the operation of selecting the identification of the application program displayed on the outer screen by the user, and determining that the target application program is the application program identified by the identification of the application program selected by the user.
7. The method of any of claims 1-3, wherein after the first electronic device has turned an internal screen of the first electronic device off, and before the projected content is sent to the second electronic device, the method further comprises:
the first electronic equipment acquires an identifier of at least one piece of electronic equipment;
the first electronic device determines an identifier of a target electronic device from identifiers of the at least one electronic device, wherein the identifier of the target electronic device is used for identifying the second electronic device.
8. The method of claim 7, wherein the first electronic device determining an identity of a target electronic device from the identities of the at least one electronic device, comprises:
and the first electronic equipment determines the identifier for identifying the private electronic equipment as the identifier of the target electronic equipment from the identifiers of the at least one electronic equipment.
9. The method of claim 7, wherein the first electronic device determining an identity of a target electronic device from the identities of the at least one electronic device, comprises:
the first electronic device displays an identification of at least one electronic device on the outer screen;
the first electronic equipment receives an operation that a user selects an identifier of the electronic equipment displayed on the outer screen;
and the first electronic equipment takes the identifier of the electronic equipment selected by the user as the identifier of the target electronic equipment.
10. The method of claim 9, wherein each of the identities of the at least one electronic device is used to identify a common electronic device.
11. The method of any of claims 1-3, wherein after the first electronic device transmits the screen-cast content to a second electronic device, the method further comprises:
the first electronic equipment receives an operation of unfolding the inner screen from a closed state to an unfolded state;
in response to the operation of unfolding the inner screen from the closed state to the unfolded state, the first electronic device stops screen projection and displays screen projection content on the inner screen.
12. An electronic device, wherein the electronic device comprises a processor and a memory;
the memory has stored therein program instructions;
the program instructions, when executed, cause the electronic device to perform the method of any of claims 1 to 11.
13. A chip, wherein the chip is coupled to a memory in an electronic device, such that when run, the chip invokes program instructions stored in the memory to implement the method of any of claims 1 to 11.
14. A computer-readable storage medium, comprising program instructions which, when run on an apparatus, cause the apparatus to perform the method of any one of claims 1 to 11.
CN201910704758.6A 2019-07-31 2019-07-31 Screen projection method and electronic equipment Active CN112394891B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910704758.6A CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment
CN202310208263.0A CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment
PCT/CN2020/106096 WO2021018274A1 (en) 2019-07-31 2020-07-31 Screen projection method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910704758.6A CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310208263.0A Division CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Publications (2)

Publication Number Publication Date
CN112394891A CN112394891A (en) 2021-02-23
CN112394891B true CN112394891B (en) 2023-02-03

Family

ID=74230363

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910704758.6A Active CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment
CN202310208263.0A Pending CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310208263.0A Pending CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Country Status (2)

Country Link
CN (2) CN112394891B (en)
WO (1) WO2021018274A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118259861B (en) * 2021-02-26 2024-10-29 华为技术有限公司 Screen-throwing display method and electronic equipment
CN115131547A (en) * 2021-03-25 2022-09-30 华为技术有限公司 Method, device and system for image interception by VR/AR equipment
CN113259757A (en) * 2021-04-08 2021-08-13 读书郎教育科技有限公司 Method for video screen projection by being convenient and fast to be compatible with multiple applications
CN113138737B (en) 2021-04-16 2023-11-03 阿波罗智联(北京)科技有限公司 Display control method, device, equipment, medium and program product for screen-throwing scene
CN113268211B (en) * 2021-05-13 2023-05-12 维沃移动通信(杭州)有限公司 Image acquisition method, device, electronic equipment and storage medium
CN115373558A (en) * 2021-05-18 2022-11-22 广州视源电子科技股份有限公司 Screen projection method, device, equipment and storage medium
EP4379544A4 (en) * 2021-09-09 2024-08-21 Huawei Tech Co Ltd System and method for displaying and controlling remote device task
CN114063951B (en) * 2021-09-26 2022-12-02 荣耀终端有限公司 Screen projection abnormity processing method and electronic equipment
CN113849146A (en) * 2021-09-30 2021-12-28 联想(北京)有限公司 Display method, display device and computer storage medium
CN114035973A (en) * 2021-10-08 2022-02-11 阿波罗智联(北京)科技有限公司 Screen projection method and device of application program, electronic equipment and storage medium
CN114089940B (en) * 2021-11-18 2023-11-17 佛吉亚歌乐电子(丰城)有限公司 Screen projection method, device, equipment and storage medium
CN114168097A (en) * 2021-12-09 2022-03-11 Oppo广东移动通信有限公司 Screen control method, screen control device, electronic equipment, storage medium and computer program
CN114428599A (en) * 2022-01-30 2022-05-03 深圳创维-Rgb电子有限公司 Screen projection brightness control method and device, storage medium and screen projector
CN114564168B (en) * 2022-02-28 2024-10-01 深圳创维-Rgb电子有限公司 Screen projection reading method, electronic equipment and readable storage medium
CN114786058B (en) * 2022-04-27 2024-02-06 南京欧珀软件科技有限公司 Multimedia data display method, device, terminal and storage medium
CN116048350B (en) * 2022-07-08 2023-09-08 荣耀终端有限公司 Screen capturing method and electronic equipment
CN117850644A (en) * 2022-09-30 2024-04-09 华为技术有限公司 Window switching method and electronic equipment
CN115424555A (en) * 2022-09-30 2022-12-02 联想(北京)有限公司 Display control method and device
CN115665473A (en) * 2022-10-14 2023-01-31 维沃移动通信有限公司 Screen projection method and device, electronic equipment and storage medium
CN115802113B (en) * 2022-10-27 2024-06-04 北京奇艺世纪科技有限公司 Screen projection control method, system, device and computer readable storage medium
CN115964011B (en) * 2023-03-16 2023-06-06 深圳市湘凡科技有限公司 Method and related device for displaying application interface based on multi-screen cooperation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6466369B1 (en) * 1998-03-11 2002-10-15 Alan Maddock Portable visual display device with a collapsible presentation screen
JP2003101909A (en) * 2001-09-25 2003-04-04 Matsushita Electric Ind Co Ltd Portable electronic equipment and image display device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103369070A (en) * 2012-04-04 2013-10-23 朱洪来 Three-screen flip intelligent handset
KR20140140957A (en) * 2013-05-30 2014-12-10 삼성전자주식회사 Method for mirroring screen data, machine-readable storage medium and electronic device
US20140372896A1 (en) * 2013-06-14 2014-12-18 Microsoft Corporation User-defined shortcuts for actions above the lock screen
CN103399643A (en) * 2013-08-23 2013-11-20 深圳市金立通信设备有限公司 Application program starting method of flexible terminal and flexible terminal
KR102538955B1 (en) * 2016-03-02 2023-06-01 삼성전자 주식회사 Electronic apparatus and method for displaying and transmitting image thereof
CN107589973A (en) * 2017-08-29 2018-01-16 珠海格力电器股份有限公司 Method and device for starting application and electronic equipment
CN107659712A (en) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 A kind of method, apparatus and storage medium for throwing screen
CN109871147B (en) * 2019-02-22 2020-12-01 华为技术有限公司 Touch screen response method and electronic equipment
CN109992231B (en) * 2019-03-28 2021-07-23 维沃移动通信有限公司 Screen projection method and terminal
CN110058828B (en) * 2019-04-01 2022-06-21 Oppo广东移动通信有限公司 Application program display method and device, electronic equipment and storage medium
CN110308885B (en) * 2019-06-25 2022-04-01 维沃移动通信有限公司 Screen projection method and mobile terminal

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6466369B1 (en) * 1998-03-11 2002-10-15 Alan Maddock Portable visual display device with a collapsible presentation screen
JP2003101909A (en) * 2001-09-25 2003-04-04 Matsushita Electric Ind Co Ltd Portable electronic equipment and image display device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhixin Shen; Reika Sato; Zhejun Liu; Tomoyuki Takami. "Super large screen games with interactive floor display". IEEE. 2017. *
Energy-saving technologies for dual-display mobile terminals from a patent perspective; Deng Cheng; Communication Enterprise Management; 2016-02-10 (No. 02); pp. 78-79 *
Development and application of an intelligent terminal for HD audio/video wireless screen projection based on an IoT architecture; Zheng Feng; Intelligent Building; 2015-04-06 (No. 04); pp. 76-80 *

Also Published As

Publication number Publication date
CN112394891A (en) 2021-02-23
CN116185324A (en) 2023-05-30
WO2021018274A1 (en) 2021-02-04

Similar Documents

Publication Publication Date Title
CN112394891B (en) Screen projection method and electronic equipment
US12120596B2 (en) Method and device for controlling connection to network
US11747953B2 (en) Display method and electronic device
CN115297200A (en) Touch method for a device with a folding screen, and folding-screen device
US20240220071A1 (en) Display control method, electronic device, and computer storage medium
US11435975B2 (en) Preview display method based on multi-angle and communication system
CN110198362B (en) Method and system for adding intelligent household equipment into contact
US11886830B2 (en) Voice call translation capability negotiation method and electronic device
CN114079691B (en) Equipment identification method and related device
EP4141634A1 (en) Control method applied to electronic device, and electronic device
CN110795187A (en) Image display method and electronic equipment
CN114185503B (en) Multi-screen interaction system, method, device and medium
CN109542325B (en) Double-sided screen touch method, double-sided screen terminal and readable storage medium
CN114090140B (en) Interaction method between devices based on pointing operation and electronic device
WO2024045801A1 (en) Method for screenshotting, and electronic device, medium and program product
JP2024521007A (en) Unlocking method and electronic device
WO2022100219A1 (en) Data transfer method and related device
CN113391775A (en) Man-machine interaction method and equipment
CN111132047A (en) Network connection method and device
KR20180017638A (en) Mobile terminal and method for controlling the same
CN115242994A (en) Video call system, method and device
CN111176777B (en) Call information processing method and electronic equipment
KR20210069719A (en) Information display method and device
JP7532658B2 (en) Incoming call prompting method and electronic device
US20240370218A1 (en) Screen sharing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant