CN116185324A - Screen projection method and electronic equipment - Google Patents

Screen projection method and electronic equipment

Info

Publication number
CN116185324A
CN116185324A (application CN202310208263.0A)
Authority
CN
China
Prior art keywords
screen
electronic device
user
application program
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310208263.0A
Other languages
Chinese (zh)
Inventor
周星辰
范振华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202310208263.0A
Publication of CN116185324A
Legal status: Pending

Classifications

    • G06F 3/1454: Digital output to display device; copying the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 9/451: Execution arrangements for user interfaces
    • H04M 1/72454: User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to context-related or environment-related conditions

Abstract

A screen projection method and an electronic device relate to the field of terminal technologies, can be applied to foldable-screen devices, and help realize intelligent screen projection. The method is applied to a first electronic device that includes an inner screen and an outer screen, the inner screen being a foldable screen, and the method includes the following steps: after receiving an operation of folding the inner screen from the unfolded state to the closed state, the first electronic device, in response to that operation, turns off the inner screen, acquires screen projection content from a target application among at least one currently running application, and then sends the screen projection content to a second electronic device. This solution allows a user to start screen projection on the electronic device with a single operation, which simplifies the projection operations and improves projection efficiency.

Description

Screen projection method and electronic equipment
Technical Field
The application relates to the technical field of terminals, in particular to a screen projection method and electronic equipment.
Background
Screen projection refers to projecting the content on an electronic device A onto an electronic device B by using a wireless communication technology, so that electronic device B can display the content of electronic device A. For example, through screen projection, content on an electronic device with a smaller display (such as a mobile phone or a tablet computer) can be transmitted to an electronic device with a larger display (such as a television or a projector), so that the user can watch that content on the larger display for a better viewing effect.
As screen projection technology becomes more widespread, research on intelligent screen projection has significant practical value.
Disclosure of Invention
Embodiments of the present application provide a screen projection method and an electronic device, which help reduce the complexity of the operations that trigger an electronic device to initiate screen projection and improve the efficiency with which a source device is triggered to initiate projection.
In a first aspect, a screen projection method of an embodiment of the present application is applied to a first electronic device, where the first electronic device includes a first display screen, and the method includes:
the first electronic device receives a first operation;
in response to the first operation, the first electronic device turns off the first display screen and acquires screen projection content from a target application among at least one currently running application;
the first electronic device sends the screen projection content to the second electronic device.
In this embodiment of the present application, the first electronic device can initiate screen projection in response to the first operation, so the user can initiate projection by operating the first electronic device only once. This reduces the complexity of the operations that trigger the electronic device to initiate projection and improves the efficiency of triggering the first electronic device to initiate projection.
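The single-operation flow of the first aspect can be sketched as follows. This is an illustrative model only; every class and method name here is hypothetical and not from the patent.

```python
# Hypothetical sketch: one "first operation" (e.g. a screen-off gesture)
# both turns off the first display and starts screen projection.

class Link:
    """Stands in for the connection to the second electronic device."""
    def __init__(self):
        self.received = []

    def send(self, content):
        self.received.append(content)

class FirstDevice:
    def __init__(self, running_apps, link):
        self.running_apps = running_apps   # currently running applications
        self.display_on = True
        self.link = link

    def on_first_operation(self):
        # Step 1: turn off (extinguish) the first display screen.
        self.display_on = False
        # Step 2: acquire projection content from a target application
        # among the running applications (here simply the first one).
        target_app = self.running_apps[0]
        content = target_app["content"]
        # Step 3: send the projection content to the second device.
        self.link.send(content)
        return content

link = Link()
device = FirstDevice([{"name": "VideoApp", "content": "movie-frame"}], link)
device.on_first_operation()
print(device.display_on)   # False
print(link.received)       # ['movie-frame']
```

After the one operation, the display is off and the content has been handed to the link, matching the three method steps above.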
In one possible design, the first operation is a screen-off operation, for example clicking a power button or closing the cover of a smart protective case. As another example, the first electronic device is a flip (clamshell) device and the first operation is an operation of closing the device cover. As another example, the first display is a foldable display, and the first operation may be an operation of folding the first display from the unfolded state to the closed state. For an outward-folding display, the first electronic device may, in response to that folding operation, turn off a partial area of the first display screen or turn off the entire first display screen. As another example, the first display is a retractable display, and the first operation may be an operation of retracting the first display. For a retractable display, similarly to an outward-folding display, the first electronic device may turn off a partial area (i.e., the retracted portion) of the first display, or the entire first display, in response to the retracting operation.
In one possible design, after the first display screen is turned off in response to the first operation and before the screen projection content is acquired from the target application among the at least one currently running application, the method further includes:
determining that an intelligent screen projection function is enabled; and/or
prompting the user whether to allow screen projection, and receiving an operation by which the user allows screen projection.
This technical solution improves interaction with the user.
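The pre-projection gate described above ("enabled switch and/or user consent") can be sketched as a small predicate. The function name and parameters are illustrative assumptions, not names from the patent.

```python
# Hypothetical gate applied after screen-off and before content capture:
# project only if the intelligent-projection switch is on, and, when a
# prompt is configured, only if the user allows it.

def may_project(smart_projection_enabled, ask_user=None):
    """ask_user: callable returning True if the user allows projection,
    or None when no prompt is configured."""
    if not smart_projection_enabled:
        return False
    if ask_user is not None:
        return ask_user()        # user explicitly allows or declines
    return True

print(may_project(True))                     # True: switch on, no prompt
print(may_project(False))                    # False: switch off
print(may_project(True, lambda: False))      # False: user declined
```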
In another possible design, before the first electronic device sends the screen projection content to the second electronic device, the method further includes:
determining the environment in which the first electronic device is located, so as to identify whether the user wants to project the screen. For example, the first electronic device determines whether a display screen is blocked: if the first display screen is a foldable inner screen and the first electronic device further includes an outer screen, the first electronic device may determine whether the outer screen is blocked; if the first electronic device includes only the first display screen, it may determine whether the first display screen is blocked. When the display screen of the first electronic device is not blocked, the first electronic device sends the screen projection content to the second electronic device. By judging whether the display screen is blocked, the device can identify whether it has been placed in a bag or a pocket; generally, a user who has put the device in a bag or pocket is not using it and probably does not need screen projection. Furthermore, the first electronic device can learn the user's habits from the history of the user's operation of the device, and decide accordingly whether to send the screen projection content to the second electronic device.
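The environment check above (occlusion as a bag/pocket proxy, optionally refined by usage history) could look like the following sketch. The habit model, a set of hours at which the user historically projects, is purely an assumption for illustration.

```python
# Hypothetical environment check: skip projection when the (outer) display
# is occluded -- e.g. the phone is in a bag or pocket -- and optionally
# consult the user's learned habit for the current hour of day.

def should_project(screen_occluded, habit_hours=None, hour=None):
    if screen_occluded:
        return False            # likely in a bag/pocket; user not watching
    if habit_hours is not None and hour is not None:
        # habit_hours: hours at which the user historically projects,
        # learned from the device-usage history
        return hour in habit_hours
    return True

print(should_project(screen_occluded=True))                   # False
print(should_project(screen_occluded=False))                  # True
print(should_project(False, habit_hours={20, 21}, hour=9))    # False
```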
In one possible design, after the first electronic device sends the screen projection content to the second electronic device, a control interface is displayed on a second display screen, where the control interface is used to implement shortcut operations on the target application. The second display screen may be part or all of the display area of the first display screen, or a display screen different from the first display screen. This facilitates the user's control of the projected content.
In one possible design, the first electronic device determines, from preset control interfaces corresponding to application types, the control interface corresponding to the type of the target application, and displays that control interface on the second display screen. This helps simplify the implementation.
In one possible design, the first electronic device identifies virtual buttons with a touch function in the target application, and displays, on the second display screen, the control interface for controlling the target application according to those virtual buttons. This improves the reliability of the displayed control interface.
In one possible design, when the first electronic device does not recognize any virtual button with a touch function in the target application, it determines the control interface corresponding to the type of the target application from the preset control interfaces corresponding to application types, and displays that control interface on the outer screen. This helps simplify the implementation.
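The two designs above form a fallback chain: prefer buttons recognized in the target application, and otherwise fall back to a preset interface keyed by application type. A minimal sketch, with a hypothetical preset table:

```python
# Sketch of control-interface selection with fallback. The preset table
# entries are illustrative, not taken from the patent.

PRESET_BY_TYPE = {
    "video": ["play/pause", "fast-forward", "rewind"],
    "music": ["play/pause", "previous", "next"],
}

def build_control_interface(app_type, recognized_buttons):
    if recognized_buttons:                    # buttons identified in the app
        return recognized_buttons
    return PRESET_BY_TYPE.get(app_type, [])   # fallback: preset by app type

print(build_control_interface("video", ["pause", "seek"]))
# ['pause', 'seek']
print(build_control_interface("video", []))
# ['play/pause', 'fast-forward', 'rewind']
```

The recognized-button path gives a more faithful interface; the preset path guarantees that some interface is always available.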
In one possible design, the determining, by the first electronic device, a target application from at least one currently running application includes:
the first electronic device displays, on the second display screen, the identification of each application that supports the screen projection function among the at least one currently running application; after receiving an operation by which the user selects the identification of an application displayed on the second display screen, the first electronic device, in response to that operation, determines the target application to be the application identified by the selected identification. This facilitates improved interaction between the device and the user.
In one possible design, after the first electronic device turns off the first display screen and before it sends the screen projection content to the second electronic device, the method further includes:
the first electronic device obtains the identifications of at least one electronic device;
the first electronic device determines the identification of a target electronic device from the identifications of the at least one electronic device, where the identification of the target electronic device identifies the second electronic device.
In one possible design, the first electronic device determining an identification of a target electronic device from the identifications of the at least one electronic device includes:
the first electronic device determines, from the identifications of the at least one electronic device, an identification that identifies a private electronic device as the identification of the target electronic device. This helps reduce user operations.
In one possible design, the first electronic device determining an identification of a target electronic device from the identifications of the at least one electronic device includes:
the first electronic device displays the identifications of at least one electronic device on a second display screen; after receiving an operation by which the user selects the identification of an electronic device displayed on the second display screen, the first electronic device, in response to that operation, takes the selected identification as the identification of the target electronic device.
In one possible design, each of the identifications of the at least one electronic device identifies a public electronic device.
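The target-device selection described in the designs above can be sketched as: auto-select a private (personal) device when one is discovered, otherwise let the user choose from the list shown on the second display. All names here are illustrative assumptions.

```python
# Hypothetical target-device selection among discovered devices.

def pick_target(devices, choose=None):
    """devices: list of {"id": ..., "private": bool};
    choose: callback standing in for the user's selection on the
    second display screen."""
    private = [d for d in devices if d["private"]]
    if private:
        return private[0]["id"]     # auto-select; no user operation needed
    return choose(devices) if choose else None

found = [{"id": "office-tv", "private": False},
         {"id": "my-tablet", "private": True}]
print(pick_target(found))                                   # 'my-tablet'

public_only = [{"id": "lobby-screen", "private": False}]
print(pick_target(public_only, choose=lambda ds: ds[0]["id"]))
# 'lobby-screen'
```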
In one possible design, after the first electronic device sends the screen projection content to the second electronic device, the method further includes:
the first electronic device receives a second operation and, in response to it, stops the screen projection. For example, after stopping the projection, the first electronic device presents, on the first display screen, the user interface on which the projected content is located. The second operation may be an unlocking operation. Alternatively, when the first display screen is a foldable display, the second operation may be an operation of unfolding the first display screen from the closed state to the unfolded state; when the first display screen is a retractable display, the second operation may be an operation of extending the first display screen, or the like.
In a second aspect, an embodiment of the present application further provides a method for screen projection control, which is applied to a first electronic device, where the first electronic device includes a first application program, and the method includes:
the first electronic device acquires screen projection content from the first application;
the first electronic device sends the screen projection content to a second electronic device;
after the first electronic device successfully projects the screen, determining the control interface corresponding to the type of the first application from preset control interfaces corresponding to application types;
the first electronic device displays the determined control interface on a display screen, where the control interface includes virtual buttons with a touch function. After receiving an operation on a virtual button of the control interface, the first electronic device responds to the operation to control the presentation of the projected content on the second electronic device.
By presetting control interfaces corresponding to application types, this embodiment helps simplify the way the control interface is displayed.
In one possible design, the control interfaces corresponding to application types may be configured in the first electronic device before shipment; or the first electronic device obtains, from a server, the control interfaces corresponding to the types of the applications installed on it.
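The provisioning design above (a factory-configured table, supplemented from a server for installed application types) might be sketched like this. The table contents and the server stub are assumptions made for illustration.

```python
# Hypothetical provisioning of the type-to-control-interface table:
# start from the table configured before shipment, then fetch missing
# entries from a server for the installed application types.

FACTORY_TABLE = {"video": ["play/pause", "seek"]}    # configured at factory

def load_control_table(installed_types, fetch_from_server):
    table = dict(FACTORY_TABLE)
    for app_type in installed_types:
        if app_type not in table:
            table[app_type] = fetch_from_server(app_type)
    return table

# Stub standing in for the server lookup.
fake_server = lambda t: {"music": ["play/pause", "next"]}.get(t, [])

table = load_control_table(["video", "music"], fake_server)
print(table["video"])   # ['play/pause', 'seek']  (from the factory table)
print(table["music"])   # ['play/pause', 'next']  (fetched from the server)
```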
In a third aspect, an embodiment of the present application further provides another method for screen-projection control, which is applied to a first electronic device, where the first electronic device includes a first application program, and the method includes:
the first electronic device acquires screen projection content from the first application;
the first electronic device sends the screen projection content to a second electronic device;
after the screen projection succeeds, the first electronic device identifies virtual buttons with a touch function in the first application, and displays a control interface on the display screen according to the identified virtual buttons.
For example, the icons of the virtual buttons on the control interface may be the result of the first electronic device re-laying-out, cropping, and/or scaling the virtual buttons identified in the first application.
The control interface includes virtual buttons with the same functions as the virtual buttons the first electronic device identified in the first application.
It should be noted that, after receiving an operation on a virtual button of the control interface, the first electronic device responds to the operation to control the presentation of the projected content on the second electronic device.
With this technical solution, the control interface displayed by the first electronic device is more accurate, which improves the user experience.
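The re-layout and scaling step above can be illustrated with a toy model: each recognized button keeps its function but is given a new grid position and a scaled icon size. The layout parameters are arbitrary assumptions, not values from the patent.

```python
# Hypothetical re-layout of recognized buttons onto the control interface:
# same functions, new grid positions, scaled icons.

def relayout(buttons, scale=0.5, columns=2):
    """buttons: [{"fn": ..., "x": ..., "y": ..., "size": ...}] as
    recognized in the first application."""
    out = []
    for i, b in enumerate(buttons):
        out.append({
            "fn": b["fn"],                   # same function as in the app
            "col": i % columns,              # new grid position
            "row": i // columns,
            "size": int(b["size"] * scale),  # scaled icon
        })
    return out

ui = relayout([{"fn": "play", "x": 10, "y": 500, "size": 64},
               {"fn": "next", "x": 90, "y": 500, "size": 64},
               {"fn": "stop", "x": 170, "y": 500, "size": 64}])
print([b["fn"] for b in ui])        # ['play', 'next', 'stop']
print(ui[2]["row"], ui[2]["col"])   # 1 0
print(ui[0]["size"])                # 32
```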
In one possible design, the first electronic device identifies a virtual button with a touch function from a first application program, including:
the first electronic device identifies the virtual buttons with a touch function in the first application according to the user's historical operation records for the first application, a software development kit (SDK) interface provided by the first application, or predefined position coordinates of the virtual buttons of the first application on the user interface. Alternatively, the first electronic device identifies the virtual buttons with a touch function by performing voice analysis on the first application. This helps simplify the implementation.
In one possible design, when the first electronic device does not recognize any virtual button with a touch function in the first application, it determines the control interface corresponding to the type of the first application from the preset control interfaces corresponding to application types, and displays the determined control interface on a display screen. This helps simplify the implementation and meet users' needs.
In one possible design, the control interface further includes identifications of at least one alternative application and identifications of at least one alternative screen projection device. This helps users switch applications and/or projection devices as needed.
In one possible design, the control interface further includes a virtual button for canceling the projection. This helps users actively stop the projection as needed and improves interaction between the device and the user.
In a fourth aspect, an embodiment of the present application provides a chip, where the chip is coupled to a memory in a device so that, when running, the chip invokes program instructions stored in the memory to implement the above aspects of the embodiments of the present application and any possible design related to those aspects.
In a fifth aspect, a computer storage medium according to an embodiment of the present application stores program instructions that, when executed on an electronic device, cause the device to perform the above aspects of the embodiments of the present application and any possible design related to those aspects.
In a sixth aspect, a computer program product according to an embodiment of the present application, when run on an electronic device, causes the electronic device to perform a method implementing the above aspects of the embodiments of the present application and any possible design related to those aspects.
In addition, for the technical effects of any possible design in the fourth to sixth aspects, reference may be made to the technical effects of the corresponding designs in the method part; details are not repeated here.
Drawings
Fig. 1 is a schematic view of a scenario applied in an embodiment of the present application;
fig. 2 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 3A is a schematic diagram of the physical form of an electronic device according to an embodiment of the present application;
FIG. 3B is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 4A is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 4B is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 4C is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 5A is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 5B is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 5C is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 6A is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
FIG. 6B is a schematic diagram of another physical form of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic flow chart of a screen projection method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another scenario in which embodiments of the present application are applied;
FIG. 10 is a schematic diagram of a user interface according to an embodiment of the present application;
FIG. 11A is a schematic diagram of another user interface of an embodiment of the present application;
FIG. 11B is a schematic diagram of another user interface of an embodiment of the present application;
FIG. 11C is a schematic diagram of another user interface of an embodiment of the present application;
FIG. 11D is a schematic diagram of another user interface according to an embodiment of the present application;
FIG. 12A is a schematic diagram of another user interface of an embodiment of the present application;
FIG. 12B is a schematic diagram of another user interface of an embodiment of the present application;
FIG. 13 is a schematic diagram of another user interface of an embodiment of the present application;
FIG. 14A is a schematic diagram of a control interface according to an embodiment of the present application;
FIG. 14B is a schematic diagram of another control interface according to an embodiment of the present application;
FIG. 14C is a schematic diagram of another control interface according to an embodiment of the present application;
FIG. 14D is a schematic diagram of another control interface according to an embodiment of the present application;
FIG. 14E is a schematic diagram of another control interface according to an embodiment of the present application;
FIG. 14F is a schematic diagram of another control interface according to an embodiment of the present application;
FIG. 14G is a schematic diagram of another control interface according to an embodiment of the present application;
FIG. 15 is a schematic view of another user interface of an embodiment of the present application;
FIG. 16 is a flowchart of another screen projection method according to an embodiment of the present disclosure;
FIG. 17 is a schematic physical diagram of another electronic device according to an embodiment of the present application;
FIG. 18 is a flowchart of another screen projection method according to an embodiment of the present disclosure;
FIG. 19 is a schematic diagram of a physical form of another electronic device according to an embodiment of the present application;
FIG. 20 is a schematic physical diagram of another electronic device according to an embodiment of the present application;
Fig. 21 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
fig. 22 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
It should be understood that in this application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association between associated objects and indicates that three relations may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. "At least one" means one or more, and "a plurality" means two or more.
In this application, "exemplary," "in some embodiments," "in other embodiments," and the like are used to indicate an example, instance, or illustration. Any embodiment or design described as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs; rather, the word is intended to present a concept in a concrete fashion.
In addition, the terms "first," "second," and the like in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance, implicitly indicating the number of technical features indicated, or indicating or implying order.
It should be understood that embodiments of the present application involve at least two electronic devices: a source device and a target device. The source device is the electronic device that initiates screen projection and sends the projected content; the content may be video, audio, images, documents, games, and so on, without limitation. The target device, which may also be called a client device or a peer device, is the electronic device that receives the projected content; after receiving it, the target device may present or display the content in a corresponding layout. It should be noted that, in the embodiments of the present application, the layout of the content on the target device may be the same as or different from its layout on the source device.
In some embodiments, the source device may be a portable electronic device, such as a mobile phone, a tablet, a laptop, or a wearable device (e.g., a smart watch). Exemplary embodiments of such portable electronic devices include, but are not limited to, devices running various operating systems. The physical form of the portable electronic device is not limited in the embodiments of the present application; for example, it may be a foldable device, a tablet-type device, or a flip (clamshell) device. The portable electronic device may further be provided with a smart protective case. In other embodiments of the present application, the source device may also be an all-in-one machine, a desktop computer, or the like.
In some embodiments, the target device may be a tablet, an all-in-one machine, a desktop computer, a television, a display, a projector, a stereo, or another electronic device that can receive and present or display the projected content.
By way of example, fig. 1 shows an application scenario according to an embodiment of the present application. As shown in fig. 1, the electronic device 10 is a source device and the electronic device 20 is a target device. The electronic device 10 may send the projected content to the electronic device 20 so that the content can be presented or displayed by the electronic device 20 for better viewing. The electronic device 10 and the electronic device 20 may establish a connection by wire (e.g., via a cable) and/or wirelessly (e.g., wireless fidelity (Wi-Fi), Bluetooth, etc.). It should be noted that fig. 1 is only an example of an application scenario; the embodiments of the present application do not limit the number of target devices that receive the content sent by the source device. Taking the electronic device 10 in fig. 1 as the source device, the electronic device 10 may send the projected content to two or more electronic devices including the electronic device 20, or only to the electronic device 20.
However, because the current manner of initiating screen projection from the source device is complex and the user experience is poor, the embodiments of the present application provide a screen projection method by which the source device can initiate screen projection in response to a first operation. With a single operation on the source device, the user can make the source device carry out screen projection, which reduces the complexity of triggering screen projection on the source device and improves the efficiency of initiating it. In some embodiments, the first operation may be used to control the electronic device to turn off the screen, in which case the first operation may be referred to as a screen-off operation. It should be noted that when the electronic device in the embodiments of the present application is in a screen-off state, it may show a black screen while unlocked, show a black screen while locked, display a default user interface while unlocked, display a default user interface while locked, or show a black screen on part of the display and a default user interface on the rest. The default user interface may include date and time information, and/or commonly used application icons, etc. For example, the content of the default user interface may be set according to the user's needs, or may be set by the electronic device before leaving the factory. For example, if the electronic device is in a screen-locked state when the screen is turned off, the first operation may also be referred to as a screen-locking operation.
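The single-operation trigger described above can be sketched as a small state machine: one "first operation" (e.g., a screen-off operation) both switches the device to the screen-off state and initiates projection, so the user never issues a separate cast command. This is an illustrative sketch only; the class and method names are hypothetical and not prescribed by the patent.

```java
// Hypothetical sketch: a single "first operation" turns the screen off
// and also initiates screen projection in the same step.
public class FirstOperationHandler {
    public enum ScreenState { ON, OFF }

    private ScreenState screenState = ScreenState.ON;
    private boolean projectionInitiated = false;

    /** Handles the "first operation": turn the screen off, then initiate projection. */
    public void onFirstOperation() {
        screenState = ScreenState.OFF;   // screen-off (and possibly lock)
        projectionInitiated = true;      // the same operation also starts the cast
    }

    public ScreenState screenState() { return screenState; }
    public boolean projectionInitiated() { return projectionInitiated; }
}
```

The point of the sketch is that no second user action is needed between turning the screen off and starting the cast.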
When the screen of the source device is turned off, screen projection can be initiated actively and the screen-cast content sent to the target device for presentation or display, so that the user can continue viewing the corresponding content on the target device, improving the user experience. Furthermore, in other embodiments of the present application, the first operation may be another operation; in particular, the implementation of the first operation may be related to the physical form of the electronic device.
Embodiments of source devices and target devices, and methods for using such devices, are described below.
The source device is described by taking the electronic device 10 in the application scenario shown in fig. 1 as an example. By way of example, fig. 2 shows a schematic hardware structure of the electronic device 10 according to an embodiment of the present application. As shown in fig. 2, the electronic device 10 includes a processor 110, an internal memory 121, an external memory interface 122, a camera 131, a first display screen 141, a sensor module 150, an audio module 160, a speaker 161, a receiver 162, a microphone 163, an earphone interface 164, keys 170, a subscriber identity module (SIM) card interface 171, a universal serial bus (USB) interface 172, a charge management module 180, a power management module 181, a battery 182, a mobile communication module 191, and a wireless communication module 192. In other embodiments, the electronic device 10 further includes a second display screen 142. The first display screen 141 and the second display screen 142 may be located on different sides of the electronic device 10; for example, the first display screen 141 may be located on a first side of the electronic device 10 (e.g., the front side) and the second display screen 142 on a second side (e.g., the back side). In addition, the electronic device 10 in the embodiments of the present application may further include a motor, an indicator, a mechanical rotating shaft, and the like.
It should be understood that the hardware configuration shown in fig. 2 is only one example and does not constitute a specific limitation on the electronic device 10. The source device of the embodiments of the present application may have more or fewer components than the electronic device 10 shown in the figure, may combine two or more components, may split certain components, or may have a different arrangement of components. The components shown may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments, a cache may also be provided in the processor 110 for storing part of the programs and/or data. By way of example, the cache in the processor 110 may be a cache memory, used to hold programs and/or data that the processor 110 has just used, generated, or used cyclically. If the processor 110 needs the program and/or data again, it can be called directly from the cache. This reduces the time for the processor 110 to acquire programs and/or data, thereby helping to improve system efficiency.
The internal memory 121 may be used to store programs and/or data. It should be noted that in the embodiments of the present application a program may also be referred to as program instructions. In some embodiments, the internal memory 121 includes a program storage area and a data storage area. The program storage area may be used to store an operating system (such as an Android or iOS operating system), a computer program required by at least one function (such as screen locking or screen projection), and the like. The data storage area may be used to store data created and/or acquired during use of the electronic device (e.g., the identification of the target device, images), etc. By way of example, the processor 110 may implement one or more functions by invoking programs and/or data stored in the internal memory 121, so that the electronic device 10 performs a corresponding method. For example, the processor 110 invokes certain programs and/or data in the internal memory 121 so that the electronic device 10 executes the screen projection method provided in the embodiments of the present application, thereby improving the efficiency of initiating screen projection from the source device and improving the user experience. The internal memory 121 may employ a high-speed random access memory, a non-volatile memory, or the like. For example, the non-volatile memory may include one or more disk storage devices, flash memory devices, and/or universal flash storage (UFS), etc.
The external memory interface 122 may be used to connect an external memory card (e.g., a Micro SD card) to enable expansion of the memory capabilities of the electronic device 10. The external memory card communicates with the processor 110 via an external memory interface 122 to implement data storage functions. For example, the electronic device 10 may store files of images, music, video, etc. in an external memory card through the external memory interface 122.
The camera 131 may be used to capture moving or still images. Typically, the camera 131 includes a lens and an image sensor. The optical image of an object generated through the lens is projected onto the image sensor and converted into an electrical signal for subsequent processing. The image sensor may be, for example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The image sensor converts the optical signal into an electrical signal and then transfers it to the ISP to be converted into a digital image signal. It should be noted that in the embodiments of the present application the camera 131 may include one or more cameras.
The first display screen 141 may include a display panel for displaying a graphical user interface (GUI). For convenience of description, a graphical user interface is simply referred to as a user interface. The electronic device 10 presents or displays corresponding content to the user, such as video, text, images, and virtual buttons or keys enabling interaction with the electronic device 10, by displaying a user interface on the first display screen 141. In some embodiments, the display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. By way of example, the electronic device 10 may implement the display function through the GPU, the first display screen 141, the application processor, and the like. It should be understood that the first display screen 141 in the embodiments of the present application may be a foldable screen or a non-foldable screen, which is not limited. It should be noted that for the specific implementation of the second display screen 142, reference may be made to that of the first display screen 141, which is not repeated here.
For example, the electronic device 10 is a screen-foldable electronic device including a first display screen 141 and a second display screen 142, where the first display screen 141 is a foldable screen and the second display screen 142 is a non-foldable screen. The first display screen 141 is located on a first side of the electronic device 10 and the second display screen 142 on a second side, the first side being different from the second side. For example, the first side of the electronic device 10 may be referred to as the front side of the electronic device 10; as shown in fig. 3A, the first display screen 141 is located on the first side, and in fig. 3A the included angle of the first display screen 141 is β. The second side of the electronic device 10 may be referred to as the back side of the electronic device 10; as shown in fig. 3B, the second display screen 142 may be located on the second side, and in fig. 3B the included angle of the first display screen 141 is α. It should be noted that the included angle of the first display screen 141 may take values in the range [0°, 180°]. When the included angle of the first display screen 141 is 0°, the first display screen 141 is in a folded or closed state; when the included angle is 180°, the first display screen 141 is in an unfolded state. The first display screen 141 may be referred to as the inner screen or main screen, and the second display screen 142 as the outer screen or auxiliary screen.
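The angle-to-state mapping described above is straightforward to express in code. The following helper classifies the inner screen's state from its hinge angle, which the text bounds to [0°, 180°]; the class and enum names are illustrative, not taken from the patent.

```java
// Illustrative helper: classify the foldable inner screen's state
// from its hinge ("included") angle in the range [0°, 180°].
public class FoldState {
    public enum State { FOLDED, PARTIALLY_FOLDED, UNFOLDED }

    /** 0° = folded/closed, 180° = unfolded, anything between is partially folded. */
    public static State fromAngle(double degrees) {
        if (degrees < 0 || degrees > 180) {
            throw new IllegalArgumentException("hinge angle must be in [0, 180]");
        }
        if (degrees == 0) return State.FOLDED;
        if (degrees == 180) return State.UNFOLDED;
        return State.PARTIALLY_FOLDED;
    }
}
```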
As yet another example, the electronic device 10 is a screen non-foldable electronic device that includes a first display screen 141. By way of example, the electronic device 10 includes a first side and a second side, with the first display 141 being located on the first side of the electronic device. For example, a first side of the electronic device 10 is shown in fig. 4A, and a second side of the electronic device 10 is shown in fig. 4C. When the first side of the electronic device 10 is shown in fig. 4A and the second side of the electronic device 10 is shown in fig. 4C, the electronic device 10 includes only the first display 141. In other embodiments, the electronic device 10 may also include a second display screen 142. Wherein the first display 141 is located on a first side of the electronic device 10 and the second display 142 is located on a second side of the electronic device 10. For example, a first side of the electronic device 10 is shown in FIG. 4A. The second side of the electronic device 10 may be as shown in fig. 4B. It should be appreciated that when the first side of the electronic device 10 is shown in fig. 4A and the second side of the electronic device 10 is shown in fig. 4B, the first display screen 141 may be referred to as a primary screen and the second display screen 142 may be referred to as a secondary screen.
As yet another example, the electronic device 10 is a screen-foldable electronic device that includes a first display screen 141, where the first display screen 141 is a foldable screen. The first display screen 141 in an unfolded state may be as shown in fig. 5A, and in a closed or folded state as shown in fig. 5B. Fig. 5C is a schematic diagram of the first display screen 141 when folded to an included angle of β.
As yet another example, electronic device 10 is a screen-retractable electronic device that includes a first display screen 141. The state of the first display screen 141 after being extended may be as shown in fig. 6A, and the state of the first display screen 141 after being contracted may be as shown in fig. 6B.
The sensor module 150 may include one or more sensors. For example, a touch sensor 150A, a pressure sensor 150B, a distance sensor 150C, and the like. In other embodiments, the sensor module 150 may also include gyroscopes, acceleration sensors, fingerprint sensors, ambient light sensors, proximity light sensors, bone conduction sensors, temperature sensors, and the like.
The touch sensor 150A may also be referred to as a "touch panel". The touch sensor 150A may be provided on the first display screen 141 and/or the second display screen 142. Taking the example in which the touch sensor 150A is disposed on the first display screen 141, the touch sensor 150A and the first display screen 141 form a touch screen, also called the "first touch screen". The touch sensor 150A is used to detect a touch operation acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. The electronic device 10 may provide visual output related to the touch operation through the first display screen 141. In other embodiments, the touch sensor 150A may also be disposed on the surface of the electronic device 10 at a location different from that of the first display screen 141.
The pressure sensor 150B is used to sense a pressure signal and may convert the pressure signal into an electrical signal. For example, the pressure sensor 150B may be provided on the first display screen 141 and/or the second display screen 142. Touch operations acting on the same touch position but with different intensities may correspond to different operation instructions.
The distance sensor 150C is used to measure distance. For example, in a shooting scenario, the electronic device 10 may use the distance sensor 150C to measure distance so as to achieve quick focusing. As another example, after the first display screen 141 is locked, the distance sensor 150C may also be used to determine whether the first display screen 141 and/or the second display screen 142 is blocked. For example, when the first side of the electronic device 10 is as shown in fig. 3A and the second side is as shown in fig. 3B, if the first display screen 141 is in a closed or folded state, the distance sensor 150C may be used to determine whether the second display screen 142 is blocked. For another example, when the first side of the electronic device 10 is as shown in fig. 4A and the second side is as shown in fig. 4B, the distance sensor 150C is used to determine whether the first display screen 141 and the second display screen 142 are blocked. For another example, when the first side of the electronic device 10 is as shown in fig. 4A and the second side is as shown in fig. 4C, the distance sensor 150C is used to determine whether the first display screen 141 is blocked.
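One plausible way to turn a distance-sensor reading into the "blocked / not blocked" decision above is a simple threshold test: a very small measured distance suggests something (a cover, a table surface) is covering the screen. The threshold value and all names here are assumptions for illustration; the patent only says the distance sensor may be used to determine whether a screen is blocked.

```java
// Hedged sketch: decide whether a screen is occluded from a
// distance-sensor reading, using an assumed threshold.
public class OcclusionDetector {
    private final double thresholdMm;

    public OcclusionDetector(double thresholdMm) {
        this.thresholdMm = thresholdMm;
    }

    /** A measured distance below the threshold suggests the screen is covered. */
    public boolean isOccluded(double measuredDistanceMm) {
        return measuredDistanceMm < thresholdMm;
    }
}
```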
The electronic device 10 may implement audio functionality through an audio module 160, speaker 161, receiver 162, microphone 163, headphone interface 164, application processor, and the like. Such as an audio play function, a recording function, a voice wake-up function, etc.
The audio module 160 may be used to digital-to-analog convert, and/or analog-to-digital convert, audio data, and may also be used to encode and/or decode audio data. For example, the audio module 160 may be provided in the processor 110, or some functional modules of the audio module 160 may be provided in the processor 110.
The speaker 161, also called a "loudspeaker", is used to convert audio data into sound and play the sound. For example, the electronic device 10 may play music, take hands-free calls, or give voice prompts through the speaker 161.
The receiver 162, also known as an "earpiece", is used to convert audio data into sound and play the sound. For example, when the electronic device 10 is answering a telephone call, the call may be answered by bringing the receiver 162 close to the ear.
The microphone 163, also referred to as a "mic", is used to collect sound (e.g., ambient sound, including sound made by people and sound made by devices) and convert the sound into audio electrical data. When making a call or sending a voice message, the user may speak close to the microphone 163, which collects the sound made by the user. It should be noted that the electronic device may be provided with at least one microphone 163. For example, with two microphones 163 provided in the electronic device, a noise reduction function may be implemented in addition to sound collection. With three, four, or more microphones 163, sound-source recognition or a directional recording function may further be implemented on the basis of sound collection, noise reduction, and the like.
The earphone interface 164 is used to connect a wired earphone. The earphone interface 164 may be the USB interface 172, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
The keys 170 may include a power key, a volume key, etc. The keys 170 may be mechanical keys or virtual keys. The electronic device 10 may generate signal inputs related to user settings and function controls of the electronic device 10 in response to operation of the keys. For example, the electronic device 10 may lock the first display screen 141 in response to a pressing operation to the power key, and trigger execution of the screen projection method of the embodiment of the present application. It should be noted that, in the embodiments of the present application, the power key may also be referred to as a start key, a side key, etc., and the name of the power key is not limited.
The SIM card interface 171 is used to connect a SIM card. A SIM card may be inserted into or removed from the SIM card interface 171 to achieve contact with and separation from the electronic device 10. The electronic device 10 may support 1 or K SIM card interfaces 171, K being a positive integer greater than 1. The SIM card interface 171 may support Nano SIM cards, Micro SIM cards, and/or SIM cards, etc. Multiple SIM cards may be inserted into the same SIM card interface 171 at the same time; the types of the SIM cards may be the same or different. The SIM card interface 171 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 10 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 10 may also employ an eSIM card, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 10 and cannot be separated from it.
The USB interface 172 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 172 may be used to connect a charger to charge the electronic device 10, or to connect an earphone through which sound is played. When the USB interface 172 is connected to an earphone, it can be understood as serving as an earphone interface. By way of example, besides serving as an earphone interface, the USB interface 172 may be used to connect other electronic devices, such as AR devices and computers.
The charge management module 180 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 180 may receive the charging input of a wired charger through the USB interface 172. In some wireless charging embodiments, the charge management module 180 may receive a wireless charging input through the wireless charging coil of the electronic device 10. The battery 182 may be charged by the charge management module 180, and the electronic device 10 may be powered by the power management module 181.
The power management module 181 is used to connect the battery 182, the charge management module 180, and the processor 110. The power management module 181 receives input from the battery 182 and/or the charge management module 180 to power the processor 110, the internal memory 121, the camera 131, the first display screen 141, etc. The power management module 181 may also be used to monitor battery capacity, battery cycle count, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 181 may also be provided in the processor 110. In still other embodiments, the power management module 181 and the charge management module 180 may be provided in the same device.
The mobile communication module 191 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on the electronic device 10. By way of example, the mobile communication module 191 may include filters, switches, power amplifiers, low noise amplifiers (low noise amplifier, LNA), and the like.
The wireless communication module 192 may provide solutions for wireless communication including WLAN (e.g., wi-Fi network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 10. The wireless communication module 192 may be one or more devices that integrate at least one communication processing module. For example, the electronic device 10 may send the on-screen content and/or the on-screen instructions to the target device via the wireless communication module 192.
In some embodiments, the antenna 1 of the electronic device 10 is coupled to the mobile communication module 191 and the antenna 2 is coupled to the wireless communication module 192, so that the electronic device 10 can communicate with other devices. Specifically, the mobile communication module 191 may communicate with other devices through the antenna 1, and the wireless communication module 192 may communicate with other devices through the antenna 2.
It should be noted that for the hardware architecture of the target device in the embodiments of the present application, reference may be made to the above description of the hardware architecture of the electronic device 10 in fig. 2, which is not repeated here.
Fig. 7 illustrates a schematic software architecture of a source device and a target device according to an embodiment of the present application. As shown in fig. 7, the source device includes an input module 710A, a processing module 720A, and an output module 730A.
The input module 710A is configured to detect a user operation and report the user operation to the processing module 720A. In the embodiments of the present application, the user operation may be a touch operation or a non-touch operation, for example, an operation of displaying a certain user interface on the first display screen 141 or the second display screen 142, an operation of folding the first display screen 141, an operation of pressing the power key, an operation of closing an outer cover of the device, an operation of closing the cover of the smart protective case, and the like. Specifically, the input module 710A may detect the user operation through a mechanical rotating shaft, a touch sensor, a key, etc., which is not limited.
The processing module 720A is configured to receive the user operation reported by the input module 710A and identify its operation type. For example, when the operation type of the user operation is a screen-off operation, the first display screen 141 is turned off, and after the first display screen 141 is turned off, screen projection is triggered. By way of example, the processing module 720A includes an operation recognition module 721A, a screen projection judging module 722A, a content acquisition module 723A, a device acquisition module 724A, and the like. The operation recognition module 721A is configured to recognize the operation type of the user operation reported by the input module 710A. The screen projection judging module 722A is configured to judge whether the intelligent screen projection function is enabled, or whether the first display screen 141 or the second display screen 142 is blocked, etc. The content acquisition module 723A is used to acquire the screen-cast content. The device acquisition module 724A is configured to determine the target device for receiving the screen-cast content.
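Piecing together the conditions the processing module checks, its decision step can be sketched as follows. This is an assumed reading of the text, not the patent's specified logic: projection is triggered only for a screen-off operation, only if the intelligent screen projection function is enabled, and (per the occlusion checks elsewhere in the text) possibly only if the relevant display is not blocked.

```java
// Assumed decision flow for processing module 720A; all names hypothetical.
public class ProcessingModule {
    public enum OperationType { SCREEN_OFF, FOLD, OTHER }

    private final boolean smartCastEnabled;
    private final boolean screenOccluded;

    public ProcessingModule(boolean smartCastEnabled, boolean screenOccluded) {
        this.smartCastEnabled = smartCastEnabled;
        this.screenOccluded = screenOccluded;
    }

    /** Returns true when the reported operation should trigger screen projection. */
    public boolean shouldProject(OperationType op) {
        if (op != OperationType.SCREEN_OFF) return false; // only the screen-off op triggers casting here
        if (!smartCastEnabled) return false;              // the cast feature must be switched on
        return !screenOccluded;                           // e.g. device covered: skip casting
    }
}
```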
The output module 730A is configured to establish a connection with the target device and send the screen-cast content to the target device. For example, the output module 730A is further configured to control the second display screen 142 or the first display screen 141 to display related information, such as a control interface.
As illustrated in fig. 7, the target device includes an input module 710B, a processing module 720B, and an output module 730B.
The input module 710B is configured to establish a connection with the source device and receive the screen-cast content or screen projection instructions sent by the source device. By way of example, the input module 710B includes a device connection module 711B, a content interaction module 712B, and an instruction interaction module 713B. The device connection module 711B is configured to establish a connection with the source device. The content interaction module 712B is configured to receive the screen-cast content sent by the source device and send it to the processing module 720B. The instruction interaction module 713B is configured to receive screen projection instructions sent by the source device, for example, an instruction to cancel screen projection.
The processing module 720B is configured to, after receiving the screen-cast content sent by the content interaction module 712B, rearrange or clip the screen-cast content and send the rearranged or clipped content to the output module 730B.
The output module 730B is configured to present or display the rearranged or clipped screen-cast content after receiving it from the processing module 720B.
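The "rearrange or clip" step is not specified further in the text; one plausible interpretation is scaling the source frame to fit the target resolution while preserving its aspect ratio (letterboxing). The following helper sketches that computation under this assumption; the names are illustrative.

```java
// Illustrative sketch of the target device's rearrange step: scale the
// source frame to fit inside the target resolution, preserving aspect ratio.
public class ContentFitter {
    /** Returns {width, height} of the source frame scaled to fit inside the target. */
    public static int[] fit(int srcW, int srcH, int dstW, int dstH) {
        double scale = Math.min((double) dstW / srcW, (double) dstH / srcH);
        return new int[] {
            (int) Math.round(srcW * scale),
            (int) Math.round(srcH * scale)
        };
    }
}
```

For example, fitting a 1080×2340 phone frame onto a 1920×1080 television yields a 498×1080 area, with the remaining width left blank.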
The following embodiments may be implemented in an electronic device having the above hardware structure and/or software structure. The screen projection method of the embodiments of the present application is specifically described below in combination with source devices of different physical forms.
Example one:
the source device is an electronic device with a foldable screen. The source device comprises an inner screen and an outer screen, wherein the inner screen is a foldable screen, and the outer screen is a non-foldable screen. For example, the inner screen of the source device may be the first display screen 141 shown in fig. 3A or fig. 3B, and the outer screen of the source device may be the second display screen 142 shown in fig. 3B.
An example, as shown in fig. 8, is a schematic flow chart of a screen projection method according to an embodiment of the present application, and specifically includes the following steps.
In step 801, an operation of folding an inner screen of a source device from an unfolded state to a closed state is received while the inner screen is in use.
Wherein, the internal screen of the source device is in use, which can be understood as:
the source device displays a corresponding user interface, such as a main interface, a negative screen (-1 screen), a user interface of an application program, and the like, on the internal screen when the internal screen is in an expanded state and the internal screen is not locked. When the user displays the corresponding user interface on the inner screen, the user can perform corresponding operation according to the need, so that the source device responds to the operation of the user and performs corresponding display on the inner screen. That is, when the internal screen of the source device is in the unfolded state and the screen is not locked, a user can perform corresponding operation on the source device and control the internal screen to perform corresponding display when a certain user interface is displayed, so that the requirements of the user are met.
For example, when the inner screen is in the unfolded state and not locked, and the inner screen displays a desktop (the desktop may also be referred to as a main interface and includes icons of one or more applications), if the source device detects an operation of the user clicking an application icon on the desktop (for example, the iQIYI icon), it displays the user interface of iQIYI on the inner screen in response to that operation. For another example, when the inner screen is in the unfolded state and not locked, and a user interface of a certain application is displayed on the inner screen, the user may operate the virtual buttons (or virtual keys) included on that user interface to implement corresponding control. For example, when the user interface of iQIYI is displayed on the inner screen and the source device detects an operation of the user on that interface (for example, clicking the virtual button that controls full-screen display of a video), the source device responds by displaying the corresponding video full-screen on the inner screen and playing it, so that the user can watch the video. For another example, when the user interface of Baidu Maps is displayed on the inner screen and the source device detects the user searching for a route on that interface, the source device responds to the route-search operation and displays the corresponding route search result on the inner screen, making it convenient for the user to reach the corresponding destination.
Typically, the user interface displayed by the source device on the inner screen is the user interface of an application program that the source device is running in the foreground. The source device may run one or more application programs in the foreground. In some embodiments, the source device may run one or more application programs in the background while running one or more application programs in the foreground. For example, if the application program running in the foreground of the source device is iQIYI, the user interface of iQIYI is displayed on the inner screen. While the user interface of iQIYI is displayed on the inner screen, the source device may also run other application programs in the background, such as Alipay and WeChat.
For example, while the inner screen of the source device is in use, the outer screen is locked and dark (black), or the outer screen is locked but displays a default user interface. The default user interface may include information such as time and date; the user may set the information displayed on the default user interface as needed.
Take the source device being the electronic device 10 in the application scenario shown in fig. 1 as an example. For example, the first display screen 141 is the inner screen of the electronic device 10 and the second display screen 142 is the outer screen of the electronic device 10. The processor 110 of the source device may determine whether an operation of folding the inner screen from the unfolded state into the closed state is received by detecting the rotation angle change of the mechanical rotation axis of the first display screen 141. For example, when the mechanical rotation axis of the first display screen 141 rotates so that the angle of the first display screen 141 changes from 180° to 0°, an event indicating that the angle of the first display screen 141 changed from 180° to 0° is reported to the processor 110; upon receiving this event, the processor 110 determines that an operation of folding the inner screen from the unfolded state into the closed state is received. For another example, the processor 110 of the source device may also determine whether such an operation is received by collecting data from other sensors that sense the angle change of the first display screen 141. It should be noted that the embodiment of the present application does not limit the specific manner in which the source device determines whether an operation of folding the inner screen from the unfolded state into the closed state is received.
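The fold-detection logic described above can be sketched as follows. This is a hedged illustrative sketch only, not the patent's implementation: the `Processor` class, the event callback name, and the angle values are assumptions introduced for the example.

```python
# Hypothetical sketch of the fold detection: the mechanical rotation axis
# (hinge) of the first display screen reports angle-change events, and the
# processor treats a change from 180 degrees (unfolded) to 0 degrees (closed)
# as the operation of folding the inner screen into the closed state.
# All names and values here are illustrative assumptions.

UNFOLDED_ANGLE = 180
CLOSED_ANGLE = 0

def is_fold_to_closed(old_angle: int, new_angle: int) -> bool:
    """True when the reported angle change means the inner screen was
    folded from the unfolded state into the closed state."""
    return old_angle == UNFOLDED_ANGLE and new_angle == CLOSED_ANGLE

class Processor:
    """Stand-in for processor 110: turns off the inner screen (step 802)
    when a fold-to-closed event is received."""

    def __init__(self) -> None:
        self.inner_screen_on = True

    def on_hinge_event(self, old_angle: int, new_angle: int) -> None:
        if is_fold_to_closed(old_angle, new_angle):
            self.inner_screen_on = False  # turn off (extinguish) the inner screen
```

A partial fold (for example, 180° to 90°) would not trigger the screen-off path in this sketch, matching the patent's description that only the transition into the closed state counts.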
In step 802, the source device turns off the inner screen in response to receiving an operation of folding the inner screen from the unfolded state into the closed state.
It should be noted that, when the inner screen is folded into the closed state, the inner screen can no longer normally display a user interface to the user. Therefore, to save power consumption of the electronic device, the inner screen is turned off. Turning off the inner screen may be understood as making it a black screen: the inner screen may be black and locked, or black but unlocked. In addition, when the source device folds the inner screen into the closed state, the user cannot normally use the inner screen, that is, the user cannot operate on the inner screen to control the source device. Taking the source device as the electronic device 10 in the application scenario shown in fig. 1 as an example, with the first display screen 141 being the inner screen of the electronic device 10 and the second display screen 142 being the outer screen, when the processor 110 of the electronic device 10 determines that an operation of folding the first display screen 141 from the unfolded state into the closed state is received, the processor 110 determines from this operation that the operation type is a screen-off operation, and then turns off the first display screen 141.
The source device may also automatically unlock the outer screen in response to receiving an operation of folding the inner screen from the unfolded state into the closed state, and automatically map the user interface displayed on the inner screen to the outer screen for display. For example, when the user interface of iQIYI is displayed on the inner screen, the outer screen is unlocked in response to an operation of folding the inner screen from the unfolded state into the closed state, and the user interface of iQIYI displayed on the inner screen is automatically mapped to the outer screen for display. Alternatively, in response to the operation of folding the inner screen from the unfolded state into the closed state, the source device may keep the outer screen black and/or locked, or keep the outer screen locked while displaying a default user interface, and the like, which is not limited.
In step 803, the source device determines a target application program from the at least one currently running application program after turning off the inner screen in response to the operation of folding the inner screen from the unfolded state into the closed state.
It should be noted that the at least one currently running application program may include application programs currently running in the foreground and/or in the background. For example, the target application program may be an application program, among the at least one currently running application program, that satisfies a first preset condition. The first preset condition may be set according to the actual situation, which is not limited.
For example, the target application program is an application program, among the at least one currently running application program, whose identifier is in a whitelist. The whitelist includes identifiers of application programs supporting the screen projection function. The whitelist may be set by the user according to the user's own needs, may be set before the source device leaves the factory, or may be generated by the source device through learning according to a preset policy. For example, the preset policy specifies that audio/video, map, reading, and instant messaging (IM) applications support the screen projection function, and the source device may generate the whitelist from the identifiers of its installed application programs that conform to the types specified by the preset policy. For example, if iQIYI, WeChat, and Alipay are installed on the source device, the whitelist generated according to the preset policy includes the identifier of iQIYI and the identifier of WeChat. It should be understood that when the source device detects that a new application program is installed, it may determine whether the application program conforms to a type specified by the preset policy, and if so, add the identifier of the application program to the whitelist. Alternatively, in other embodiments, the whitelist includes identifiers of application programs that do not support the screen projection function. In this case, the target application program may be an application program, among the at least one currently running application program, whose identifier does not belong to the whitelist.
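Generating the whitelist from the preset policy could be sketched as below. This is a minimal illustration under assumptions: the category strings and the mapping of installed apps to categories are invented for the example; the patent only names the four application types.

```python
# Illustrative sketch of whitelist generation per the preset policy.
# The policy specifies four application types that support screen projection.
PROJECTION_TYPES = {"audio_video", "map", "reading", "instant_messaging"}

def build_whitelist(installed: dict) -> set:
    """installed maps an app identifier (e.g. package name) to its type;
    apps whose type is specified by the preset policy enter the whitelist."""
    return {app for app, app_type in installed.items()
            if app_type in PROJECTION_TYPES}

# Mirrors the example in the text: iQIYI and WeChat qualify, Alipay does not.
whitelist = build_whitelist({
    "com.iqiyi": "audio_video",
    "com.wechat": "instant_messaging",
    "com.alipay": "payment",  # payment is not a projection type under this policy
})
```

When a new application is installed, the same check (`app_type in PROJECTION_TYPES`) decides whether its identifier is added to the whitelist.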
In this embodiment of the present application, the identifier of an application program may be the package name of the application program, or the icon of the application program, or may be customized as needed, which is not limited.
For another example, the target application program is an application program, among the currently running application programs, whose service is in progress. Specifically, for video applications such as iQIYI, Youku, and Tencent Video, a service being in progress can be understood as a video being played; for music-playing applications such as Xiami Music and NetEase Cloud Music, a service being in progress can be understood as music being played; for map applications such as Baidu Maps and Amap, a service being in progress can be understood as navigation or a search being in progress; for game applications such as Honor of Kings and Tetris, a service being in progress can be understood as a game being played; for instant messaging applications such as WeChat and QQ, a service being in progress can be understood as text being input, a voice or video call being in progress, or a file being transferred. The application program whose service is in progress may be an application program running in the foreground or an application program running in the background.
For another example, the target application program is a currently running application program whose identifier is in the whitelist and whose service is in progress. The whitelist includes the identifiers of application programs supporting the screen projection function.
For another example, the target application is an application currently running in the foreground.
In some embodiments, when none of the at least one currently running application program satisfies the first preset condition, the source device does not project the screen. Alternatively, when the source device is not currently running any application program, the source device does not project the screen.
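One possible form of the first preset condition, combining two of the alternatives above (identifier in the whitelist and service in progress), could be sketched as follows. The data shapes are assumptions for illustration; the patent leaves the condition open.

```python
# Sketch of step 803 under one possible first preset condition: the target
# application is the first running app that is both whitelisted and has a
# service in progress. Returning None models "the source device does not
# project the screen" when no app qualifies or no app is running.

def pick_target_app(running: list, whitelist: set, in_progress: set):
    """running: app identifiers, e.g. foreground apps first, then background.
    Returns the first qualifying app identifier, or None (no projection)."""
    for app in running:
        if app in whitelist and app in in_progress:
            return app
    return None
```

Under the simpler alternatives in the text, the condition inside the loop would be just `app in whitelist`, just `app in in_progress`, or "is a foreground app".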
In step 804, the source device acquires the screen projection content from the target application program and sends the screen projection content to the target device.
For example, the source device may send the screen projection content to the target device based on a technology such as Miracast, AirPlay, DLNA, or HiCast.
The source device and the target device may establish a connection before the operation of folding the inner screen from the unfolded state into the closed state is received, or after it is received. For example, after the source device turns off the inner screen, the source device determines the target device for receiving the screen projection content, and then initiates a connection establishment procedure to the target device. After the source device establishes a connection with the target device, when the screen projection content is acquired from the target application program, the screen projection content is sent to the target device.
For example, the source device may determine the target device that receives the screen projection content in the following manner:
after the source device turns off the inner screen, it may acquire the identifiers of surrounding electronic devices supporting the screen projection function based on communication technologies such as Bluetooth and/or Wi-Fi, and determine a target device identifier from the acquired identifiers. The electronic device identified by the target device identifier serves as the target device for receiving the screen projection content. In some embodiments, to simplify the manner of determining the target device identifier, the target device identifier may be an identifier, among the identifiers of surrounding electronic devices supporting the screen projection function, that satisfies a second preset condition. The second preset condition may be set according to actual needs, which is not limited.
For example, the target device identifier is an identifier, among the identifiers of surrounding electronic devices supporting the screen projection function, that is in a trusted list. The trusted list includes the identifier of at least one electronic device, which may be added by the user as needed, or may be the identifier of an electronic device that has previously connected to the source device. Further, the identifiers of electronic devices included in the trusted list may be identifiers of private electronic devices added by the user, or identifiers of private electronic devices that have previously connected to the source device. In the embodiment of the present application, a private electronic device refers to an electronic device that is not in a public place, for example, a television at home or a desktop computer in a dormitory; an electronic device in a public place may be, for example, a display in a conference room.
In some embodiments, when none of the acquired identifiers of surrounding electronic devices supporting the screen projection function is in the trusted list, the source device may acquire its own geographical location information and determine, according to that information, whether its current location is a public place. When its location is not a public place, the source device selects one identifier from the acquired identifiers of surrounding electronic devices supporting the screen projection function as the target device identifier.
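The device-selection cascade described above (trusted list first, then a location-dependent fallback) could be sketched as follows. This is an illustrative sketch under assumptions: the function name, the list shapes, and the choice of "first discovered device" as the fallback are all invented for the example.

```python
# Sketch of target-device selection: prefer a discovered identifier that is
# in the trusted list; otherwise, outside public places, pick any discovered
# identifier; in a public place, fall back to asking the user (modeled here
# by returning None, after which the identifiers are shown on the outer
# screen as in figs. 9-10).

def pick_target_device(discovered: list, trusted: set, in_public_place: bool):
    """discovered: identifiers of nearby devices supporting projection."""
    for dev in discovered:
        if dev in trusted:
            return dev
    if not in_public_place and discovered:
        return discovered[0]  # assumed tie-break: first discovered device
    return None  # public place (or nothing discovered): defer to the user
```
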
For example, when the geographical location of the source device is a public place, the source device further determines whether its outer screen is blocked; if the outer screen is not blocked, the identifiers of the surrounding electronic devices supporting the screen projection function are displayed on the outer screen. The user can select one of the displayed identifiers as the target device identifier according to the user's own needs. Taking the source device being the electronic device 10 as an example, as shown in fig. 9, the identifiers of the surrounding electronic devices supporting the screen projection function acquired by the electronic device 10 include the identifier of the electronic device 20, the identifier of the electronic device 30, and the identifier of the electronic device 40. The electronic device 10 then displays these three identifiers on the outer screen, for example, as shown in fig. 10. When the user selects the identifier of the electronic device 30, the electronic device 10 determines that the target device identifier is the identifier of the electronic device 30.
For example, in response to an operation of folding the inner screen from the unfolded state into the closed state, the electronic device 10 turns off the inner screen and the outer screen remains locked. In this scenario, in order to operate the identifiers of the electronic devices 20, 30, and 40 displayed on the outer screen, the user first needs to unlock the outer screen of the electronic device 10. For example, the electronic device 10 may unlock the outer screen by recognizing the user's face or fingerprint. Alternatively, operating the identifiers of the electronic devices 20, 30, and 40 displayed on the outer screen is not restricted by the outer screen being locked; that is, when the electronic device 10 displays the identifiers of the electronic devices 20, 30, and 40 while the outer screen is locked, the user can operate these identifiers without unlocking the outer screen.
In other embodiments of the present application, the electronic device 10 further displays, on the outer screen, a virtual button for canceling the screen projection, and the user may cause the electronic device 10 to cancel the screen projection by clicking or touching this virtual button. In this way, the user can cancel the screen projection according to the user's own needs. Alternatively, the electronic device 10 further displays, on the outer screen, prompt information indicating that the screen projection will be automatically canceled if the user does not select a screen projection device within a preset duration. For example, the preset duration may be 10 s, 15 s, or the like, and may be set according to user needs. For example, when the screen projection is canceled, the electronic device 10 may automatically go black or display a default user interface.
Note that, in the embodiment of the present application, the identifier of the electronic device may include an icon of the electronic device, a name of the electronic device, and the like, which is not limited.
In other embodiments, when the outer screen is blocked, for example, when the source device is placed in a bag or laid with the outer screen against a desktop, the source device does not project the screen.
As another example, the source device may further determine the target device identifier from the identifiers of the surrounding electronic devices supporting the screen projection function with reference to at least one of: device capability information (such as whether the device includes a display screen, includes a speaker, or supports touch), device attribute information (such as the resolution and sound effects of the device's display screen), and the current running state of the device (such as whether the device is playing video or audio, or is communicating with other devices). The device capability information, device attribute information, and current running state of a device may be acquired by the source device during the process of acquiring the identifiers of the electronic devices supporting the screen projection function. For example, the electronic device identified by the target device identifier may be an electronic device that is not currently playing video, that includes a display screen, and whose display screen resolution is greater than a first threshold. The first threshold may be set according to actual needs.
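The capability/attribute/state filter just described could be sketched as below. This is a hedged illustration: the field names, the resolution unit, and the concrete threshold value are assumptions; the patent only states that the threshold is configurable.

```python
# Sketch of a second preset condition built from capability, attribute, and
# running-state information: keep devices that include a display screen, are
# not currently playing video, and whose display resolution exceeds the
# first threshold. Field names and units are illustrative assumptions.

FIRST_THRESHOLD = 1080  # assumed value (pixels along one edge)

def filter_candidates(devices: list) -> list:
    """devices: dicts with keys 'id', 'has_display', 'playing_video',
    'resolution'. Returns the identifiers of qualifying devices."""
    return [d["id"] for d in devices
            if d["has_display"]
            and not d["playing_video"]
            and d["resolution"] > FIRST_THRESHOLD]
```
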
In the embodiment of the present application, the source device may discover nearby connectable electronic devices based on Bluetooth and/or Wi-Fi scanning. Taking Bluetooth as an example, when the source device acquires the identifiers of a plurality of electronic devices by Bluetooth scanning, it may determine one or more device identifiers from them, establish connections with the target devices identified by the determined identifiers, and send the screen projection content to those target devices. In particular, the source device may determine one device identifier from a plurality of target device identifiers and send the screen projection content to the target device identified by the determined identifier.
In order to improve the interaction between the user and the electronic device, for example, after executing step 802, the source device determines whether the intelligent screen projection function is enabled; when it is enabled, steps 803 and 804 are executed, and when it is not enabled, steps 803 and 804 are not executed. The intelligent screen projection function is enabled or disabled by the source device in response to a user operation. For example, a virtual button for turning the intelligent screen projection function ON or OFF is provided on a system settings interface: when the user switches the virtual button from OFF to ON, the source device enables the intelligent screen projection function, and when the user switches it from ON to OFF, the source device disables the function. The system settings interface may be the user interface 1100 shown in fig. 11A. The user interface 1100 includes a virtual button 1101, which is used to control turning the intelligent screen projection function on or off. For another example, the virtual button for turning the intelligent screen projection function on or off may also be provided on other user interfaces, such as in a notification bar, a system toolbar, or a control bar of a pull-up or pull-down interface. For example, the pull-up interface may be displayed by the source device in response to the user sliding up on the outer screen or the inner screen. For another example, the pull-down interface may be displayed by the source device in response to the user sliding down on the outer screen or the inner screen.
In other embodiments, the source device prompts the user whether to project the screen after executing step 802, or after determining that the intelligent screen projection function is enabled. For example, as shown in fig. 11B, the source device may display a prompt box 1110 on the outer screen. The prompt box 1110 includes prompt information, a confirm option "yes", and a deny option "no", where the prompt information is used to ask the user whether to project the screen, for example, "please confirm whether to project the screen?". The source device may continue to execute steps 803 and 804 in response to the user selecting the confirm option "yes", and may not execute steps 803 and 804 in response to the user selecting the deny option "no". In addition, in some examples, after the prompt box 1110 is displayed on the outer screen, if no user operation is detected within a preset duration, the source device may by default treat the user as agreeing to the screen projection, or by default treat the user as rejecting it. For example, the preset duration may be 10 s, 15 s, or the like, which is not limited. In the case where the user is treated by default as agreeing, the source device may continue to execute steps 803 and 804. It should also be noted that the prompt box 1110 may be displayed while the outer screen of the source device is locked; to improve security, the user needs to unlock the outer screen by fingerprint, password, or facial recognition before operating the prompt box 1110. Further, after the user agrees to the screen projection, the source device may display the user interface shown in fig. 11C or fig. 11D on the outer screen, so that the user can cancel the screen projection at any time.
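The prompt-box decision (explicit choice wins; a timeout falls back to a configurable default) could be sketched as follows. The function name, parameter names, and the 10 s default are assumptions for illustration.

```python
# Sketch of the prompt-box 1110 behavior: an explicit "yes"/"no" from the
# user decides immediately; once the preset duration elapses with no input,
# the source device falls back to a default (agree or reject, as the two
# alternatives in the text describe). Names and defaults are assumptions.

def resolve_prompt(user_choice, elapsed_s: float,
                   preset_duration_s: float = 10.0,
                   default_agree: bool = True):
    """user_choice: True ("yes"), False ("no"), or None (no input yet).
    Returns True/False once decided, or None while still waiting."""
    if user_choice is not None:
        return user_choice
    if elapsed_s >= preset_duration_s:
        return default_agree
    return None
```

A result of `True` corresponds to continuing with steps 803 and 804; `False` corresponds to not executing them.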
It should be further noted that, in the embodiment of the present application, the order of determining whether the intelligent screen projection function is enabled and prompting the user whether to project the screen is not limited.
In some embodiments, when there is no connectable electronic device nearby, that is, when the source device does not acquire the identifier of at least one surrounding electronic device supporting the screen projection function, the screen is not projected.
In another embodiment, as an alternative to steps 803 and 804, the method may further include, after step 802:
in step 805, after the source device turns off the inner screen in response to the operation of folding the inner screen from the unfolded state into the closed state, the source device determines whether the intelligent screen projection function is enabled; if it is enabled, step 806 is executed, and if it is not enabled, the procedure ends.
Step 806, the source device determines whether the outer screen is blocked; if the outer screen is not blocked, step 807 is executed, and if it is blocked, the procedure ends.
By way of example, the source device may determine whether the outer screen is blocked by means of a distance sensor, a camera, or the like. For example, when the source device is placed in a bag or a pocket, or is placed on a table with the outer screen facing downward, the source device may detect through the distance sensor, the camera, or the like that the outer screen is blocked. For another example, when the source device is placed on a table with the outer screen facing upward, it may be determined through the camera on the same side as the outer screen that the outer screen is not blocked. It should be noted that the foregoing merely illustrates detecting whether the outer screen is blocked; the source device may also determine this in other manners, for example, by means of artificial intelligence (AI). Determining whether the outer screen is blocked serves to detect whether the user has an intention to project the screen: if the source device recognizes that it has been put into a pocket or a bag, it considers that the user is no longer using the device and does not trigger the screen projection. In addition, the embodiment of the present application may also learn the user's habits of using the device in combination with other parameters (such as time and place) acquired by other sensors (such as a positioning sensor), so as to more accurately determine whether the user intends to project the screen and improve the reliability of triggering the screen projection by the source device.
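The occlusion-based intent check of steps 805 and 806 could be sketched as follows. This is an illustrative sketch; the sensor signal names are assumptions, standing in for whatever the distance sensor, camera, or AI model actually reports.

```python
# Sketch of steps 805-806: the outer screen is treated as blocked when the
# distance (proximity) sensor reports an object close by or the outer-screen
# camera sees no light (device in a bag/pocket or lying face-down); a blocked
# outer screen is taken to mean the user has no projection intent.
# Sensor field names are illustrative assumptions.

def outer_screen_blocked(proximity_near: bool, camera_dark: bool) -> bool:
    return proximity_near or camera_dark

def should_continue_projection(smart_projection_on: bool,
                               proximity_near: bool,
                               camera_dark: bool) -> bool:
    """Proceed to step 807 only when the intelligent screen projection
    function is enabled AND the outer screen is not blocked."""
    return smart_projection_on and not outer_screen_blocked(
        proximity_near, camera_dark)
```
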
In some embodiments, after executing step 806, the source device may prompt the user whether to project the screen, for example, by displaying the prompt box 1110 shown in fig. 11B on the outer screen. After the user agrees, the source device acquires the identifier of at least one currently running application program supporting the screen projection function and the identifier of at least one electronic device for receiving the screen projection content, and may show the user on the outer screen the process currently being executed. For example, while the source device is acquiring the identifier of at least one currently running application program supporting the screen projection function, it may display "acquiring content" on the outer screen, for example, as shown in fig. 11C; further, while the source device is acquiring the identifier of at least one electronic device for receiving the screen projection content, it may display "acquiring devices" on the outer screen, for example, as shown in fig. 11D. After the source device has acquired the identifier of at least one currently running application program supporting the screen projection function and the identifier of at least one electronic device for receiving the screen projection content, step 807 is executed.
Step 807, the source device displays, on the outer screen, the identifier of at least one currently running application program supporting the screen projection function, and displays, on the outer screen, the identifier of at least one electronic device for receiving the screen projection content.
The identifier of a currently running application program supporting the screen projection function may be an identifier, among the currently running application programs, that is in a whitelist, where the whitelist includes the identifier of the at least one application program supporting the screen projection function. The identifier of the at least one electronic device for receiving the screen projection content may be acquired by the source device based on Bluetooth, Wi-Fi, or a combination thereof.
For example, the identifiers of the at least one currently running application program supporting the screen projection function include the identifier of iQIYI, the identifier of Kugou Music, and the identifier of Douyin, and the at least one electronic device for receiving the screen projection content is identified by the identifier of the electronic device 20, the identifier of the electronic device 30, and the identifier of the electronic device 40. The source device then displays the identifiers of iQIYI, Kugou Music, and Douyin on the outer screen, and displays the identifiers of the electronic devices 20, 30, and 40 on the outer screen. The identifier of the at least one currently running application program supporting the screen projection function and the identifier of the at least one electronic device for receiving the screen projection content may be displayed on the same user interface, for example, as shown in fig. 12A, or on different user interfaces, for example, as shown in fig. 12B.
In some embodiments, in the case where the source device is currently running at least two application programs supporting the screen projection function, step 807 displays, on the outer screen, the identifiers of one or more of the at least two application programs. By contrast, in the case where the source device is currently running only one application program supporting the screen projection function, the user does not need to be prompted with the identifier of that application program, and the screen projection content is acquired directly from it. Alternatively, in that case, the user may still be shown the identifier of the application program, but without requiring the user to select it, and the source device may acquire the screen projection content from that application program.
In still other embodiments, in the case where the source device acquires the identifiers of at least two electronic devices for receiving the screen projection content, step 807 displays, on the outer screen, at least one of the identifiers of the at least two electronic devices. For example, if the source device acquires the identifier of only one electronic device for receiving the screen projection content, the identifier of that electronic device need not be displayed on the outer screen, or it may be displayed but the user need not perform an operation to select it.
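The single-candidate shortcuts in the two embodiments above apply the same rule to both the application list and the device list, and could be sketched as one helper. This is an illustrative sketch; the function and return shape are assumptions.

```python
# Sketch of the auto-select-or-prompt rule: with no candidate, the source
# device does not project; with exactly one qualifying application (or one
# discovered device), it may be selected automatically without prompting;
# with several, the list is displayed on the outer screen for the user.

def select_or_prompt(candidates: list):
    """Returns (selected, must_prompt): selected is the auto-chosen item or
    None; must_prompt tells whether step 807 should show the list."""
    if not candidates:
        return None, False   # no projection possible
    if len(candidates) == 1:
        return candidates[0], False  # auto-select, no user operation needed
    return None, True        # user picks from the list on the outer screen
```

Applied twice, once to the whitelisted running applications and once to the discovered devices, this reproduces the behavior of steps 807 and 808.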
It should be noted that, when none of the currently running application programs supports the screen projection function, or no application program is currently running, and/or the identifier of an electronic device for receiving the screen projection content is not acquired, the source device does not project the screen.
Step 808, after receiving the identifier of an application program selected by the user from the identifiers of the at least one currently running application program supporting the screen projection function, the source device acquires the screen projection content from the application program identified by that identifier, and sends the screen projection content to the electronic device identified by the identifier that the user selected from the identifiers of the at least one electronic device for receiving the screen projection content. It should be noted that the application program identified by the identifier selected by the user may be referred to as the target application program.
Further, in step 807, after receiving the identifier of the application selected by the user from the identifiers of the at least one currently running application supporting the screen-casting function and the identifier of the electronic device selected from the identifiers of the at least one electronic device for receiving the screen-cast content, or after determining the target device for receiving the screen-cast content, the source device displays a casting-in-progress user interface on the external screen, for example the user interface shown in fig. 13.
It should be noted that, in the embodiments of the present application, the order in which the source device acquires the identifiers of the applications supporting the screen-casting function and the identifiers of the electronic devices for receiving the screen-cast content is not limited: the source device may acquire both simultaneously, may acquire the application identifiers first and the device identifiers afterwards, or may acquire the device identifiers first and the application identifiers afterwards.
Take as an example acquiring the identifiers of the applications supporting the screen-casting function first and the identifiers of the electronic devices for receiving the screen-cast content afterwards. While acquiring the identifier of the at least one currently running application supporting the screen-casting function, the source device may display an acquiring-content interface on the external screen (e.g., as shown in fig. 11C); after the acquisition completes, it displays the identifier of the at least one application supporting the screen-casting function on the external screen. In response to the user selecting an application's identifier, the source device displays an acquiring-devices interface on the external screen (e.g., as shown in fig. 11D); after acquiring the identifier of the at least one electronic device for receiving the screen-cast content, it displays those device identifiers on the external screen, and in response to the user selecting a device's identifier, displays the casting-in-progress interface on the external screen (e.g., as shown in fig. 13). In this case, the source device may obtain the screen-cast content and the identifier of the at least one electronic device for receiving it at the same time.
For example, when the source device does not acquire the identifier of any application supporting the screen-casting function, the source device may prompt the user that screen casting failed and, further, may prompt the user with the reason for the failure. For another example, when the source device does not acquire the identifier of any electronic device for receiving the screen-cast content, the source device may likewise prompt the user that screen casting failed and, further, the reason for the failure.
It should be noted that, in this embodiment of the present application, step 806 may be located before step 805, after step 807, or after step 808, or step 806 may be performed simultaneously with step 807 or step 808, which is not limited; however, the source device terminates the screen-casting process once it detects that the external screen is blocked.
Still further, based on steps 801 to 803, or based on steps 801 and 802 and steps 805 to 807, the source device displays a control interface on the external screen after successfully sending the screen-cast content to the target device. The control interface includes virtual buttons with touch functions, and these virtual buttons differ for different applications. After receiving a user operation on the control interface, the source device controls the target application in response to that operation, thereby controlling the screen-cast content. The user operation on the control interface may be an operation on a virtual button that controls a certain function, or a shortcut gesture operation on the control interface, which is not limited.
For example, for a video-type application such as iQIYI, when the screen-cast content is content on the iQIYI user interface, the control interface may include a progress bar, a pause button, a fast-forward button, an episode-selection button, a definition button, and the like, as shown in fig. 14A. For example, when the source device receives a user operation on the definition button that switches the definition from standard definition to high definition, the screen-cast content presented on the target device switches from standard definition to high definition.
For another example, for an audio-type application such as Kuwo Music, when the screen-cast content is content on the Kuwo Music user interface, the control interface may include a progress bar, a pause button, a fast-forward button, a menu button, and the like, as shown in fig. 14B. For example, when the source device receives an operation in which the user taps the pause button, the target device pauses playing the audio.
For another example, for a social-video application such as Douyin, when the screen-cast content is content on the Douyin user interface, the control interface may include a touch area and function buttons such as favorite, comment, and share, as shown in fig. 14C. The user may switch videos by sliding up and down in the touch area, or tap in the touch area to pause or resume video playback, and so on.
For another example, for a game-type application such as Snake, when the screen-cast content is content on the Snake user interface, the control interface may include up, down, left, and right function buttons and the like, as shown in fig. 14D. The user can control the snake's direction of movement through these up, down, left, and right function buttons. Alternatively, when the screen-cast content is content on the Snake user interface, the control interface may be as shown in fig. 14E, that is, a joystick-style interface that presents the user with virtual buttons for controlling the game, which is more intuitive and vivid.
For another example, for messaging applications such as WeChat and QQ, when the screen-cast content is content on the WeChat user interface, the control interface may include an input method as shown in fig. 14F or fig. 14G. When text input is used on the WeChat user interface, the control interface may be as shown in fig. 14F; when voice input is used on the WeChat user interface, the control interface may be as shown in fig. 14G.
Specifically, after executing step 803, or after receiving in step 808 the identifier of the application selected by the user from the identifiers of the at least one currently running application supporting the screen-casting function, the source device takes the application identified by that identifier as the target application, and may display a corresponding control interface on the external screen according to the target application.
Example 1: the source device may determine the control interface corresponding to the type of the target application from preset control interfaces corresponding to different application types, and then display that control interface on the external screen. For example, the target application is iQIYI, whose application type is the video type. The source device may determine the control interface corresponding to video-type applications from the preset control interfaces for the different application types, and then display, on the external screen, the control interface corresponding to the type to which iQIYI belongs. This helps simplify the implementation. It should be noted that the preset control interface may differ for different application types; for example, for a video-type application the preset control interface may be as shown in fig. 14A, and for an audio-type application it may be as shown in fig. 14B. Specifically, the control interfaces corresponding to the different application types may be preset in the electronic device before it leaves the factory, or may be obtained in advance by the electronic device from the server according to the applications installed on it, and so on.
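A minimal sketch of example 1's type-based lookup follows; the table contents, application names, and helper function are illustrative assumptions rather than the patent's actual data.

```python
from typing import List, Optional

# Hypothetical preset table: control-interface layouts keyed by app type,
# as might be configured before the device leaves the factory.
PRESET_CONTROLS = {
    "video": ["progress_bar", "pause", "fast_forward", "episodes", "definition"],
    "audio": ["progress_bar", "pause", "fast_forward", "menu"],
    "game":  ["touch_area"],
}

# Hypothetical mapping from installed applications to their types.
APP_TYPES = {"iQIYI": "video", "Kuwo Music": "audio", "Snake": "game"}

def control_interface_for(app_name: str) -> Optional[List[str]]:
    """Return the preset control-interface layout for the target
    application's type, or None when no preset applies."""
    app_type = APP_TYPES.get(app_name)
    return PRESET_CONTROLS.get(app_type)
```

For iQIYI this yields the video-type layout of fig. 14A; an app with no preset falls through to None, motivating example 2 below.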
Since some applications, such as certain game-type applications, have personalized control buttons, a preset control interface may not meet the user's needs. The embodiments of the present application therefore further provide another method for displaying the control interface on the external screen.
Example 2: the source device identifies, from the user interface of the target application, virtual buttons with touch functions (a virtual button may also be referred to as a user interface (UI) element or a control), where a virtual button with a touch function may be clicked, touched, or pressed by the user to perform a corresponding function (e.g., pausing playback, fast-forwarding, etc.). The electronic device then rearranges, crops, scales, and otherwise processes the identified virtual buttons with touch functions, generates the control interface corresponding to the target application, and displays that control interface on the external screen. The control interface includes at least one virtual button, which enables the user to operate the target application quickly.
It should be noted that the at least one virtual button may include buttons having the same functions as all virtual buttons identified by the source device from the user interface of the target application, or may include buttons having the same functions as part of virtual buttons identified by the source device from the user interface of the target application.
The control interface includes virtual buttons having the same functions as the virtual buttons identified on the user interface of the target application; this can be implemented in the following manners:
1. By mapping the virtual buttons on the control interface to the user interface of the target application. For example, the position coordinates of a virtual button on the control interface having the same function as a virtual button identified from the user interface of the target application are mapped to the position coordinates of that identified virtual button.
2. The control interface is generated by directly changing the layout of the virtual buttons having the touch function on the user interface of the target application program, so that the control interface includes the virtual buttons having the same functions as the virtual buttons identified on the user interface of the target application program.
3. By adapting the virtual buttons on the control interface to the generic service interface of the target application, the virtual buttons are made to function the same as the virtual buttons identified on the user interface of the target application.
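Manner 1 above — associating each control-interface button with the coordinates of the same-function button on the application's UI, so a tap on the control interface can be re-injected into the application — might be sketched as follows. The data shapes and function name are assumptions for illustration.

```python
from typing import Dict, Set, Tuple

Rect = Tuple[float, float, float, float]  # x, y, width, height on the app UI

def build_button_mapping(
    ctrl_buttons: Set[str], app_buttons: Dict[str, Rect]
) -> Dict[str, Tuple[float, float]]:
    """For each control-interface button, record the centre coordinate of
    the same-function virtual button identified on the target
    application's user interface, so that operating the control button
    can trigger a tap at that coordinate in the application."""
    mapping = {}
    for name, (x, y, w, h) in app_buttons.items():
        if name in ctrl_buttons:
            # Inject taps at the button's centre point.
            mapping[name] = (x + w / 2, y + h / 2)
    return mapping
```

With this mapping, operating virtual button 1502 of fig. 15 would dispatch a tap at the recorded coordinates of button 1501.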
In other embodiments, the at least one virtual button may also include buttons with functions other than those of the virtual buttons identified on the user interface of the target application, such as a cancel-casting button, a button for switching the target device receiving the screen-cast content, and/or a button for switching the application being cast. Taking the cancel-casting button as an example, the electronic device receives the user's operation of clicking that button and, in response, ends the screen casting.
Taking Honor of Kings as the target application as an example, its user interface may be as shown in A in fig. 15. The source device may identify the virtual button 1501 in A in fig. 15, and then re-crop and re-lay out the icon of the virtual button 1501 to obtain the virtual button 1502 shown in B in fig. 15. The virtual button 1502 is a virtual button included on the control interface corresponding to Honor of Kings and has the same function as the virtual button 1501. To enable the user to achieve, when operating the virtual button 1502, the same function as when operating the virtual button 1501, the position coordinates of the virtual button 1502 are associated with the position coordinates of the virtual button 1501, for example.
In some embodiments, the source device may identify the virtual buttons with touch functions based on a record of historical usage of the target application (e.g., a history of the user's screen-tap operations). Alternatively, the source device may identify them based on a software development kit (SDK) interface provided by the target application, or from a predefined location area, in the user interface of the target application, designated for placing virtual buttons with touch functions. The virtual buttons with touch functions may also be identified in other manners; for example, the source device may perform semantic analysis on the user interface of the target application (e.g., semantic analysis of the registration information of the virtual buttons with touch functions), perform image recognition on the user interface of the target application, and so on.
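Under one simple assumption — that a touch-function button corresponds to a screen region the user has tapped repeatedly — the history-based identification mentioned above might look like the sketch below. The log format and threshold are hypothetical.

```python
from collections import Counter
from typing import Iterable, List, Tuple

def frequent_touch_targets(
    click_log: Iterable[Tuple[str, float]], min_hits: int = 3
) -> List[str]:
    """Infer likely touch-control regions from a history of user taps.

    click_log holds (region_id, timestamp) pairs; regions tapped at
    least `min_hits` times are kept, most-tapped first.
    """
    hits = Counter(region for region, _ts in click_log)
    return [region for region, n in hits.most_common() if n >= min_hits]
```

In practice a device would likely cluster raw tap coordinates into regions first; here the log is pre-labelled for brevity.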
Further, because the capability of the electronic device is limited, for some applications the virtual buttons with touch functions on the user interface may not be identified. Therefore, to simplify the implementation of displaying the control interface on the external screen while still enabling the displayed interface to meet the user's needs, when no virtual button with a touch function is identified on the user interface of the target application, the source device may determine the control interface corresponding to the type of the target application from the preconfigured control interfaces for the different application types, and display it on the external screen. For the specific implementation, refer to the related description in example 1, which is not repeated herein.
Take Honor of Kings as the target application. Its user interface may be as shown in A in fig. 15. When the source device does not identify a virtual button with a touch function on the user interface shown in A in fig. 15, and the preset control interface corresponding to the game application type is as shown in C in fig. 15, the source device displays the control interface shown in C in fig. 15 on the external screen. As shown in C in fig. 15, the control interface includes a touch area. The user can slide up, down, left, or right in the touch area to control the game.
The foregoing merely illustrates methods by which the source device displays the control interface corresponding to the target application on the external screen; the specific implementation of displaying that control interface is not limited.
In some embodiments, after the source device sends the screen-cast content to the target device based on steps 801 to 803, or based on steps 801 and 802 and steps 805 to 807, if an operation by which the user unfolds the internal screen from the closed state to the unfolded state is received, then in response to that operation the source device no longer sends the screen-cast content to the target device and displays, on the internal screen, the user interface of the application providing the screen-cast content. For example, when the internal screen of the electronic device is in the closed state, video on the iQIYI user interface is sent to the smart TV; when the user unfolds the internal screen into the unfolded state, in response to that operation, the screen casting stops and the iQIYI user interface is mapped to the internal screen for display. Further, the external screen no longer displays the control interface corresponding to iQIYI; for example, the external screen may be turned off.
Further, based on steps 801 to 803, or based on steps 801 and 802 and steps 805 to 807, after receiving the screen-cast content sent by the source device, the target device may crop or rearrange the screen-cast content and then present it on the target device. Alternatively, before sending, the source device may crop or rearrange the content acquired from the target application into screen-cast content suitable for the target device to present, according to the target device's attributes (such as resolution and touch capability), and then send the screen-cast content to the target device. This helps the target device present the screen-cast content normally.
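The resolution-based adaptation just described might, in its simplest form, scale the source frame to the target display while preserving aspect ratio; the helper name and letterboxing strategy below are illustrative assumptions, not the patent's specified method.

```python
from typing import Tuple

def fit_to_target(
    src_w: int, src_h: int, dst_w: int, dst_h: int
) -> Tuple[int, int, int, int]:
    """Scale a source frame into the target device's resolution while
    preserving the aspect ratio, centring the result (letterboxing).

    Returns (scaled_w, scaled_h, offset_x, offset_y).
    """
    scale = min(dst_w / src_w, dst_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Centre the scaled frame on the target screen.
    off_x, off_y = (dst_w - out_w) // 2, (dst_h - out_h) // 2
    return out_w, out_h, off_x, off_y
```

For example, casting a tall 1080x2340 phone frame onto a 1920x1080 TV would pillar-box it rather than stretch it.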
It should be noted that the method by which the source device displays, on the external screen, the control interface corresponding to the target application in the embodiments of the present application may also be applied to screen-casting methods other than those of the embodiments of the present application, which is not limited.
Example two:
the source device is an electronic device that includes only a first display screen, such as a tablet computer or a mobile phone. For example, the first display screen of the source device may be the first display screen 141 shown in fig. 4A, which is located on the front side of the source device, and the back side of the source device does not include a display screen.
As an example, fig. 16 is a schematic flowchart of another screen-casting method according to an embodiment of the present application, which specifically includes the following steps.
In step 1601, the first display screen of the source device is in use, and a screen-off operation for the first display screen is received. For a specific description of the first display screen of the source device being in use, refer to the related description, in example one, of the inner screen of the source device being in use, which is not repeated herein.
The screen-off operation of the first display screen may be an operation of pressing a power key by a user, or a voice command by a user, or an operation of clicking a virtual key for controlling screen-off by a user, which is not limited.
In step 1602, the source device turns off the first display screen in response to the off operation on the first display screen.
In step 1603, after the first display screen is turned off, the source device determines whether the intelligent screen-casting function is enabled; if it is enabled, step 1604 is executed, and if it is not enabled, the present flow ends.
In step 1604, the source device determines whether the first display screen is blocked; if the first display screen is blocked, the flow ends. If the first display screen is not blocked, step 1605 is performed.
In step 1605, the source device determines a target application from the currently running application. The specific implementation manner of determining the target application from the currently running application by the source device may be referred to the description related to step 803, which is not described herein.
In step 1606, the source device obtains the screen content from the target application, and sends the screen content to the target device. The related implementation of step 1606 may refer to the related implementation of step 804, which is not described herein.
It should be noted that, after performing step 1602, the source device may skip step 1603 and step 1604, and perform step 1605 and step 1606.
In other embodiments, as an alternative to steps 1605 and 1606, after step 1602 or step 1604, the method may further comprise:
in step 1607, the source device displays, on a partial area of the first display screen, an identification of at least one currently running application supporting a screen-casting function, and an identification of at least one electronic device for receiving screen-casting content.
In this embodiment of the present application, the size and position of the area on the first display screen for displaying the identifier of the at least one currently running application supporting the screen-casting function and the identifier of the at least one electronic device for receiving the screen-cast content may be set by the user as required, or may be set by the source device before it leaves the factory, which is not limited. By way of example, as shown in fig. 17, the source device displays, in an area 1700 of the first display screen 141, the identifier of at least one currently running application supporting the screen-casting function and the identifier of at least one electronic device for receiving the screen-cast content. Specifically, for the manner of displaying these identifiers in a partial area of the first display screen, refer to the related description in example one, which is not repeated herein.
Further, while displaying, in a partial area of the first display screen, the identifier of at least one currently running application supporting the screen-casting function and the identifier of at least one electronic device for receiving the screen-cast content, the source device may also display the time, the date, and the like on the first display screen.
In step 1608, after receiving the identifier of an application selected by the user from the identifiers of the at least one currently running application supporting the screen-casting function, the source device acquires screen-cast content from the application identified by that identifier and sends the screen-cast content to the electronic device identified by the identifier the user selected from the identifiers of the at least one electronic device for receiving the screen-cast content.
In this embodiment of the application, while the first display screen is off, the user can perform corresponding operations on the application identifiers and the electronic device identifiers displayed in the partial area of the first display screen, which helps reduce the user's operation steps.
Further, in step 1608, after receiving the identifier of the application selected by the user from the identifiers of the at least one currently running application supporting the screen-casting function and the identifier of the electronic device selected from the identifiers of the at least one electronic device for receiving the screen-cast content, or after determining the target device for receiving the screen-cast content, the source device displays a casting-in-progress user interface in a partial area of the first display screen, for example the user interface shown in fig. 13.
Still further, based on steps 1601 to 1606, or based on steps 1601, 1602, 1603, 1604, and steps 1607 to 1608, the source device displays a control interface on a partial area of the first display screen after successful transmission of the screen content to the target device. The relevant description of the control interface displayed on the partial area of the first display screen can be referred to as the relevant description of the control interface displayed on the external screen in example one.
The area on the first display screen for displaying the control interface and the area in step 1607 for displaying the application identifier and the electronic device identifier may be the same or different, which is not limited.
In some embodiments, after the source device sends the screen-cast content to the target device based on steps 1601 to 1606, or based on steps 1601, 1602, 1603, and 1604 and steps 1607 to 1608, if an unlock operation by the user on the first display screen is received, then in response to that operation the source device no longer sends the screen-cast content to the target device and displays, on the first display screen, the user interface of the application providing the screen-cast content. The unlock operation by the user on the first display screen may be an operation of inputting a fingerprint, an operation of inputting a password, or the like, which is not limited.
For example, after the first display screen of the electronic device is turned off, video on the iQIYI user interface is sent to the smart TV; when the first display screen of the electronic device is unlocked, the screen casting stops and the iQIYI user interface is mapped to the first display screen for display.
Further, based on steps 1601 to 1606, or based on steps 1601, 1602, 1603, and 1604 and steps 1607 to 1608, after receiving the screen-cast content sent by the source device, the target device may crop or rearrange the screen-cast content and then present it on the target device. Alternatively, before sending, the source device may crop or rearrange the content acquired from the target application into screen-cast content suitable for the target device to present, according to the target device's attributes (such as resolution and touch capability), and then send the screen-cast content to the target device. This helps the target device present the screen-cast content normally.
Example three:
in contrast to the foregoing examples, the source device is an electronic device that includes a first display screen and a second display screen. For example, the first display screen of the source device may be the first display screen 141 shown in fig. 4A, which is located on the front side of the source device, and the second display screen of the source device may be the second display screen 142 shown in fig. 4B, which is located on the back side of the source device.
As shown in fig. 18, a flowchart of another screen-casting method according to an embodiment of the present application includes the foregoing steps 1601 and 1602; after step 1602 is executed, the following steps are further executed:
step 1803, after the source device turns off the first display screen, it determines whether the intelligent screen-casting function is enabled; if it is enabled, step 1804 is executed, and if it is not enabled, the present flow ends.
In step 1804, the source device determines whether the first display screen and the second display screen are blocked; if both are blocked, the flow ends. If either the first display screen or the second display screen is not blocked, step 1805 is performed.
For example, the source device first determines whether the second display screen is blocked; if the second display screen is not blocked, step 1805 is executed. If the second display screen is blocked, the source device determines whether the first display screen is blocked; if the first display screen is not blocked, step 1805 is executed, and if the first display screen is blocked, the present flow ends.
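The two-screen occlusion check of step 1804, preferring the second screen as in the example above, can be sketched as a small helper. The function name and return values are illustrative assumptions.

```python
from typing import Optional

def pick_prompt_screen(
    second_blocked: bool, first_blocked: bool
) -> Optional[str]:
    """Choose which display should show the cast prompts, checking the
    second (rear) screen first, as in the example for step 1804."""
    if not second_blocked:
        return "second"
    if not first_blocked:
        return "first"
    # Both screens blocked: end the flow without casting.
    return None
```

Returning None corresponds to ending the flow; the caller would then proceed to step 1805 only when a screen was chosen.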
In step 1805, the source device determines the target application from the currently running application. The specific implementation manner of determining the target application from the currently running application by the source device may be referred to the description related to step 803, which is not described herein.
In step 1806, the source device acquires the screen-cast content from the target application and sends the screen-cast content to the target device. The related implementation of step 1806 may refer to the related implementation of step 1606, which is not repeated herein.
It should be noted that, after performing step 1602, the source device may skip step 1803 and step 1804, and perform step 1805 and step 1806.
In other embodiments, as an alternative to steps 1805 and 1806, after step 1803 or step 1804, the method may further include:
in step 1807, when the second display screen is not blocked, the source device displays, on the second display screen, an identifier of at least one currently running application program supporting the screen-casting function, and displays an identifier of at least one electronic device for receiving the screen-casting content. When the second display screen is blocked and the first display screen is not blocked, the source device displays the identification of at least one currently running application program supporting the screen throwing function on a partial area of the first display screen, and displays the identification of at least one electronic device for receiving the screen throwing content.
It should be noted that, a specific implementation manner of displaying, on the second display screen, the identifier of the currently running at least one application program supporting the screen-throwing function and displaying the identifier of the at least one electronic device for receiving the screen-throwing content may refer to the related description in the example one, which is not repeated herein. The manner of displaying the identifier of the currently running at least one application supporting the screen-casting function and the identifier of the at least one electronic device for receiving the screen-casting content in the partial area of the first display screen may be referred to in the related description in the second example, and will not be described herein.
In step 1808, after receiving the identifier of an application selected by the user from the identifiers of the at least one currently running application supporting the screen-casting function, the source device acquires screen-cast content from the application identified by that identifier and sends the screen-cast content to the electronic device identified by the identifier the user selected from the identifiers of the at least one electronic device for receiving the screen-cast content.
Further, in step 1808, after receiving the identifier of the application selected by the user and the identifier of the electronic device selected by the user, or after otherwise determining the target device for receiving the screen-casting content, the source device displays a casting-in-progress user interface on a partial area of the first display screen, for example the user interface shown in fig. 13.
Still further, based on steps 1601, 1602 and steps 1803 to 1806, or based on steps 1601, 1602, 1803, 1804 and steps 1807 to 1808, the source device displays a control interface on a partial area of the first display screen or on the second display screen after the screen-casting content is successfully sent to the target device. The control interface includes a virtual button with a touch function. For a specific manner of displaying the control interface on the partial area of the first display screen or on the second display screen, reference may be made to the related description in example one.
The area of the first display screen used to display the control interface and the area used in step 1808 to display the application identifier and the electronic device identifier may be the same or different; this is not limited.
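The control interface described above can be sketched in the same spirit as the later claims: prefer touch-enabled virtual buttons identified inside the target application, and otherwise fall back to a preset interface matching the application's type. The function name, the dictionary shapes, and the preset table below are all illustrative assumptions, not the patent's implementation.

```python
# Sketch of building the control interface: reuse the target app's own
# touch-enabled virtual buttons when available; otherwise fall back to a
# preset control interface for the app's type. Names are assumptions.
PRESET_CONTROLS = {
    "video": ["play/pause", "fast-forward", "rewind"],
    "music": ["play/pause", "next", "previous"],
}

def build_control_interface(app):
    # virtual buttons with a touch function identified inside the app
    touch_buttons = [b["name"] for b in app.get("buttons", [])
                     if b.get("touchable")]
    if touch_buttons:
        return touch_buttons
    # otherwise use the preset control interface matching the app's type
    return PRESET_CONTROLS.get(app.get("type"), [])
```

An app of an unknown type with no touch buttons would simply yield an empty control interface in this sketch.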
Example four:
The source device may be an electronic device with a flip cover, for example a flip phone. As shown in fig. 19, the flip phone includes an inner screen and an outer screen, where the inner screen is located on the inner side of the phone cover (not shown in the figure) and the outer screen is located on the outer side of the phone cover. Taking a flip phone as the source device, this scenario differs from example one (in which the source device is an electronic device with a foldable screen) only in that when the source device is a flip phone, it receives the user's operation of closing the phone cover and extinguishes the inner screen. After the phone cover is opened, the source device may unlock the inner screen in response to the user entering a password or long-pressing a preset key (such as the # key, the * key, or a key combination). That is, in the flip-phone case, when the source device is normally using the inner screen and receives the user's operation of closing the phone cover, it locks the inner screen in response to that operation and then triggers screen casting; specifically, the source device performs the steps after step 802 in example one, which are not repeated here.
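The flip-cover behavior just described (closing the cover extinguishes and locks the inner screen and triggers casting; opening the cover alone does not unlock it until the user authenticates) can be sketched as a small state machine. The state keys and values are illustrative assumptions only.

```python
# Illustrative flip-cover state machine; all state keys are assumptions.
def on_cover_event(state, cover_closed):
    s = dict(state)
    if cover_closed:
        s["inner_screen"] = "locked"   # extinguish and lock the inner screen
        s["cast_triggered"] = True     # closing the cover triggers casting
    else:
        # opening the cover alone does not unlock the inner screen; the user
        # must still enter a password or long-press a preset key
        s["inner_screen"] = "locked"
    return s

def on_unlock(state, credential_ok):
    # credential_ok models a correct password or preset-key unlock
    s = dict(state)
    if credential_ok and s["inner_screen"] == "locked":
        s["inner_screen"] = "on"
    return s
```

The same two-step shape (lock on close, authenticate on open) also fits the smart-protective-case scenario of example five.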
Example five:
The source device may also be an electronic device fitted with a smart protective case. Taking the electronic device 10 as the source device, as shown in fig. 20, the electronic device 10 includes a first display screen 141 and is clipped into the smart protective case 20, where the smart protective case 20 includes a visualization area 18. After the user closes the cover 16 of the smart protective case 20, the device is as shown at C in fig. 20; after the user opens the cover 16 of the smart protective case 20, it may be as shown at B in fig. 20.
This scenario differs from example one (in which the source device is an electronic device with a foldable screen) only in that, with the smart protective case installed, the source device receives the user's operation of closing the cover of the smart protective case and extinguishes the first display screen. After the cover of the smart protective case is opened, the source device may unlock the first display screen in response to the user entering a password, a fingerprint, or the like. That is, when the source device is normally using the first display screen and receives the user's operation of closing the cover of the smart protective case, it locks the first display screen in response to that operation and then triggers screen casting; specifically, the source device performs the steps after step 602 in example one, which are not repeated here.
It should be noted that, in this scenario, the identifier of the application, the identifier of the electronic device, and the control interface may be displayed in the visualization area of the smart protective case 20; for the specific display manner, reference may be made to the related description in example one, which is not repeated here.
Furthermore, in other embodiments, when the source device does not need to prompt the user with an identifier of an application, an identifier of an electronic device, or a control interface, the smart protective case need not include a visualization area.
Example six:
The source device may also be another electronic device with a foldable screen. The source device includes a first display screen, and the first display screen is a foldable screen. For example, the first display screen of the source device may be in an expanded state, such as the first display screen 141 shown in fig. 5A, or in a closed state, as shown in fig. 5B; when the first display screen is in the closed state, the source device may present a corresponding interface to the user through the region 500 of the first display screen 141. In this scenario, the first display screen of the source device is in use in the expanded state, and when the source device receives an operation of folding the first display screen from the expanded state to the closed state, it triggers casting to the target device; for the specific screen-casting method, reference may be made to the description of example one. For example, when the source device receives an operation of folding the first display screen 141 from the expanded state to the closed state, if the casting fails, the content displayed on the first display screen 141 before the operation was received may be mapped into the area 500 of the folded first display screen 141; if the casting succeeds, a control interface may be displayed in the area 1500 of the first display screen 141, and the manner of displaying the control interface in the area 1500 of the first display screen 141 may be referred to in the related description in example one.
Further, in other embodiments, when the first display screen of the source device is in the closed state and the source device receives an operation of unfolding the first display screen from the closed state to the expanded state, it stops screen casting in response to the operation. For example, after the source device stops screen casting, the user interface in which the screen-casting content is located may be displayed automatically on the first display screen.
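The fold-triggered behavior of this example (fold to trigger casting, mirror the previous content into the small region on failure, unfold to stop casting and restore the cast content's interface) can be sketched as one transition function. Every name and value below is an illustrative assumption.

```python
# Illustrative fold-state handler for a foldable-screen source device;
# state keys and values are assumptions, not the patent's implementation.
def on_fold_change(session, new_state):
    s = dict(session)
    if s["display"] == "expanded" and new_state == "closed":
        # folding triggers casting; if casting fails, the previously
        # displayed content is mirrored into the small usable region instead
        s["shows"] = "control_interface" if s["cast_ok"] else "mirrored_content"
        s["casting"] = s["cast_ok"]
    elif s["display"] == "closed" and new_state == "expanded":
        # unfolding stops the cast and restores the cast content's UI
        s["casting"] = False
        s["shows"] = "cast_content_ui"
    s["display"] = new_state
    return s
```

The retractable-screen scenario of example seven follows the same shape, with retract/extend taking the place of fold/unfold.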
Example seven:
The source device may also be an electronic device with a retractable screen; it includes a first display screen, and the first display screen is a retractable display screen. For example, the state of the first display screen of the source device after extension may be as shown in fig. 6A, and its state after retraction may be as shown in fig. 6B. In this scenario, the first display screen of the source device is in use in the extended state, and when the source device receives an operation of retracting the first display screen, it triggers casting to the target device; for the specific screen-casting method, reference may be made to the description of example one. For example, after receiving the operation of retracting the first display screen 141, if the casting fails, the source device may map the content displayed on the first display screen 141 before the operation was received into the area 600 of the retracted first display screen 141; if the casting succeeds, a control interface may be displayed in the area 600 of the first display screen 141, and the manner of displaying the control interface in the area 600 of the first display screen 141 may be referred to in the related description in example one.
Further, in other embodiments, when the first display screen of the source device is in the retracted state and the source device receives an extension operation on the first display screen, it stops screen casting in response to the operation. For example, after the source device stops screen casting, the user interface in which the screen-casting content is located may be displayed automatically on the first display screen.
The above embodiments may be used alone or in combination with each other to achieve different functions.
In the embodiments provided in the present application, the methods are described from the perspective of the electronic device as the execution subject. To implement the functions in the methods provided in the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, implementing the functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a given function is performed by a hardware structure, a software module, or a combination of the two depends on the specific application and the design constraints of the technical solution.
Based on the same conception, fig. 21 shows an apparatus 2100 provided in the present application for performing the screen-casting method shown in fig. 8, 16 or 18. By way of example, the apparatus 2100 includes a processing module 2101 and a transceiver module 2102.
Illustratively, the processing module 2101 is configured to detect a user operation and, in response to the user operation, trigger the transceiver module 2102 to send the screen-casting content to the target device.
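The division just described — a processing module that detects the user operation and a transceiver module that sends the content — can be sketched as two cooperating classes. Class names, method names, and the set of triggering operations are assumptions for illustration only.

```python
# Minimal sketch of the fig. 21 split into a processing module and a
# transceiver module; all names are illustrative assumptions.
class TransceiverModule:
    def __init__(self):
        self.outbox = []                 # (target, content) pairs "sent"

    def send(self, target, content):
        self.outbox.append((target, content))

class ProcessingModule:
    # user operations that trigger casting in the examples above
    TRIGGER_OPS = {"fold_screen", "close_cover", "retract_screen"}

    def __init__(self, transceiver):
        self.transceiver = transceiver

    def on_user_operation(self, operation, content, target):
        # detect a triggering user operation and, in response, have the
        # transceiver module send the screen-casting content to the target
        if operation in self.TRIGGER_OPS:
            self.transceiver.send(target, content)
            return True
        return False
```

Non-triggering operations (for example an ordinary tap) simply pass through without any send, matching the event-driven description above.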
Based on the same conception, fig. 22 shows an apparatus 2200 provided in the present application. The apparatus 2200 includes at least one processor 2210, a memory 2220, and a transceiver 2230. The processor 2210 is coupled to the memory 2220 and the transceiver 2230; the coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be electrical, mechanical, or in other forms, for information exchange between the devices, units, or modules. The connection medium among the transceiver 2230, the processor 2210, and the memory 2220 is not limited in the embodiments of the present application. For example, in fig. 22, the memory 2220, the processor 2210, and the transceiver 2230 are connected by a bus, which may be divided into an address bus, a data bus, a control bus, and so on.
Specifically, memory 2220 is used to store program instructions.
The transceiver 2230 is configured to send the screen-casting content, control instructions, and the like to the target device.
The processor 2210 is configured to invoke the program instructions stored in the memory 2220 to cause the apparatus 2200 to perform the screen-casting method shown in fig. 8, 16 or 18.
In the embodiments of the present application, the processor 2210 may be a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, any conventional processor, or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be performed directly by a hardware processor, or by a combination of hardware and software modules in the processor.
In the embodiments of the present application, the memory 2220 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory, for example a random-access memory (RAM). The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory in the embodiments of the present application may also be a circuit or any other apparatus capable of implementing a storage function, for storing program instructions and/or data.
It should be understood that the apparatus 2100 and the apparatus 2200 may be used to implement the methods shown in fig. 8, 16 or 18 of the embodiments of the present application; for the relevant features, reference may be made to the above, and details are not repeated here.
It will be apparent to those skilled in the art that the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions described above may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation, computer-readable media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. As used in the embodiments of the present application, disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media.
In summary, the foregoing is merely exemplary embodiments of the present application and is not intended to limit the protection scope of the present application. Any modification, equivalent replacement, or improvement made according to the disclosure of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A screen-casting method, wherein the method is applied to a first electronic device, the first electronic device comprises an inner screen, and the inner screen is a foldable screen, the method comprising:
the first electronic device receives an operation of folding the inner screen from an unfolded state to a closed state;
in response to the operation of folding the inner screen from the unfolded state to the closed state, the first electronic device extinguishes the inner screen and acquires screen-casting content from a target application among at least one currently running application;
and the first electronic device sends the screen-casting content to a second electronic device.
2. The method of claim 1, wherein the first electronic device further comprises an external screen.
3. The method of claim 2, wherein after the first electronic device extinguishes the inner screen in response to the operation of folding the inner screen from the unfolded state to the closed state, and before the screen-casting content is acquired from the target application among the at least one currently running application, the method further comprises:
determining that a smart screen-casting function is enabled; and/or,
prompting the user whether to allow screen casting, and receiving an operation of the user allowing screen casting; and/or,
determining that the external screen is not blocked.
4. The method of claim 3, wherein the method further comprises:
if the external screen is blocked, the first electronic device no longer performs screen casting.
5. The method of any of claims 2 to 4, wherein after the first electronic device sends the screen-casting content to the second electronic device, the method further comprises:
the first electronic device displays a control interface on the external screen, wherein the control interface is used to implement shortcut operations on the target application.
6. The method of claim 5, wherein the first electronic device displaying a control interface on the external screen comprises:
the first electronic device identifies a virtual button with a touch function in the target application, and displays the control interface on the external screen according to the virtual button with the touch function; and/or,
the first electronic device determines, from preset control interfaces corresponding to application types, a control interface corresponding to the type of the target application, and displays the control interface corresponding to the type of the target application on the external screen.
7. The method of any of claims 2 to 4, wherein before the first electronic device acquires the screen-casting content from the target application among the at least one currently running application, the method further comprises:
the first electronic device displays, on the external screen, an identifier of at least one application supporting the screen-casting function among the at least one currently running application;
the first electronic device receives an operation of the user selecting an identifier of an application displayed on the external screen;
and in response to the operation of the user selecting the identifier of the application displayed on the external screen, determining that the target application is the application identified by the identifier of the application selected by the user.
8. The method of any of claims 2 to 4, wherein after the first electronic device extinguishes the inner screen of the first electronic device and before the screen-casting content is sent to the second electronic device, the method further comprises:
the first electronic device obtains an identifier of at least one electronic device;
the first electronic device determines an identifier of a target electronic device from the identifiers of the at least one electronic device, wherein the identifier of the target electronic device is used to identify the second electronic device.
9. The method of claim 8, wherein the first electronic device determining an identifier of a target electronic device from the identifiers of the at least one electronic device comprises:
the first electronic device determines, from the identifiers of the at least one electronic device, an identifier identifying a private electronic device as the identifier of the target electronic device.
10. The method of claim 8, wherein the first electronic device determining an identifier of a target electronic device from the identifiers of the at least one electronic device comprises:
the first electronic device displays the identifier of the at least one electronic device on the external screen;
the first electronic device receives an operation of the user selecting an identifier of an electronic device displayed on the external screen;
and the first electronic device takes the identifier of the electronic device selected by the user as the identifier of the target electronic device.
11. The method of claim 10, wherein each of the identifiers of the at least one electronic device is used to identify a public electronic device.
12. The method of any of claims 1 to 4, wherein after the first electronic device sends the screen-casting content to the second electronic device, the method further comprises:
the first electronic device receives an operation of unfolding the inner screen from the closed state to the unfolded state;
and in response to the operation of unfolding the inner screen from the closed state to the unfolded state, the first electronic device stops screen casting and displays the screen-casting content on the inner screen.
13. An electronic device comprising a processor and a memory;
program instructions are stored in the memory;
the program instructions, when executed, cause the electronic device to perform the method of any of claims 1 to 12.
14. A chip, characterized in that the chip is coupled to a memory in an electronic device such that the chip, when run, invokes program instructions stored in the memory, implementing the method according to any of claims 1 to 12.
15. A computer readable storage medium comprising program instructions which, when run on a device, cause the device to perform the method of any of claims 1 to 12.
CN202310208263.0A 2019-07-31 2019-07-31 Screen projection method and electronic equipment Pending CN116185324A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310208263.0A CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310208263.0A CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment
CN201910704758.6A CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910704758.6A Division CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116185324A true CN116185324A (en) 2023-05-30

Family

ID=74230363

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310208263.0A Pending CN116185324A (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment
CN201910704758.6A Active CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910704758.6A Active CN112394891B (en) 2019-07-31 2019-07-31 Screen projection method and electronic equipment

Country Status (2)

Country Link
CN (2) CN116185324A (en)
WO (1) WO2021018274A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115131547A (en) * 2021-03-25 2022-09-30 华为技术有限公司 Method, device and system for image interception by VR/AR equipment
CN113259757A (en) * 2021-04-08 2021-08-13 读书郎教育科技有限公司 Method for video screen projection by being convenient and fast to be compatible with multiple applications
CN113138737B (en) * 2021-04-16 2023-11-03 阿波罗智联(北京)科技有限公司 Display control method, device, equipment, medium and program product for screen-throwing scene
CN113268211B (en) * 2021-05-13 2023-05-12 维沃移动通信(杭州)有限公司 Image acquisition method, device, electronic equipment and storage medium
CN115373558A (en) * 2021-05-18 2022-11-22 广州视源电子科技股份有限公司 Screen projection method, device, equipment and storage medium
WO2023036082A1 (en) * 2021-09-09 2023-03-16 华为技术有限公司 System and method for displaying and controlling remote device task
CN114063951B (en) * 2021-09-26 2022-12-02 荣耀终端有限公司 Screen projection abnormity processing method and electronic equipment
CN114089940B (en) * 2021-11-18 2023-11-17 佛吉亚歌乐电子(丰城)有限公司 Screen projection method, device, equipment and storage medium
CN114428599A (en) * 2022-01-30 2022-05-03 深圳创维-Rgb电子有限公司 Screen projection brightness control method and device, storage medium and screen projector
CN114786058B (en) * 2022-04-27 2024-02-06 南京欧珀软件科技有限公司 Multimedia data display method, device, terminal and storage medium
CN116048350B (en) * 2022-07-08 2023-09-08 荣耀终端有限公司 Screen capturing method and electronic equipment
CN117850644A (en) * 2022-09-30 2024-04-09 华为技术有限公司 Window switching method and electronic equipment
CN115964011B (en) * 2023-03-16 2023-06-06 深圳市湘凡科技有限公司 Method and related device for displaying application interface based on multi-screen cooperation

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9805198D0 (en) * 1998-03-11 1998-05-06 Maddock Alan Portable visual display device
JP2003101909A (en) * 2001-09-25 2003-04-04 Matsushita Electric Ind Co Ltd Portable electronic equipment and image display device
CN103369070A (en) * 2012-04-04 2013-10-23 朱洪来 Three-screen flip intelligent handset
KR20140140957A (en) * 2013-05-30 2014-12-10 삼성전자주식회사 Method for mirroring screen data, machine-readable storage medium and electronic device
US20140372896A1 (en) * 2013-06-14 2014-12-18 Microsoft Corporation User-defined shortcuts for actions above the lock screen
CN103399643A (en) * 2013-08-23 2013-11-20 深圳市金立通信设备有限公司 Application program starting method of flexible terminal and flexible terminal
KR102538955B1 (en) * 2016-03-02 2023-06-01 삼성전자 주식회사 Electronic apparatus and method for displaying and transmitting image thereof
CN107589973A (en) * 2017-08-29 2018-01-16 珠海格力电器股份有限公司 A kind of method, apparatus and electronic equipment for starting application
CN107659712A (en) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 A kind of method, apparatus and storage medium for throwing screen
CN109871147B (en) * 2019-02-22 2020-12-01 华为技术有限公司 Touch screen response method and electronic equipment
CN109992231B (en) * 2019-03-28 2021-07-23 维沃移动通信有限公司 Screen projection method and terminal
CN110058828B (en) * 2019-04-01 2022-06-21 Oppo广东移动通信有限公司 Application program display method and device, electronic equipment and storage medium
CN110308885B (en) * 2019-06-25 2022-04-01 维沃移动通信有限公司 Screen projection method and mobile terminal

Also Published As

Publication number Publication date
WO2021018274A1 (en) 2021-02-04
CN112394891B (en) 2023-02-03
CN112394891A (en) 2021-02-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination